Human Generated Data

Title

[Ship model]

Date

1950s?

People

Artist: Lyonel Feininger, American, 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.336.11

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Each tag below is paired with the service's confidence score on a 0-100 scale.

Amazon
created on 2019-05-29

Human 82.4
Apparel 64.4
Clothing 64.4
Furniture 63.7
Person 63.7
People 61.8
Silhouette 60.4
Vehicle 56.7
Transportation 56.7
Train 56.7
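
These Amazon labels have the shape of output from Amazon Rekognition's DetectLabels operation: a label name plus a 0-100 confidence score. A minimal sketch of how such tags can be produced with boto3, assuming configured AWS credentials; the image file name is hypothetical:

```python
import boto3

# Assumes standard AWS credentials; the file name is hypothetical.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("BRLF.336.11.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,       # the record lists ten Amazon tags
    MinConfidence=55,   # the lowest score listed above is 56.7
)

# Each label carries a name and a 0-100 confidence,
# matching rows such as "Human 82.4" and "Apparel 64.4".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```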

Clarifai
created on 2019-05-29

people 100
group 99
vehicle 98.8
adult 98.8
group together 98.6
man 97.1
transportation system 96.5
one 94.9
watercraft 93.7
many 93.3
two 92.4
woman 91.3
train 91.1
child 91
furniture 90.8
three 89.7
war 89.7
administration 87.8
music 87.8
military 87.7
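
Clarifai's public general model returns concepts with a 0-1 "value"; scaled by 100, those values match the scores above. A sketch against the Clarifai v2 REST endpoint, assuming an API key; the image URL is a placeholder:

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder
GENERAL_MODEL = "aaa03c23b3724a16a56b629203edc62c"  # Clarifai's public general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
resp.raise_for_status()

# Concepts arrive with a 0-1 value; scaling by 100 gives
# scores like "people 100" and "group 99" in the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```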

Imagga
created on 2019-05-29

man 25
business 18.2
work 18
architecture 17.9
people 17.8
building 17.7
male 17.7
city 16.6
case 16.2
window 15.9
men 14.6
industry 14.5
job 14.1
steel 14.1
shop 14.1
passenger 14.1
modern 14
urban 14
black 13.8
light 13.4
office 13.3
interior 13.3
counter 12.9
sitting 12.9
industrial 12.7
worker 12.7
indoors 12.3
person 12.3
old 11.8
house 11.7
windowsill 11.3
metal 11.3
home 11.2
construction 11.1
chair 10.9
laptop 10.6
corporate 10.3
glass 10.1
indoor 10
alone 10
room 9.9
factory 9.8
machine 9.7
businessman 9.7
inside 9.2
occupation 9.2
mercantile establishment 9
sill 9
adult 8.4
fire 8.4
safety 8.3
silhouette 8.3
computer 8.1
device 8.1
working 7.9
lifestyle 7.9
airport 7.8
barbershop 7.8
meeting 7.5
relaxation 7.5
flame 7.5
equipment 7.4
furniture 7.1
television 7.1
travel 7
life 7
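
Imagga's tagging endpoint reports a 0-100 confidence per tag, the same scale as the list above. A sketch using the v2 /tags endpoint, assuming an Imagga API key and secret; the image URL is a placeholder:

```python
import requests

# Placeholders: your Imagga credentials and a publicly reachable image URL.
auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=auth,  # HTTP basic auth, as Imagga's v2 API expects
)
resp.raise_for_status()

# Each tag carries a 0-100 confidence, matching rows
# such as "man 25" and "business 18.2" above.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```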

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

old 93.9
window 85.5
black and white 81.4
black 79.8
white 76.7
person 71.7
clothing 61.5
vintage 26.8
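
Microsoft's tags are consistent with the Azure Computer Vision "analyze" operation, which returns 0-1 confidences (shown here scaled by 100). A sketch against the v2.0 REST API current at the record's creation date, assuming an Azure endpoint and subscription key; the image URL is a placeholder:

```python
import requests

# Placeholders: your Azure region endpoint and subscription key.
ENDPOINT = "https://westus.api.cognitive.microsoft.com"
KEY = "YOUR_AZURE_SUBSCRIPTION_KEY"

resp = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},
)
resp.raise_for_status()

# Confidences are 0-1; scaling by 100 matches rows
# such as "old 93.9" and "window 85.5" above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```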

Feature analysis

Amazon

Person 63.7%
Train 56.7%

Categories

Imagga

interior objects 100%

Text analysis

Amazon

MAD
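
A single detected string such as "MAD" is the kind of result Amazon Rekognition's DetectText (OCR) operation returns. A sketch with boto3, under the same assumptions as the labeling example above (hypothetical file name, configured AWS credentials):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
with open("BRLF.336.11.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

# DetectText performs OCR; each detection is a WORD or a LINE
# with its own confidence, so a string like "MAD" would surface here.
response = rekognition.detect_text(Image={"Bytes": image_bytes})
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))
```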