Human Generated Data

Title

[Train engine]

Date

1929-1931

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.226.22

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 99.7
Person 99.7
Flooring 82
Transportation 78.3
Vehicle 78.3
Train 75.8
Astronaut 57.3

Clarifai
created on 2019-11-19

people 98.6
monochrome 98.6
transportation system 98.1
vehicle 98
street 95.7
military 95.5
war 94.1
group together 93.1
watercraft 90.4
aircraft 90.3
light 89.2
dark 87.6
ship 87.1
black and white 87.1
skirmish 86.8
city 86.3
group 85.9
boat 85.7
adult 84.2
no person 82.9

Imagga
created on 2019-11-19

ship 49.7
vehicle 43.8
vessel 35.5
wheeled vehicle 26.6
boat 25.1
military vehicle 24.3
sea 21.1
wreckage 21
sky 20.5
wreck 20
shipwreck 18.8
craft 18.2
transportation 17.9
car 17.9
part 17.5
old 17.4
water 17.4
port 17.3
tank 17
steel 16.8
warship 15.5
industry 14.5
harbor 14.4
city 14.1
aircraft carrier 14.1
truck 14.1
architecture 14.1
freight car 14
travel 13.4
ocean 13.3
structure 12.8
industrial 12.7
power 12.6
motor vehicle 12.4
building 12.2
landscape 11.9
transport 11.9
conveyance 11.8
pier 11.7
trailer 11.3
cloud 11.2
abandoned 10.7
tourism 10.7
rust 10.6
urban 10.5
dock 9.7
metal 9.7
factory 9.7
rusty 9.5
construction 9.4
mobile home 9.4
road 9
coast 9
tower 9
river 8.9
cargo 8.7
station 8.7
container 8.7
broken 8.7
tracked vehicle 8.7
skyline 8.5
summer 8.4
house 8.4
housing 8.4
outdoors 8.2
history 8.1
nautical 7.8
boats 7.8
fishing 7.7
clouds 7.6
cityscape 7.6
horizontal 7.5
wood 7.5
armored vehicle 7.5
sunset 7.2
garbage truck 7.1
wooden 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

text 95.8
ship 95.4
black and white 90
white 78.1
black 77.5
monochrome 54

Face analysis

Amazon

AWS Rekognition

Age 39-57
Gender Male, 50.2%
Sad 50.4%
Disgusted 49.5%
Confused 49.5%
Angry 49.5%
Fear 49.5%
Happy 49.5%
Calm 49.5%
Surprised 49.5%

Feature analysis

Amazon

Person 99.7%
Train 75.8%

Captions

Microsoft

a black and white photo of a person 45.4%
a black and white photo 45.3%
a black and white photo of a boat 25.4%