Human Generated Data

Title

[New York World's Fair exhibit of trains]

Date

1940

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.527.17

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Person 92.5
Human 92.5
Boat 89.3
Vehicle 89.3
Transportation 89.3
Person 87
Person 62.3
Train 57.4
People 55.4
Architecture 55.1
Building 55.1

Clarifai
created on 2019-11-19

transportation system 98
vehicle 98
no person 94.7
aircraft 92.6
travel 89.4
watercraft 88.3
military 86.8
airplane 85.6
ship 84.6
outdoors 83.6
people 83
monochrome 75.9
retro 74.6
one 74.1
engine 74.1
airport 73.7
war 72.4
business 72.1
modern 71.8
old 71.4

Imagga
created on 2019-11-19

architecture 35.5
building 34.9
sky 34.9
structure 29.7
old 23.7
travel 22.5
mobile home 19.8
wall 19.4
city 18.3
trailer 17.4
tourism 17.3
house 16.8
roof 16.8
housing 16.6
historic 16.5
wheeled vehicle 16.4
construction 16.3
landscape 15.6
vehicle 15.1
clouds 14.4
ancient 13.8
ship 13.6
window 13.6
tower 13.5
historical 13.2
church 12.9
stone 12.7
history 12.5
urban 12.2
water 12
exterior 12
sea 11.7
religion 11.6
snow 11.2
device 11
landmark 10.8
beam 10.4
town 10.2
street 10.1
ocean 10.1
transportation 9.9
solar dish 9.7
liner 9.7
vessel 9.4
winter 9.4
boat 9.3
home 8.8
craft 8.4
outdoor 8.4
coast 8.1
temple 8.1
rural 7.9
wooden 7.9
summer 7.7
reflector 7.7
port 7.7
harbor 7.7
brick 7.7
industry 7.7
outdoors 7.5
famous 7.4
vacation 7.4
transport 7.3
column 7.3
metal 7.2
detail 7.2
mountain 7.2
door 7.1
scenic 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

outdoor 93.9
black and white 91.3
text 90.5
ship 89
vehicle 72.5
monochrome 66.5
old 41.8

Face analysis

Amazon

AWS Rekognition

Age 25-39
Gender Male, 50.2%
Confused 49.5%
Disgusted 49.5%
Fear 49.6%
Happy 49.5%
Angry 49.6%
Surprised 49.5%
Calm 50%
Sad 49.7%

Feature analysis

Amazon

Person 92.5%
Boat 89.3%

Captions

Microsoft

a large white building 61.3%

Text analysis

Amazon

le pead

Google

be
Y
pe
be pe Y