Human Generated Data

Title

[New York World's Fair exhibit of trains]

Date

1940

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.527.12

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2019-11-19

Human 99.1
Person 99.1
Person 98.9
Person 98.6
Person 95.4
Person 94.1
Person 91.9
Transportation 87.8
Meal 85.6
Food 85.6
Vehicle 84.7
Rail 80.1
Train Track 80.1
Railway 80.1
Outdoors 70.6
Nature 69.9
Person 68.9
People 67.4
Person 66.8
Train 65.4
Crowd 63.7
Architecture 62.1
Building 62.1
Photo 60.1
Photography 60.1
Pedestrian 59.8
Face 59.3
Water 57.6
Urban 55.1
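
The Amazon tags above are the kind of output returned by the AWS Rekognition DetectLabels API, which pairs each label with a confidence score expressed as a percentage. The sketch below shows how such labels could be retrieved with boto3; the region, bucket, and file name are illustrative placeholders, not the pipeline actually used to generate this record.

# Minimal sketch: image labels via AWS Rekognition (boto3).
# The S3 bucket and object key are placeholders for illustration only.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "brlf-527-12.jpg"}},
    MaxLabels=50,
    MinConfidence=55.0,  # the lowest confidence listed above is roughly 55%
)

# Each entry carries a label name and a confidence percentage,
# matching the "Urban 55.1" style lines in the tag list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')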

Clarifai
created on 2019-11-19

people 99.9
group together 99.7
vehicle 98.7
many 98.6
group 98.2
adult 97.1
transportation system 96
aircraft 95.4
man 94
child 93.1
outfit 93
military 92.5
uniform 91.6
baseball 90.7
athlete 89.6
woman 89.5
spectator 89.1
one 88.6
several 88.4
airplane 87.5

Imagga
created on 2019-11-19

structure 30.8
hut 24.5
sky 19.8
stage 19.4
building 18.8
shelter 18.8
old 18.8
platform 16.7
vehicle 15.7
travel 15.5
architecture 14.9
house 14.5
street 12.9
tourist 12.9
power 12.6
transportation 11.6
summer 10.3
city 10
tourism 9.9
person 9.9
urban 9.6
car 9.3
museum 9.3
traditional 9.1
danger 9.1
industrial 9.1
destruction 8.8
wheeled vehicle 8.7
housing 8.5
people 8.4
dark 8.3
transport 8.2
road 8.1
roof 8
history 8
home 8
art 8
business 7.9
world 7.8
black 7.8
construction 7.7
tree 7.7
industry 7.7
outdoor 7.6
landscape 7.4
vacation 7.4
water 7.3
truck 7.3
ship 7.1
trailer 7
modern 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 25-39
Gender Male, 50.2%
Happy 49.2%
Angry 45.8%
Surprised 45.8%
Disgusted 45.1%
Sad 45.7%
Confused 45.1%
Calm 48%
Fear 45.3%

AWS Rekognition

Age 66-80
Gender Male, 50.3%
Fear 49.5%
Calm 49.6%
Disgusted 49.5%
Angry 49.6%
Surprised 50.2%
Confused 49.5%
Sad 49.5%
Happy 49.5%
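
The two face records above follow the structure returned by the AWS Rekognition DetectFaces API: an estimated age range, a gender guess with its confidence, and a per-emotion confidence score. A minimal sketch of reading those fields with boto3 follows; the local file name is a placeholder, and this is an assumed reconstruction rather than the actual analysis code.

# Minimal sketch: face attributes via AWS Rekognition (boto3).
# The image file name is a placeholder for illustration only.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("brlf-527-12.jpg", "rb") as f:
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive as a list of {Type, Confidence} pairs, mirroring
    # the per-emotion percentages listed above.
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')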

Feature analysis

Amazon

Person 99.1%

Categories

Imagga

cars vehicles 80.1%
paintings art 17.2%