Human Generated Data

Title

[Train passing under bridge]

Date

1929–1931

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.229.19

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Machine 97.5
Wheel 97.5
Person 96
Human 96
Wheel 95.7
Wheel 95.1
Person 92.8
Person 92.7
Building 84.6
Person 83.6
Outdoors 75.8
Wheel 74.1
Person 73.2
Nature 73.1
Vehicle 71.7
Transportation 71.7
Bicycle 71.7
Bike 71.7
Wheel 68.7
Person 63.9
Urban 63.3
People 62
Person 61.3

Clarifai
created on 2019-11-19

people 99.7
vehicle 98.9
group together 98.8
many 98.7
group 98.2
transportation system 97.5
war 94.2
military 93.3
railway 92.1
adult 90.4
man 89.9
monochrome 89.8
no person 88.9
street 87.4
crowd 85.9
watercraft 85.9
train 81.4
administration 80.1
skirmish 79.4
aircraft 79.3

Imagga
created on 2019-11-19

cannon 64.2
gun 46.6
weapon 36.6
architecture 36.1
city 34.9
building 30
cityscape 23.7
urban 23.6
structure 22.2
old 20.9
travel 19.7
landscape 17.8
sky 17.2
tower 17
industrial 16.3
construction 16.3
landmark 16.2
buildings 15.1
town 14.8
aerial 14.6
river 14.2
tourism 14
industry 12.8
steel 12.5
history 12.5
vintage 12.4
factory 12
ship 10.5
culture 10.3
famous 10.2
historic 10.1
roof 9.9
pollution 9.6
church 9.2
transportation 9
vessel 8.9
device 8.8
house 8.7
scene 8.7
ancient 8.6
panorama 8.6
grunge 8.5
modern 8.4
monument 8.4
iron 8.4
palace 8.3
exterior 8.3
street 8.3
environment 8.2
tourist 8.2
night 8
bridge 7.9
black 7.8
antique 7.8
skyline 7.6
frame 7.5
park 7.4
light 7.4
transport 7.3
equipment 7.3
metal 7.2
column 7.1
work 7.1
artillery 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

text 96.6
outdoor 90.6
black and white 90
factory 71.1
black 66.1
monochrome 61.5
building 54.8

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Female, 50.3%
Angry 49.5%
Sad 50.5%
Disgusted 49.5%
Happy 49.5%
Fear 49.5%
Surprised 49.5%
Confused 49.5%
Calm 49.5%

Feature analysis

Amazon

Wheel 97.5%
Person 96%
Bicycle 71.7%

Captions

Microsoft

a black and white photo of a building 79.1%
a black and white photo of a person 65%
a black and white photo of a store 64.9%

Text analysis

Google

91 11
91
11