Human Generated Data

Title

[Looking down at people walking on train track]

Date

1930s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.349.18

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Person 99.5
Human 99.5
Person 98.5
Person 97.8
Person 97.5
Person 94.3
Person 94.2
Person 93.2
Building 85.2
Person 82.2
Person 80.2
Person 80.1
Person 75.6
Amusement Park 70.7
Theme Park 70.7
Person 70.5
Person 70.5
Urban 68.4
People 68.3
Person 64.2
Train Track 60.7
Transportation 60.7
Railway 60.7
Rail 60.7
Roller Coaster 58.1
Coaster 58.1
Person 52.7

Clarifai
created on 2019-05-29

people 100
group 99.4
group together 99
adult 98.9
one 97.2
many 95.9
wear 95.8
man 95.6
child 95.1
several 92.9
war 92.7
two 92.7
vehicle 91.9
transportation system 91.5
three 90.3
military 89.8
four 89.7
soldier 88.9
woman 88.2
railway 85.3

Imagga
created on 2019-05-29

stretcher 22.4
brace 21
old 20.2
prison 19.9
litter 17.9
people 16.2
strengthener 15.7
travel 15.5
person 15.4
device 15.4
portrait 14.9
stone 14.3
city 14.1
conveyance 14
crosspiece 13.9
ancient 13.8
wall 13.7
correctional institution 13.6
history 13.4
architecture 13.3
tie 12.8
kin 12.6
man 12.1
adult 11.6
traditional 11.6
black 11.6
sculpture 11.5
child 11.1
building 11.1
male 10.9
penal institution 10.2
tourism 9.9
religion 9.9
support 9.7
urban 9.6
historical 9.4
culture 9.4
dirty 9
step 9
one 9
style 8.9
statue 8.8
monument 8.4
house 8.4
fashion 8.3
street 8.3
historic 8.2
soldier 7.8
military 7.7
death 7.7
war 7.7
brick 7.7
outdoor 7.6
outdoors 7.5
tourist 7.4
vacation 7.4
lady 7.3
landmark 7.2
cell 7.2
world 7.2
art 7.1
face 7.1

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

outdoor 96.9
black and white 96.5
clothing 96.5
person 94.5
toddler 93.4
child 84.5
baby 84.3
footwear 78.5
boy 75.5
monochrome 70.8

Face analysis

Amazon

AWS Rekognition

Age 27-44
Gender Female, 53.4%
Sad 53.9%
Happy 45.1%
Angry 45.3%
Confused 45.1%
Surprised 45.1%
Disgusted 45%
Calm 45.4%

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a group of people sitting on a bench 64.5%
a group of people wearing costumes 64.4%
a group of people posing for a photo 64.3%