Human Generated Data

Title

Racer, Schererville, Indiana

Date

1965, printed 2006

People

Artist: Danny Lyon, American, born 1942

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Doug and Joan Hansen, 2009.220

Copyright

© Danny Lyon/Magnum Photos

Machine Generated Data

Tags

Amazon
created on 2019-04-05

Helmet 100
Apparel 100
Clothing 100
Human 98.9
Person 98.9
Person 96.3
Crash Helmet 83
Face 82.8
Hardhat 77
Photography 71.7
Photo 71.7
Portrait 71.7
Vehicle 68.7
Transportation 68.7
Mud 67.9
Bike 65.8
Bicycle 65.8
Car 64.4
Automobile 64.4
Bicycle 63.2
Plant 60.3
Man 58.7
Soldier 57.9
Military Uniform 57.9
Military 57.9
Skin 56.8
Soil 56.8
Glasses 56.6
Accessory 56.6
Accessories 56.6

Clarifai
created on 2018-03-22

people 99.9
one 99.7
adult 99.3
portrait 95.9
man 95.8
wear 95.5
military 95
two 92.8
vehicle 91.8
soldier 89.7
group 89.4
veil 89.3
group together 88.5
war 88.3
recreation 87.3
outfit 85.4
uniform 84.2
military uniform 82.9
administration 82.7
weapon 81.8

Imagga
created on 2018-03-22

cemetery 30.6
graffito 23.4
fountain 21.8
structure 21.6
decoration 19.6
snow 17.1
art 14.5
architecture 14.2
sketch 13.7
mask 13.3
winter 12.8
drawing 12.6
sky 12.1
park 12.1
black 11.8
religion 11.7
tree 11.5
travel 11.3
cold 11.2
statue 11.2
landscape 11.2
old 11.1
man 10.8
outdoor 10.7
building 10.6
monument 10.3
culture 10.3
light 10
dark 10
water 10
city 10
face 9.9
ancient 9.5
sitting 9.4
disguise 9.4
season 9.4
person 9.1
sculpture 8.9
portrait 8.4
church 8.3
figure 8.2
outdoors 8.2
stone 8.2
history 8
representation 8
night 8
male 7.9
artistic 7.8
mysterious 7.8
museum 7.8
scary 7.7
golden 7.7
god 7.7
house 7.6
covering 7.6
vintage 7.4
historic 7.3
wall 7.2
dirty 7.2
river 7.1
trees 7.1

Google
created on 2018-03-22

Microsoft
created on 2018-03-22

outdoor 99.7
text 95.4
grass 95.1
person 92.6

Color Analysis

Face analysis

AWS Rekognition

Age 35-52
Gender Female, 57.6%
Disgusted 12.5%
Surprised 5%
Angry 9.7%
Confused 20.9%
Calm 34.5%
Sad 15.7%
Happy 1.7%

Microsoft Cognitive Services

Age 80
Gender Male

Feature analysis

Amazon

Helmet 100%
Person 98.9%
Bicycle 65.8%
Car 64.4%

Captions