Human Generated Data

Title

Children at an Apartment Entrance

Date

1965

People

Artist: Danny Lyon, American (born 1942)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Susan and Neal Yanofsky, 2014.496

Copyright

© Danny Lyon/Magnum Photos

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Human 99.9
Person 99.9
Person 99.9
Person 99.8
Person 99.4
Person 99.3
Person 99.2
Apparel 99.2
Clothing 99.2
Footwear 99
Shoe 99
Person 98.7
Shorts 98
Shoe 96.8
Sitting 91.8
People 87.4
Shoe 81.8
Pants 76.6
Shoe 74.4
Sleeve 55.8

Clarifai
created on 2018-03-22

people 99.9
group 99.3
child 99
group together 98.6
adult 94.8
many 94.6
four 94.3
five 93.4
woman 93.3
family 93
several 92.8
man 92.3
sit 91.3
offspring 91.1
portrait 90.1
sibling 89.3
three 88.9
home 87.7
son 86.9
street 86.7

Imagga
created on 2018-03-22

kin 100
man 18.1
world 17.7
child 17.1
old 16.7
people 15.6
statue 15.3
sculpture 15.3
ancient 14.7
male 14.4
architecture 13.3
parent 12.7
mother 12.6
history 12.5
person 12.5
city 12.5
monument 12.1
art 11.7
vintage 11.6
building 11.5
grunge 11.1
portrait 11
religion 10.8
wall 10.3
stone 10.2
aged 10
historical 9.4
culture 9.4
face 9.2
historic 9.2
danger 9.1
adult 9.1
dirty 9
soldier 8.8
military 8.7
men 8.6
travel 8.4
religious 8.4
black 8.4
dad 8.2
family 8
juvenile 7.9
business 7.9
antique 7.8
father 7.8
mask 7.7
human 7.5
street 7.4
decoration 7.2

Google
created on 2018-03-22

Microsoft
created on 2018-03-22

building 100
outdoor 99.9
person 99.9
sitting 97.9
posing 74.4
people 72.8
group 62.4
old 55.1
curb 23

Face analysis

AWS Rekognition

Age 11-18
Gender Male, 95.2%
Disgusted 3.2%
Confused 6.8%
Happy 2.7%
Angry 2.5%
Calm 77.2%
Surprised 3.2%
Sad 4.4%

AWS Rekognition

Age 16-27
Gender Male, 91.7%
Happy 2.2%
Sad 14.7%
Calm 16.5%
Confused 28.6%
Disgusted 10.8%
Surprised 2.3%
Angry 24.8%

AWS Rekognition

Age 4-9
Gender Male, 94.8%
Angry 3.9%
Disgusted 0.9%
Sad 18.1%
Calm 68.1%
Surprised 0.7%
Happy 1.6%
Confused 6.7%

AWS Rekognition

Age 10-15
Gender Male, 97.2%
Surprised 2.8%
Sad 5%
Disgusted 4%
Calm 4.5%
Happy 74.7%
Angry 3%
Confused 6%

AWS Rekognition

Age 10-15
Gender Male, 79.5%
Angry 15.4%
Happy 5%
Calm 40.2%
Sad 31.5%
Confused 4.3%
Surprised 1.4%
Disgusted 2.3%

AWS Rekognition

Age 10-15
Gender Female, 90.8%
Calm 29.1%
Surprised 6.4%
Disgusted 2.6%
Angry 5%
Sad 20.5%
Confused 10.7%
Happy 25.6%

AWS Rekognition

Age 1-5
Gender Female, 87.7%
Surprised 2%
Disgusted 2%
Confused 2.6%
Happy 0.9%
Calm 17.4%
Sad 68.6%
Angry 6.5%

AWS Rekognition

Age 26-43
Gender Female, 68.9%
Calm 56.6%
Angry 4.6%
Confused 5.3%
Disgusted 3%
Surprised 1.9%
Happy 10.5%
Sad 18%

Microsoft Cognitive Services

Age 26
Gender Male

Microsoft Cognitive Services

Age 12
Gender Female

Microsoft Cognitive Services

Age 8
Gender Female

Microsoft Cognitive Services

Age 45
Gender Female

Microsoft Cognitive Services

Age 4
Gender Female

Microsoft Cognitive Services

Age 28
Gender Female

Microsoft Cognitive Services

Age 36
Gender Female

Microsoft Cognitive Services

Age 20
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Possible
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.9%
Shoe 99%

Captions

Microsoft

a group of people sitting on a bench posing for the camera 96.2%
a group of people sitting on a bench posing for a photo 95.4%
a group of people sitting on a bench in front of a building 94.2%

Text analysis

Amazon

URNISHED
ENTS
TON
ENT
TON BLDG.
ENT RO0MS
BLDG.
RO0MS

Google

BLDG
ENT
TON BLDG URNISHED ENTS ENT
TON
URNISHED
ENTS