Human Generated Data

Title

Foyer of an Apartment

Date

1965

People

Artist: Danny Lyon, American, born 1942

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Susan and Neal Yanofsky, 2014.497

Copyright

© Danny Lyon/Magnum Photos

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Human 99.9
Person 99.9
Clothing 99.8
Apparel 99.8
Person 99.7
Person 99.7
Person 98.4
Shorts 97.8
Sleeve 81.4
Person 76.9
Female 73.2
Pants 69.6
Undershirt 63.9
Coat 63.8
Overcoat 63.8
Woman 60.2
Long Sleeve 55.3
Door 52

Clarifai
created on 2018-02-09

people 99.9
woman 98.9
adult 98.6
street 97.8
group 97.5
man 97.2
monochrome 96.7
child 94.7
two 94.6
four 93.7
group together 93.2
three 92.1
administration 89.1
wear 88.4
home 86.1
five 85.5
boy 84.1
family 82.3
room 82.3
offspring 80.7

Imagga
created on 2018-02-09

man 39
male 29.9
people 27.9
business 27.3
businessman 23
adult 20.7
person 19.9
men 19.8
office 19
corporate 18.9
executive 16.4
barbershop 14.6
world 14.4
suit 13.8
businesswoman 13.6
shop 13.5
work 13.4
couple 13.1
sitting 12.9
kin 12.7
professional 12.2
casual 11.9
job 11.5
life 11.4
indoors 11.4
looking 11.2
building 11.2
room 11
two 11
indoor 11
happy 10.7
working 10.6
fashion 10.6
career 10.4
meeting 10.4
communication 10.1
laptop 10
city 10
silhouette 9.9
black 9.7
employee 9.6
women 9.5
businesspeople 9.5
love 9.5
lifestyle 9.4
window 9.3
mature 9.3
portrait 9.1
group 8.9
computer 8.8
together 8.8
standing 8.7
newspaper 8.5
face 8.5
travel 8.5
manager 8.4
old 8.4
occupation 8.2
success 8
handsome 8
family 8
device 7.9
smile 7.8
happiness 7.8
scholar 7.8
spectator 7.8
worker 7.7
pretty 7.7
attractive 7.7
youth 7.7
finance 7.6
human 7.5
teamwork 7.4
alone 7.3
smiling 7.2
chair 7.2
team 7.2

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

person 99.6
black 78.4
people 72.5
white 61.4
subway 10.5
crowd 0.5

Color analysis

Face analysis

AWS Rekognition

Age 17-27
Gender Male, 98.2%
Disgusted 1.2%
Happy 1.2%
Surprised 3.8%
Angry 2.4%
Calm 81.6%
Sad 4.2%
Confused 5.6%

AWS Rekognition

Age 14-25
Gender Female, 97.7%
Happy 3.9%
Disgusted 11.4%
Surprised 9.8%
Confused 7.5%
Calm 12.4%
Sad 26.7%
Angry 28.2%

AWS Rekognition

Age 23-38
Gender Female, 86.9%
Happy 3.3%
Sad 71.5%
Disgusted 7.9%
Surprised 0.8%
Calm 10.9%
Confused 2%
Angry 3.4%

AWS Rekognition

Age 26-43
Gender Female, 99.1%
Disgusted 0.8%
Sad 2.1%
Calm 1.4%
Surprised 3%
Happy 89.8%
Confused 1.7%
Angry 1.2%

Microsoft Cognitive Services

Age 40
Gender Male

Microsoft Cognitive Services

Age 27
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.9%
Door 52%