Human Generated Data

Title

Day Room

Date

1967-1969

People

Artist: Danny Lyon, American (born 1942)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of The Mr. and Mrs. Stanley Marcus Foundation, P1972.30

Copyright

© Danny Lyon/Magnum Photos

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Person 99.6
Human 99.6
Person 99.4
Person 99.2
Person 99.1
Sitting 99
Person 91.6
Person 90.5
Furniture 86.7
Person 86.4
Indoors 83.7
Room 83.7
Flooring 81
Person 80.4
Person 69.9
Person 69.3
Person 69.2
Chair 66.7
Leisure Activities 62.8
Finger 61.9
Floor 61.7
Senior Citizen 61.2
Waiting Room 58.5
Restaurant 58.2
Cafeteria 58.2
Couch 57.9
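
The Amazon tags above are name/confidence pairs of the kind returned by AWS Rekognition label detection. A minimal sketch of how such output might be retrieved with boto3, assuming AWS credentials are configured and a local copy of the photograph exists as day_room.jpg (a hypothetical filename); the MaxLabels and MinConfidence values are illustrative, not the settings used to produce the 2019 tags above:

    import boto3  # AWS SDK for Python

    rekognition = boto3.client("rekognition")

    with open("day_room.jpg", "rb") as f:  # hypothetical local image file
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=25,       # the list above holds roughly 25 labels
            MinConfidence=50,   # lowest confidence shown above is about 58
        )

    # Print each label name with its confidence, e.g. "Person 99.6"
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")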

Clarifai
created on 2018-02-09

people 99.9
group together 98.9
group 98.9
man 98.3
adult 96.3
many 96.1
woman 95.8
recreation 95.1
several 91.1
sitting 90.9
indoors 90.2
crowd 88.9
administration 88.8
child 86.7
sit 85.7
wear 85.3
one 82.9
spectator 82.2
audience 81.2
furniture 81

Imagga
created on 2018-02-09

man 43
people 36.8
male 32.6
senior 30
person 29.2
happy 25.7
adult 24
spectator 21.6
old 21.6
sitting 21.5
portrait 20.7
couple 20
men 19.7
smiling 19.5
room 19.4
business 18.8
office 18
group 17.7
businessman 17.6
retired 17.4
mature 16.7
indoors 16.7
together 16.6
retirement 15.4
elderly 15.3
home 15.1
looking 14.4
table 13.8
casual 13.5
women 13.4
teacher 13.3
smile 12.8
indoor 12.8
team 12.5
education 12.1
handsome 11.6
executive 11.6
classroom 11.5
enjoying 11.4
friends 11.3
corporate 11.2
camera 11.1
love 11
work 11
happiness 11
lifestyle 10.8
black 10.8
hand 10.6
computer 10.4
desk 10.4
two 10.2
grandfather 10.1
holding 9.9
worker 9.8
human 9.7
lady 9.7
class 9.6
chair 9.6
face 9.2
professional 9.1
sixties 8.8
60s 8.8
student 8.5
adults 8.5
grandma 8.4
modern 8.4
laptop 8.3
color 8.3
leisure 8.3
aged 8.1
board 8.1
cheerful 8.1
school 8.1
barbershop 8.1
pensioner 8
working 8
restaurant 7.9
seniors 7.9
jacket 7.8
older 7.8
studying 7.7
married 7.7
age 7.6
businesspeople 7.6
sit 7.6
horizontal 7.5
meeting 7.5
relaxed 7.5
fun 7.5
outdoors 7.5
teamwork 7.4
glasses 7.4
shop 7.2
family 7.1
entrepreneur 7.1

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

person 99.1
crowd 4.5

Color Analysis

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 99.9%
Angry 5.7%
Sad 14.7%
Happy 1.3%
Surprised 2.3%
Confused 10%
Disgusted 1.7%
Calm 64.4%

AWS Rekognition

Age 45-66
Gender Male, 94.8%
Confused 3.8%
Angry 6.2%
Surprised 1.6%
Calm 44.9%
Happy 0.9%
Disgusted 1.2%
Sad 41.4%

AWS Rekognition

Age 45-63
Gender Female, 79.6%
Happy 6.9%
Disgusted 7.7%
Calm 19.9%
Surprised 9.5%
Sad 22.1%
Confused 14.4%
Angry 19.6%

AWS Rekognition

Age 20-38
Gender Male, 91.1%
Happy 0.5%
Disgusted 0.3%
Angry 1.2%
Surprised 0.6%
Sad 2.7%
Calm 93.9%
Confused 0.7%
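
The age range, gender, and emotion percentages in the AWS Rekognition blocks above correspond to the FaceDetails structure returned by Rekognition face detection. A hedged sketch, again assuming AWS credentials and the hypothetical day_room.jpg file:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("day_room.jpg", "rb") as f:  # hypothetical local image file
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    # One FaceDetails entry per detected face, matching the blocks above
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")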

Microsoft Cognitive Services

Age 49
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
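
The Google Vision rows report likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch of face detection with the google-cloud-vision client, assuming application default credentials and the same hypothetical image file:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("day_room.jpg", "rb") as f:  # hypothetical local image file
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Map the Likelihood enum (0-5) to the wording used in the record above
    likelihood_names = (
        "Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"
    )
    for face in response.face_annotations:
        print("Joy", likelihood_names[face.joy_likelihood])
        print("Sorrow", likelihood_names[face.sorrow_likelihood])
        print("Anger", likelihood_names[face.anger_likelihood])
        print("Surprise", likelihood_names[face.surprise_likelihood])
        print("Headwear", likelihood_names[face.headwear_likelihood])
        print("Blurred", likelihood_names[face.blurred_likelihood])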

Feature analysis

Amazon

Person 99.6%