Human Generated Data

Title

Ronnie and Cheri, La Porte, Indiana

Date

1962, printed 2006

People

Artist: Danny Lyon, American, born 1942

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Doug and Joan Hansen, 2010.20

Copyright

© Danny Lyon/Magnum Photos

Machine Generated Data

Tags

Amazon
created on 2019-04-05

Person 99.7
Human 99.7
Person 99.7
Restaurant 98.9
Sitting 89.7
Cafe 88.3
Cafeteria 82.6
Coffee Cup 80.6
Cup 80.6
Food 73.6
Meal 70.7
Person 70.1
Pottery 64.9
Food Court 59.7
Dish 58.4
Finger 57.7

Clarifai
created on 2018-03-22

people 99.6
adult 99.2
man 98.1
sit 95.6
facial expression 94.5
woman 94.4
two 93.3
one 92.7
group 90.2
recreation 89
wear 88.6
furniture 87.9
group together 86.7
portrait 84
three 83
monochrome 81.9
sitting 80.8
four 80.5
administration 79.2
enjoyment 76

Imagga
created on 2018-03-22

senior 51.5
man 42.3
person 39.6
male 36.2
mature 35.3
elderly 34.4
people 32.3
couple 29.6
home 28.7
adult 28.2
old 27.2
retired 27.1
indoors 25.5
happy 25.1
together 24.5
smiling 23.9
sitting 22.3
retirement 22.1
lifestyle 20.9
grandma 20.8
glasses 19.4
older 19.4
washboard 18.2
husband 17.2
family 16.9
grandfather 16.4
wife 16.1
happiness 15.7
table 15.6
device 15.5
aged 15.4
smile 15
grandmother 14.7
speaker 14.7
casual 14.4
active 13.9
men 13.7
office 13.6
portrait 13.6
leisure 13.3
cheerful 13
laptop 12.9
business 12.7
love 12.6
articulator 12
women 11.9
day 11.8
mother 11.4
enjoying 11.4
computer 11.3
looking 11.2
pensioner 11.2
60s 10.7
holding 10.7
face 10.6
working 10.6
married 10.5
talking 10.5
work 10.3
gray 9.9
handsome 9.8
middle aged 9.7
businessman 9.7
drinking 9.6
age 9.5
hair 9.5
relaxed 9.4
inside 9.2
communicator 9.2
indoor 9.1
relaxing 9.1
outdoors 9
70s 8.8
look 8.8
concentration 8.7
one person 8.5
horizontal 8.4
hand 8.4
drink 8.3
one 8.2
playing 8.2
lady 8.1
activity 8.1
citizen 7.9
gray hair 7.9
sixties 7.8
seated 7.8
two people 7.8
attractive 7.7
professional 7.7
reading 7.6
laughing 7.6
learning 7.5
park 7.4
room 7.3
glass 7.2
kitchen 7.2

Google
created on 2018-03-22

Microsoft
created on 2018-03-22

person 99.9
black 75.1
dish 43.2

Face analysis

AWS Rekognition

Age 35-52
Gender Female, 67.8%
Calm 49%
Disgusted 4.8%
Angry 9.7%
Confused 7.6%
Sad 16.9%
Happy 3.4%
Surprised 8.7%

AWS Rekognition

Age 23-38
Gender Male, 98%
Calm 64.7%
Disgusted 1.3%
Sad 13%
Happy 0.8%
Surprised 1.8%
Angry 12.3%
Confused 6%

Microsoft Cognitive Services

Age 24
Gender Male

Microsoft Cognitive Services

Age 40
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a man and a woman sitting at a table 95.9%
a man and woman sitting at a table 94.8%
a man and a woman sitting on a table 90.5%

Text analysis

Amazon

Tll
LON Tll
LON

Google

LON
I
LON I TW
TW