Human Generated Data

Title

Untitled (Hildy Park reading script with John Garfield on left and author Clifford Odets on right)

Date

1949

People

Artist: W. Eugene Smith, American, 1918–1978

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Neal and Susan Yanofsky, P2002.53

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.4
Human 99.4
Person 97.7
Person 96.1
Tie 91.6
Accessories 91.6
Accessory 91.6
Sitting 90.5
Indoors 83.1
Room 81
Overcoat 80.6
Apparel 80.6
Suit 80.6
Coat 80.6
Clothing 80.6
Furniture 72.7
Suit 71
Female 70.2
Couch 68.4
Text 58.1
Living Room 56.4
Court 55.8
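The label/score pairs above are typically filtered by a confidence threshold before display. A minimal sketch, using a few of the Amazon pairs recorded above; the threshold value is an illustrative choice, not part of the original record:

```python
# A few (label, confidence) pairs copied from the Amazon tag list above.
labels = [
    ("Person", 99.4), ("Tie", 91.6), ("Sitting", 90.5),
    ("Indoors", 83.1), ("Overcoat", 80.6), ("Furniture", 72.7),
    ("Female", 70.2), ("Text", 58.1), ("Court", 55.8),
]

def confident_labels(pairs, threshold=90.0):
    """Keep only labels whose confidence meets the threshold."""
    return [name for name, score in pairs if score >= threshold]

print(confident_labels(labels))  # → ['Person', 'Tie', 'Sitting']
```

Lower-confidence labels such as "Court" (55.8) illustrate why such a cutoff matters: they are often spurious.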

Imagga
created on 2022-01-23

business 39.5
people 38.5
corporate 37.8
office 37.3
man 37
person 35.4
businessman 35.3
professional 31.1
adult 30
executive 29.9
businesspeople 28.5
male 28.5
happy 28.2
work 26.7
sitting 25.8
laptop 25
meeting 24.5
team 23.3
suit 23
businesswoman 22.7
computer 21
smile 20
desk 19.9
success 19.3
worker 18.8
group 18.5
successful 18.3
indoor 18.3
attractive 18.2
workplace 18.1
table 17.8
job 17.7
communication 16.8
teacher 16.3
portrait 16.2
room 16.2
working 15.9
expression 15.4
smiling 15.2
lifestyle 15.2
handsome 15.2
confident 14.6
entrepreneur 14.5
home 14.4
happiness 14.1
manager 14
men 13.7
dancer 13.7
black 13.6
looking 13.6
paper 13.5
women 13.4
indoors 13.2
teamwork 13
boss 12.4
career 12.3
performer 11.9
holding 11.6
cheerful 11.4
modern 11.2
pretty 11.2
conference 10.8
face 10.7
corporation 10.6
human 10.5
reading 10.5
standing 10.4
education 10.4
company 10.2
notebook 10.1
occupation 10.1
color 10
hand 9.9
interior 9.7
grand piano 9.7
technology 9.6
couple 9.6
ethnic 9.5
book 9.5
guy 9.3
casual 9.3
two 9.3
staff 9.1
entertainer 9
building 8.9
together 8.8
colleagues 8.7
piano 8.7
busy 8.7
diversity 8.6
educator 8.6
formal 8.6
model 8.6
document 8.4
friendly 8.2
employee 7.8
jacket 7.8
partner 7.7
chair 7.7
leadership 7.7
youth 7.7
horizontal 7.5
contemporary 7.5
clothing 7.5
family 7.1
student 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 98.6
wall 98.5
clothing 94.3
indoor 89.8
woman 89.5
smile 85.5
text 84.1
human face 74.7
suit 61.2

Face analysis

AWS Rekognition

Age 49-57
Gender Male, 99.8%
Calm 93.5%
Confused 2.2%
Angry 1.6%
Surprised 0.9%
Disgusted 0.8%
Sad 0.6%
Fear 0.2%
Happy 0.2%

AWS Rekognition

Age 35-43
Gender Male, 99.8%
Calm 72.7%
Surprised 10.1%
Sad 5.9%
Angry 4.2%
Confused 3%
Happy 1.6%
Fear 1.4%
Disgusted 1.1%

AWS Rekognition

Age 23-31
Gender Female, 100%
Surprised 44.9%
Calm 32.1%
Fear 14.7%
Angry 3.6%
Happy 1.9%
Sad 1.3%
Disgusted 1.1%
Confused 0.5%
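Each AWS Rekognition face record above reports a full emotion distribution summing to roughly 100%. A minimal sketch of reducing such a distribution to its dominant emotion, with scores copied from the third face record (Gender Female, 100%):

```python
# Emotion scores copied from the third AWS Rekognition face record above.
emotions = {
    "Surprised": 44.9, "Calm": 32.1, "Fear": 14.7, "Angry": 3.6,
    "Happy": 1.9, "Sad": 1.3, "Disgusted": 1.1, "Confused": 0.5,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(emotions))  # → ('Surprised', 44.9)
```

Note that for this face the top two emotions (Surprised 44.9%, Calm 32.1%) are close, so a single dominant label understates the model's uncertainty.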

Microsoft Cognitive Services

Age 23
Gender Female

Microsoft Cognitive Services

Age 49
Gender Male

Microsoft Cognitive Services

Age 29
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Tie 91.6%
Suit 80.6%

Captions

Microsoft

Clifford Odets et al. standing in front of a mirror 83%
Clifford Odets standing in front of a mirror posing for the camera 82.9%
Clifford Odets standing in front of a mirror posing for the camera 82.8%