Human Generated Data

Title

Untitled (Campbell Soup advertisement: couple getting into chauffeured car)

Date

1937

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5294

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Human 98
Person 98
Person 97
Apparel 94.6
Clothing 94.6
Person 89.4
Person 86.3
Helmet 83.5
Person 69.2
Hat 59
Art 58.8
Drawing 58.8
Robe 58.8
Fashion 58.8
Sketch 56.5

Imagga
created on 2022-01-22

adult 30.6
people 27.9
business 26.7
person 26.5
professional 25.8
medical 25.6
coat 25.4
male 24.8
businessman 23.8
doctor 23.5
work 22.8
office 22.7
medicine 22
man 21.5
lab coat 21.2
portrait 19.4
worker 19
businesspeople 19
corporate 18.9
hospital 18.9
men 18
clothing 17.9
nurse 17.7
care 17.3
clinic 17.2
job 16.8
looking 16.8
uniform 16.7
health 16
occupation 15.6
businesswoman 15.4
attractive 15.4
happy 15
working 15
computer 14.4
smile 13.5
laboratory 13.5
profession 13.4
desk 13.3
table 13
smiling 12.3
manager 12.1
day 11.8
indoors 11.4
one 11.2
home 11.2
casual 11
laptop 10.9
face 10.7
room 10.6
human 10.5
suit 10.3
film 10.3
executive 10.1
model 10.1
indoor 10
bright 10
garment 9.9
modern 9.8
colleagues 9.7
lab 9.7
serious 9.5
20s 9.2
lady 8.9
staff 8.9
women 8.7
instrument 8.6
boutique 8.6
research 8.6
clothes 8.4
color 8.3
holding 8.3
confident 8.2
negative 8.1
team 8.1
group 8.1
success 8
light 8
associates 7.9
stethoscope 7.9
brunette 7.8
clinical 7.8
physician 7.8
businessmen 7.8
assistant 7.8
refrigerator 7.8
specialist 7.7
corporation 7.7
pretty 7.7
white goods 7.6
hand 7.6
focus 7.4
jacket 7.4
alone 7.3
tool 7.2

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 97.2
black and white 90.8
sketch 83.7
drawing 74.4
appliance 55.9
old 52.2

Face analysis

AWS Rekognition

Age 52-60
Gender Male, 79.5%
Calm 81.3%
Happy 5.4%
Confused 4%
Angry 2.7%
Disgusted 2.5%
Surprised 1.9%
Sad 1.6%
Fear 0.5%

AWS Rekognition

Age 28-38
Gender Male, 85.2%
Surprised 33%
Calm 16.2%
Confused 14.3%
Sad 12.2%
Happy 8.6%
Angry 6.2%
Fear 5.2%
Disgusted 4.2%

AWS Rekognition

Age 18-24
Gender Male, 95.8%
Calm 85.1%
Happy 8.1%
Confused 4.4%
Angry 0.9%
Surprised 0.5%
Sad 0.4%
Disgusted 0.4%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98%
Helmet 83.5%
Hat 59%

Captions

Microsoft

a man standing in front of a refrigerator 64.7%
an old photo of a man 64.6%
a man standing next to a refrigerator 60.2%

Text analysis

Amazon

5893
25.04

Google

5893
5893