Human Generated Data

Title

Untitled (Campbell Soup advertisement: couple getting into chauffeured car)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11908

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Human 98.7
Person 96.8
Drawing 96.5
Art 96.5
Person 95.2
Sketch 92.5
Person 78.3
Helmet 74.9
Clothing 74.9
Apparel 74.9
Bus 74.5
Transportation 74.5
Vehicle 74.5
People 71.4
Person 69.1
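
The label/confidence pairs above match the output format of AWS Rekognition's DetectLabels operation. Below is a minimal sketch of producing such a listing with boto3; the image file name and region are assumptions for illustration, not details of the museum's pipeline.

import boto3

# Minimal sketch: reproduce a "label confidence" listing like the Amazon tags
# above with AWS Rekognition's DetectLabels API. File name and region are
# assumptions, used only for illustration.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_11908.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=50,
)

# Print each label with one decimal of confidence, most confident first.
for label in sorted(response["Labels"], key=lambda l: l["Confidence"], reverse=True):
    print(f'{label["Name"]} {label["Confidence"]:.1f}')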

Imagga
created on 2022-01-15

man 32.3
car 28
passenger 27.5
people 25.1
person 23.8
male 22.7
adult 18.8
travel 18.3
transportation 17.9
business 17.6
happy 17.5
groom 16.4
vehicle 16.1
automobile 14.4
women 14.2
smile 13.5
work 13.3
seller 13.3
businessman 12.4
smiling 12.3
office 12.2
men 12
professional 11.1
portrait 11
lifestyle 10.8
looking 10.4
building 10.4
sitting 10.3
black 10.2
transport 10
modern 9.8
group 9.7
success 9.6
couple 9.6
drive 9.5
worker 9.4
life 8.9
handsome 8.9
auto 8.6
architecture 8.6
corporate 8.6
businesspeople 8.5
casual 8.5
suit 8.4
occupation 8.2
20s 8.2
outdoors 8.2
road 8.1
job 8
standing 7.8
attractive 7.7
industry 7.7
hospital 7.7
winter 7.7
city 7.6
snow 7.6
walking 7.6
trip 7.5
doctor 7.5
engineer 7.5
side 7.5
manager 7.4
company 7.4
sketch 7.4
drawing 7.4
street 7.4
coat 7.2
team 7.2
motor vehicle 7.1
working 7.1
medical 7.1
happiness 7
together 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

drawing 99.2
sketch 98.8
text 97.3
person 95.2
outdoor 93.2
man 91.3
child art 75.3
painting 70.2
black and white 68.1
old 59.8
posing 50.3

Face analysis

AWS Rekognition

Age 49-57
Gender Female, 53%
Calm 97.4%
Surprised 0.8%
Confused 0.5%
Angry 0.3%
Sad 0.3%
Disgusted 0.3%
Fear 0.2%
Happy 0.1%

AWS Rekognition

Age 23-33
Gender Male, 90.6%
Surprised 35.2%
Fear 24.6%
Sad 14.9%
Confused 12.3%
Calm 9.7%
Disgusted 1.9%
Angry 0.8%
Happy 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
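
The age ranges, gender percentages, and emotion scores above follow the shape of AWS Rekognition's DetectFaces output. A minimal sketch with boto3 follows; the file name and region are assumptions.

import boto3

# Minimal sketch: face attributes (age range, gender, emotions) in the style of
# the AWS Rekognition blocks above. File name and region are assumptions.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_11908.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')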

Feature analysis

Amazon

Person 96.8%
Helmet 74.9%
Bus 74.5%

Captions

Microsoft

a group of people posing for a photo 88.6%
a group of people posing for the camera 88.5%
a group of men posing for a photo 86.5%
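
Captions with confidence scores like the three above are the kind of result returned by Azure Computer Vision's Describe Image operation. A minimal sketch using the Python SDK, assuming placeholder endpoint, key, and file name:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Minimal sketch: image captions with confidence scores, in the style of the
# Microsoft captions above. Endpoint, key, and file name are placeholders.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<your-key>"),              # placeholder key
)

with open("steinmetz_11908.jpg", "rb") as f:  # hypothetical local copy
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")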

Text analysis

Amazon

5889
YT33A2
830M3330
FEAB YT33A2 830M3330
FEAB

Google

6885 EAB YT33A2 A3OH330
6885
EAB
YT33A2
A3OH330
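
The strings under Text analysis resemble the output of an OCR-style text detection call such as AWS Rekognition's DetectText. A minimal sketch with boto3, using the same placeholder file name as above:

import boto3

# Minimal sketch: detected text strings in the style of the Amazon text
# analysis above. File name and region are assumptions.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_11908.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections group the individual WORD detections, as in the listing above.
for detection in response["TextDetections"]:
    print(f'{detection["Type"]}: {detection["DetectedText"]}')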