Human Generated Data

Title

Untitled (Campbell Soup advertisement: couple getting into chauffeured car)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5298

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.1
Human 99.1
Person 98.5
Person 97.7
Drawing 93.5
Art 93.5
Sketch 90.2
Chef 59.3

Clarifai
created on 2023-10-26

people 99.1
adult 98.5
man 97.6
woman 96.7
monochrome 95.6
wear 91.4
lid 88.7
vehicle 88.6
veil 88.1
group 85.8
two 85.2
uniform 84.0
portrait 78.4
transportation system 77.8
chalk out 76.0
no person 74.8
medical practitioner 74.5
illustration 68.8
desktop 68.5
outfit 68

Imagga
created on 2022-01-22

snow 33.6
winter 27.2
negative 26.9
cold 24.1
film 22.3
ice 22.2
sketch 21.4
people 18.4
drawing 17.3
person 16.9
photographic paper 16.5
shovel 14.6
weather 14
car 13.7
representation 12.9
man 12.8
business 12.7
work 11.8
sport 11.5
frozen 11.5
human 11.2
photographic equipment 11
tool 10.8
vehicle 10.6
glass 10.6
businessman 10.6
hand tool 10.4
men 10.3
male 9.9
travel 9.9
worker 9.8
freeze 9.8
fun 9.7
medical 9.7
adult 9.7
medicine 9.7
group 9.7
landscape 9.7
frost 9.6
season 9.3
holiday 9.3
professional 9.3
equipment 9.2
snowy 8.7
water 8.7
coat 8.5
doctor 8.5
health 8.3
outdoors 8.2
mountain 8.1
team 8.1
white 8
working 7.9
drink 7.5
clean 7.5
one 7.5
technology 7.4
occupation 7.3
transport 7.3
transportation 7.2
activity 7.2
cool 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

drawing 96.8
text 96.6
sketch 96.1
black and white 65.6
clothing 63.7
appliance 62.2
person 58.9
linedrawing 21.5

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 97%
Calm 99.4%
Happy 0.2%
Confused 0.1%
Disgusted 0.1%
Sad 0.1%
Angry 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 24-34
Gender Male, 86.9%
Sad 46.1%
Angry 39%
Calm 8.7%
Fear 2.7%
Happy 1.9%
Surprised 0.6%
Disgusted 0.6%
Confused 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Categories

Imagga

paintings art 99.9%

Text analysis

Amazon

5888
Y133A2
3EAB Y133A2 A30M3330
3EAB
A30M3330

Google

888 5 BEAB YT3HA2 R3OH3330
888
5
BEAB
YT3HA2
R3OH3330