Human Generated Data

Title

Untitled (women gathered around table at Tupperware party)

Date

1959

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7203

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 98.8
Person 98.8
Person 98.3
Person 94.1
Person 89.6
Clothing 85.3
Apparel 85.3
Indoors 75.3
Room 75.3
Furniture 70.4
Person 69.3
Clinic 68.8
People 64.5
Female 60.2
Shorts 56.3
Building 56.3
Table 55.3
Person 53.3

Imagga
created on 2022-01-08

bass 21.1
people 20.1
person 18.9
man 18.8
work 17.5
photographer 15.6
musical instrument 15.4
male 14.9
professional 14.2
technology 14.1
music 13.6
wind instrument 13.5
team 13.4
play 12.9
brass 12.8
instrument 12.2
business 12.1
art 11.8
equipment 11.5
design 11.2
silhouette 10.7
negative 10.7
film 10.5
grunge 10.2
black 10.2
symbol 10.1
hand 9.9
modern 9.8
science 9.8
medical 9.7
crowd 9.6
musical 9.6
glass 9.3
event 9.2
player 9.2
studio 9.1
fashion 9
sexy 8.8
computer 8.8
medicine 8.8
audience 8.8
concert 8.7
rock 8.7
men 8.6
doctor 8.5
power 8.4
health 8.3
musician 8.3
occupation 8.2
worker 8.2
adult 8.1
digital 8.1
graphic 8
working 7.9
businessman 7.9
smile 7.8
device 7.8
party 7.7
coat 7.7
hot 7.5
drink 7.5
human 7.5
world 7.5
lights 7.4
style 7.4
teamwork 7.4
paint 7.2
decoration 7.2
bright 7.1
portrait 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

person 85.6
text 77.9
clothing 75.5

Face analysis

Amazon

Google

AWS Rekognition

Age 37-45
Gender Male, 97.7%
Calm 96.6%
Sad 2.8%
Confused 0.2%
Happy 0.2%
Surprised 0.1%
Disgusted 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 23-31
Gender Female, 99.9%
Calm 55.1%
Happy 41%
Sad 1.5%
Surprised 0.8%
Fear 0.6%
Angry 0.4%
Confused 0.3%
Disgusted 0.3%

AWS Rekognition

Age 37-45
Gender Female, 85%
Calm 84.7%
Happy 12.7%
Sad 0.9%
Confused 0.7%
Disgusted 0.4%
Angry 0.2%
Surprised 0.2%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%

Captions

Microsoft

a group of people standing in front of a window 66%
a group of people in front of a window 65.9%
a group of people sitting in front of a window 52%

Text analysis

Amazon

Brties

Google

3rties
3rties