Human Generated Data

Title

Untitled (couple talking with woman at party)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4947

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.6
Human 99.6
Person 99.3
Person 99.2
Clothing 95.1
Apparel 95.1
Accessories 93.6
Accessory 93.6
Sunglasses 93.6
Face 83.4
Female 82.5
Drink 73
Beverage 73
Woman 67.1
Finger 67.1
People 63.9
Dating 62.1
Sleeve 61.7
Girl 61.7
Dress 60.3
Photography 60.1
Photo 60.1
Crowd 59
Alcohol 55.6
Sitting 55.5
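The number beside each tag is the service's detection confidence on a 0–100 scale (the Feature analysis section below prints the same values with a % sign). A minimal sketch, in plain Python, of how such scores might be thresholded to keep only high-confidence labels; the tag subset is copied from the list above and the 90-point cutoff is an arbitrary illustrative choice:

```python
# Rekognition-style tags as (label, confidence) pairs,
# copied from the list above (subset only).
tags = [
    ("Person", 99.6), ("Clothing", 95.1), ("Sunglasses", 93.6),
    ("Face", 83.4), ("Female", 82.5), ("Drink", 73.0),
    ("Crowd", 59.0), ("Sitting", 55.5),
]

def confident(tags, threshold=90.0):
    """Keep only labels at or above the confidence threshold."""
    return [label for label, score in tags if score >= threshold]

print(confident(tags))  # → ['Person', 'Clothing', 'Sunglasses']
```

Lowering the threshold trades precision for recall: at 80.0 the list would also pick up "Face" and "Female".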

Imagga
created on 2022-01-23

people 34.6
man 34.2
person 32
male 26.9
business 23.7
businessman 22.9
adult 19.4
work 19
senior 18.7
office 17.2
team 15.2
professional 14.9
couple 14.8
executive 14.7
happy 14.4
portrait 14.2
looking 13.6
human 13.5
group 12.9
worker 12.6
old 12.5
together 12.3
manager 12.1
hand 11.4
world 10.5
brass 10.4
men 10.3
mature 10.2
two 10.2
job 9.7
love 9.5
sitting 9.4
desk 9.4
meeting 9.4
casual 9.3
planner 9.1
businesswoman 9.1
medical 8.8
women 8.7
smiling 8.7
education 8.7
elderly 8.6
corporate 8.6
doctor 8.5
wind instrument 8.4
black 8.4
teamwork 8.3
health 8.3
suit 8.2
care 8.2
success 8
family 8
working 7.9
indoors 7.9
smile 7.8
newspaper 7.7
student 7.7
youth 7.7
communication 7.6
patient 7.5
room 7.5
silhouette 7.4
musical instrument 7.4
lady 7.3
teacher 7.3
computer 7.2
handsome 7.1
face 7.1
jacket 7.1
medicine 7
nurse 7
modern 7
look 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 99.5
clothing 92.5
man 83.2
standing 79.5
black and white 69.4
text 65.2
human face 63

Face analysis

AWS Rekognition

Age 26-36
Gender Female, 60.6%
Angry 40.1%
Happy 20.2%
Sad 14.6%
Disgusted 6.5%
Surprised 6.4%
Confused 5.1%
Calm 4.8%
Fear 2.2%
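The eight emotion scores approximately sum to 100, so they behave like a probability distribution over expressions, and the top entry is the model's single best guess. A short sketch (plain Python, values copied from the record above) of picking the dominant emotion:

```python
# Rekognition face-emotion scores from the record above (percent).
emotions = {
    "Angry": 40.1, "Happy": 20.2, "Sad": 14.6, "Disgusted": 6.5,
    "Surprised": 6.4, "Confused": 5.1, "Calm": 4.8, "Fear": 2.2,
}

# Scores sum to ~100, so the largest one is the single best guess.
dominant = max(emotions, key=emotions.get)
print(dominant, emotions[dominant])  # → Angry 40.1
```

Note the caveat visible in this very record: the top score is only 40.1%, and Google Vision (below) rates anger "Very unlikely" for the same face, so a dominant label this weak is best treated as uncertain.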

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Sunglasses 93.6%

Captions

Microsoft

a group of people standing in a room 93.8%
a couple of people that are standing in a room 88.1%
a group of people standing around each other 86.7%

Text analysis

Amazon

12664.

Google

12664. 12664.
12664.