Human Generated Data

Title

Untitled (two junior EA Corps. women and couple)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4482

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 98.6
Person 98.6
Person 97.2
Person 95.4
Clothing 90.8
Apparel 90.8
Hat 90.8
Person 89.6
Person 88.7
Leisure Activities 75.6
Person 74.3
Clinic 63
Person 60.4
Robot 57.3
Costume 56.4
Scientist 55.4
Crowd 55.2

Imagga
created on 2022-01-23

person 23
people 21.2
brass 20.7
man 20.2
art 18.6
wind instrument 16.9
human 15.7
silhouette 15.7
cornet 14.6
party 14.6
fun 12.7
style 12.6
medical 12.4
fashion 12.1
cartoon 11.6
musical instrument 11.5
medicine 11.4
biology 11.4
male 11.3
grunge 11.1
coat 10.9
design 10.7
science 10.7
bass 10.4
play 10.3
music 10.2
decoration 10.1
dance 10
studio 9.9
hand 9.9
team 9.9
worker 9.8
health 9.7
negative 9.7
black 9.6
instrument 9.6
professional 9.6
work 9.4
holiday 9.3
adult 9.2
sport 9.1
retro 9
lab 8.7
laboratory 8.7
winter 8.5
doctor 8.5
modern 8.4
one 8.2
equipment 8.2
technology 8.2
happy 8.1
graphic 8
celebration 8
player 7.8
scientist 7.8
rock 7.8
development 7.8
scientific 7.7
gift 7.7
chemistry 7.7
chemical 7.7
research 7.6
club 7.5
costume 7.5
clothing 7.4
event 7.4
lady 7.3
dress 7.2
body 7.2
active 7.2
hair 7.1
working 7.1
happiness 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.8
black and white 83.2
sketch 82.4
drawing 81.2
clothing 79.4
posing 70.4
cartoon 64.3
person 61.3
bowed instrument 8.3

Face analysis

Amazon

Google

AWS Rekognition

Age 50-58
Gender Male, 99.8%
Calm 76.5%
Confused 10.3%
Surprised 5%
Happy 3.5%
Disgusted 1.6%
Angry 1.5%
Sad 1.2%
Fear 0.5%

AWS Rekognition

Age 29-39
Gender Female, 95.6%
Sad 84.5%
Confused 8.4%
Calm 2.7%
Happy 1.8%
Disgusted 0.9%
Angry 0.8%
Fear 0.7%
Surprised 0.3%

AWS Rekognition

Age 24-34
Gender Male, 90.4%
Calm 67.2%
Fear 15.1%
Surprised 11%
Sad 2.6%
Happy 1.9%
Confused 0.8%
Angry 0.7%
Disgusted 0.6%

AWS Rekognition

Age 40-48
Gender Male, 91.9%
Happy 97.8%
Confused 0.8%
Sad 0.4%
Disgusted 0.3%
Calm 0.3%
Surprised 0.2%
Fear 0.1%
Angry 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.6%
Hat 90.8%

Captions

Microsoft

a person holding a guitar 53.3%
a group of men posing for a photo 53.2%
a person standing next to a guitar 50.3%

Text analysis

Amazon

Hospital
EA
JUNIOR
21448
CORPS
Ciscopal

Google

31448.
tosplal
31448. tosplal