Human Generated Data

Title

Untitled (two women talking, Wedding, Pennsylvania)

Date

1941, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.306

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.7
Human 99.7
Person 98.8
Dance Pose 89.2
Leisure Activities 89.2
Accessory 72.4
Accessories 72.4
Dance 58.8
Jewelry 58.7

Imagga
created on 2022-01-08

person 38.6
adult 38.3
sexy 35.3
attractive 33.6
portrait 33
people 31.8
sensual 28.2
hair 27.7
fashion 27.1
model 25.7
pretty 24.5
happy 24.4
brunette 24.4
love 23.7
black 22.2
waiter 21.3
couple 20.9
face 20.6
style 19.3
studio 19
man 18.8
passion 18.8
male 18.5
lady 17.9
body 17.6
happiness 17.2
dining-room attendant 17.1
erotic 17.1
dark 16.7
sensuality 16.3
elegance 16
cute 15.8
smile 15.7
employee 15.3
posing 15.1
clothing 13.9
youth 13.6
skin 13.5
human 13.5
smiling 13
make 12.7
dress 12.6
romantic 12.5
lips 12
looking 12
women 11.9
suit 11.8
boyfriend 11.6
seductive 11.5
expression 11.1
two 11
mother 10.8
girlfriend 10.6
together 10.5
wife 10.4
groom 10.3
sitting 10.3
relationship 10.3
casual 10.2
20s 10.1
makeup 10.1
gorgeous 10
retro 9.8
passionate 9.8
handsome 9.8
lovely 9.8
one 9.7
lingerie 9.6
elegant 9.4
feminine 9.3
vintage 9.1
home 8.8
lifestyle 8.7
desire 8.7
eyes 8.6
worker 8.6
bed 8.5
enjoy 8.5
lying 8.5
fun 8.2
husband 8.2
cheerful 8.1
child 8
lover 7.8
vogue 7.7
modern 7.7
adolescent 7.7
bride 7.7
jeans 7.6
hand 7.6
relaxation 7.5
guy 7.4
hat 7.4
wedding 7.4
indoor 7.3
girls 7.3
disk jockey 7.2
romance 7.1
night 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.5
person 99.3
clothing 96.9
woman 93.3
human face 90.8
window 81.3
drawing 65.7
smile 60.3
retro 54.6
sketch 53.4
old 42.1
picture frame 15.6

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 99.9%
Calm 60.7%
Sad 15.3%
Fear 14.9%
Angry 4.8%
Confused 1.9%
Disgusted 1.3%
Surprised 0.8%
Happy 0.3%

AWS Rekognition

Age 23-31
Gender Female, 99.9%
Calm 77.7%
Sad 11.9%
Confused 4.9%
Angry 3.3%
Fear 0.8%
Disgusted 0.7%
Surprised 0.6%
Happy 0.2%

Microsoft Cognitive Services

Age 19
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

an old photo of a woman 93.6%
a woman sitting next to a window 77.3%
old photo of a woman 77.2%