Human Generated Data

Title

Untitled (formally dressed man and woman dancing)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5650

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Clothing 100
Apparel 100
Person 99.2
Human 99.2
Person 98.4
Robe 96
Fashion 96
Gown 95.1
Wedding 94
Person 93.8
Person 93.2
Bride 88.2
Wedding Gown 88.2
Bridegroom 82.2
Female 72.5
Woman 60.7
Text 56.4
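
The record does not say how these label/confidence pairs were produced, but they match the output shape of AWS Rekognition's DetectLabels API. Below is a minimal sketch of how similar output could be obtained with boto3; the file name, region, and thresholds are placeholders, not values from the record.

```python
# Sketch: label detection with AWS Rekognition (boto3), assuming a local copy
# of the photograph. Not the museum's actual pipeline.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical local image file
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=50,  # the list above cuts off in the mid-50s
)

# Print each label with its confidence, mirroring the "Clothing 100" style above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```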

Imagga
created on 2021-12-15

person 29.9
people 27.9
groom 24.7
negative 24.2
portrait 22
attractive 21.7
face 19.9
film 19.4
adult 18.9
sexy 18.5
happy 18.2
fashion 18.1
pretty 16.1
human 15.7
male 15.6
bride 14.5
photographic paper 14.2
man 14.1
hair 13.5
love 13.4
black 13.2
model 13.2
cute 12.9
smile 12.8
dress 12.6
blond 12.6
professional 12.1
eyes 12
skin 11.8
happiness 11.7
science 11.6
lady 11.4
lifestyle 10.8
romantic 10.7
couple 10.5
men 10.3
women 10.3
wedding 10.1
sensuality 10
cheerful 9.8
medical 9.7
chemistry 9.7
photographic equipment 9.6
smiling 9.4
light 9.4
makeup 9.1
studio 9.1
student 9.1
health 9
technology 8.9
color 8.9
style 8.9
looking 8.8
body 8.8
instrument 8.8
lab 8.7
scientific 8.7
laboratory 8.7
education 8.7
work 8.6
research 8.6
biology 8.5
club 8.5
modern 8.4
joy 8.4
care 8.2
photographer 8.1
coat 8
lovely 8
microscope 7.9
holiday 7.9
scientist 7.8
chemical 7.8
party 7.7
two 7.6
elegance 7.6
study 7.5
girls 7.3
dance 7.3
worker 7.2
medicine 7
together 7
look 7
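
A hedged sketch of how a tag list like the Imagga block above might be generated with Imagga's public v2 tagging endpoint. The endpoint path, credentials, image URL, and the response structure shown in the comment are assumptions based on Imagga's documented REST API, not details taken from this record.

```python
# Sketch: image tagging via Imagga's REST API (assumed v2 /tags endpoint).
import requests

IMAGGA_KEY = "your_api_key"        # placeholder credentials
IMAGGA_SECRET = "your_api_secret"  # placeholder credentials

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz_untitled.jpg"},  # hypothetical URL
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Assumed response shape:
# {"result": {"tags": [{"confidence": 29.9, "tag": {"en": "person"}}, ...]}}
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```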

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99.7
wedding 92.2
clothing 89.6
person 85.2
woman 76.5
wedding dress 73.8
human face 64.3
bride 54.2
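
The Microsoft tags above could come from the Azure Computer Vision "analyze" REST endpoint with the Tags visual feature; the sketch below shows one way to call it. The endpoint URL, subscription key, API version, and image URL are placeholders/assumptions, not values from the record.

```python
# Sketch: image tagging via Azure Computer Vision REST API (Tags feature).
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                      # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/steinmetz_untitled.jpg"},    # hypothetical URL
)
resp.raise_for_status()

# Confidences come back as 0-1 floats; the record shows them as percentages.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```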

Face analysis

AWS Rekognition

Age 29-45
Gender Female, 93%
Happy 79.5%
Calm 18.1%
Sad 1%
Confused 0.7%
Angry 0.4%
Surprised 0.3%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 20-32
Gender Female, 50.8%
Calm 46.1%
Sad 27.9%
Angry 15.4%
Confused 5.2%
Surprised 3.8%
Fear 0.7%
Happy 0.5%
Disgusted 0.2%

AWS Rekognition

Age 49-67
Gender Female, 56.2%
Sad 74%
Calm 15.5%
Fear 5.2%
Confused 3.9%
Happy 0.6%
Surprised 0.5%
Angry 0.2%
Disgusted 0.1%
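
The age range, gender, and emotion blocks above match the output of AWS Rekognition's DetectFaces API when all face attributes are requested. A minimal sketch follows; the image source is a placeholder, and this is not necessarily the pipeline used for the record.

```python
# Sketch: face analysis with AWS Rekognition DetectFaces (boto3).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical local image file
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are returned unsorted; sort to match the descending lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```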

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely
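
The likelihood ratings above (surprise, anger, sorrow, joy, headwear, blurred) are the standard fields of a Google Cloud Vision face annotation. A minimal sketch of that call is below; the file path is a placeholder.

```python
# Sketch: face detection with the Google Cloud Vision Python client.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical local image file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood fields are enum members such as VERY_UNLIKELY or POSSIBLE.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```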

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a group of people standing around each other 73.4%
a group of people around each other 73.3%
a group of people standing in front of a wall 73.2%
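
Multiple caption candidates with confidences, as shown above, are what the Azure Computer Vision "describe" endpoint returns. A hedged sketch follows; endpoint, key, candidate count, and image URL are placeholders.

```python
# Sketch: caption generation via Azure Computer Vision REST API (Describe).
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                      # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/steinmetz_untitled.jpg"},    # hypothetical URL
)
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}')
```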

Text analysis

Amazon

14608
809ml
A7BA
ИА9ЯЗ9U2 A7BA
ИА9ЯЗ9U2
IL
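
The strings above (negative numbers and mirrored film-edge markings) are the kind of output AWS Rekognition's DetectText API produces. A minimal sketch is below; the image source is a placeholder.

```python
# Sketch: text detection with AWS Rekognition DetectText (boto3).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical local image file
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections give whole strings; WORD detections give individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```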

Google

14608. 14606
14608.
14606
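
The Google text detections above could be produced with Cloud Vision's text detection API, sketched minimally below; the file path is a placeholder.

```python
# Sketch: text detection with the Google Cloud Vision Python client.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical local image file
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected block; the rest are individual tokens.
for annotation in response.text_annotations:
    print(annotation.description)
```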