Human Generated Data

Title

Untitled (couple dancing)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4946

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Clothing 99.7
Apparel 99.7
Human 99.5
Person 99.5
Person 99.1
Person 98.4
Dress 97.5
Person 97.1
Person 93.7
Female 91
Sleeve 83.8
Face 80.6
Woman 77.2
Suit 74.7
Coat 74.7
Overcoat 74.7
Leisure Activities 74
Fashion 71.2
Gown 71.2
Wedding 70.9
Bridegroom 70.9
Dance Pose 70.8
Shirt 69.9
Accessories 69.6
Accessory 69.6
Robe 67.2
Photo 65.1
Photography 65.1
Man 64
Text 63.1
Glasses 62.5
Pants 62.1
Jewelry 59.4
Hug 59
Wedding Gown 58.4
Long Sleeve 58.2
Tie 56.4
Finger 55.9
Evening Dress 55.4
Girl 55.1
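
The Amazon tags above read like AWS Rekognition label-detection output (label name plus confidence score). A minimal sketch of how such tags might be produced with the DetectLabels API via boto3 is shown below; the file path and MinConfidence threshold are illustrative assumptions, not part of this record.

```python
# Hypothetical sketch: label tags like those above via AWS Rekognition DetectLabels.
# The image path and MinConfidence value are assumptions for illustration only.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_couple_dancing.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the lowest confidence listed above is roughly 55
)

for label in response["Labels"]:
    # Prints lines in the same shape as the list above, e.g. "Clothing 99.7"
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```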

Imagga
created on 2022-01-23

brass 33.2
people 29
wind instrument 27.1
adult 23.6
person 22.4
male 20.6
man 20.1
musical instrument 18.9
bride 17.4
portrait 16.2
love 15
cornet 14.8
wedding 14.7
face 13.5
sexy 12.8
couple 12.2
device 12.1
happy 11.9
model 11.7
professional 11.6
black 11.4
fashion 11.3
bass 11.2
men 11.2
women 11.1
dress 10.8
groom 10.6
human 10.5
pretty 10.5
work 10.2
happiness 10.2
two 10.2
attractive 9.8
lady 9.7
health 9.7
businessman 9.7
clothing 9.6
body 9.6
brunette 9.6
hair 9.5
party 9.4
business 9.1
romantic 8.9
style 8.9
life 8.9
bridal 8.7
serious 8.6
marriage 8.5
art 8.4
old 8.4
event 8.3
valentine 8.2
cheerful 8.1
light 8
science 8
celebration 8
job 8
medical 7.9
horn 7.9
smile 7.8
ceremony 7.8
wife 7.6
active 7.6
bouquet 7.5
holding 7.4
technology 7.4
sport 7.4
occupation 7.3
worker 7.3
patient 7.2
student 7.2
lifestyle 7.2
suit 7.2
team 7.2
dance 7
together 7
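
The Imagga tags above follow the shape of Imagga's v2 tagging endpoint (English tag plus confidence). A sketch of such a request follows; the credentials and image URL are placeholders, and the response layout is assumed from Imagga's documented v2 format (result.tags[].confidence / .tag.en).

```python
# Hypothetical sketch: tags like those above from the Imagga v2 /tags endpoint.
# API key, secret, and image URL are placeholders, not values from this record.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz_couple_dancing.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Prints lines in the same shape as the list above, e.g. "brass 33.2"
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```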

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 97.6
standing 96.6
wedding dress 90.2
text 89.7
clothing 89.5
dress 88.2
bride 87.7
human face 86.1
people 85.7
woman 81.7
wedding 78.1
old 73.7
posing 71.8
black and white 69.3
smile 66.5
dance 50
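
The Microsoft tags above resemble the tag output of the Azure Computer Vision image-analysis API. A sketch follows, assuming the azure-cognitiveservices-vision-computervision SDK; the endpoint, subscription key, and image URL are placeholders.

```python
# Hypothetical sketch: tags like those above via Azure Computer Vision analyze_image.
# Endpoint, key, and image URL are placeholders, not values from this record.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_subscription_key"),
)

analysis = client.analyze_image(
    "https://example.org/steinmetz_couple_dancing.jpg",
    visual_features=[VisualFeatureTypes.tags],
)

for tag in analysis.tags:
    # Confidence is returned in [0, 1]; scale to match the list above.
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```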

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 89.1%
Fear 56.9%
Surprised 17.7%
Happy 8.8%
Calm 6.9%
Angry 3.3%
Sad 2.5%
Disgusted 2.2%
Confused 1.7%

AWS Rekognition

Age 47-53
Gender Female, 81.4%
Calm 74.1%
Sad 12%
Happy 7.9%
Confused 1.9%
Angry 1.4%
Disgusted 1.1%
Fear 0.8%
Surprised 0.8%
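
The two AWS Rekognition entries above (age range, gender, and emotion scores) are the kind of per-face attributes returned by the DetectFaces API when all attributes are requested. A boto3 sketch follows; the file path is an illustrative assumption.

```python
# Hypothetical sketch: face attributes like the two entries above via AWS Rekognition
# DetectFaces with Attributes=["ALL"]. The image path is illustrative.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_couple_dancing.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        # e.g. "Fear 56.9%", "Surprised 17.7%", ...
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```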

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
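
The four Google Vision entries above are per-face likelihood ratings of the kind returned by Cloud Vision face detection. A sketch using the google-cloud-vision client follows; the file path is illustrative.

```python
# Hypothetical sketch: per-face likelihoods like those above via Cloud Vision
# face detection. The image path is illustrative.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_couple_dancing.jpg", "rb") as f:
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))


def pretty(likelihood) -> str:
    # e.g. VERY_UNLIKELY -> "Very unlikely", matching the listing above
    return vision.Likelihood(likelihood).name.replace("_", " ").capitalize()


for face in response.face_annotations:
    print("Surprise", pretty(face.surprise_likelihood))
    print("Anger", pretty(face.anger_likelihood))
    print("Sorrow", pretty(face.sorrow_likelihood))
    print("Joy", pretty(face.joy_likelihood))
    print("Headwear", pretty(face.headwear_likelihood))
    print("Blurred", pretty(face.blurred_likelihood))
```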

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a group of people posing for a photo 96%
a person standing in front of a group of people posing for a photo 91.7%
a person standing in front of a group of people posing for the camera 91.6%
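
The Microsoft captions above look like the description output of the Azure Computer Vision service. A sketch using describe_image follows, again with placeholder endpoint, key, and image URL.

```python
# Hypothetical sketch: candidate captions like those above via Azure Computer Vision
# describe_image. Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_subscription_key"),
)

description = client.describe_image(
    "https://example.org/steinmetz_couple_dancing.jpg",
    max_candidates=3,  # the record lists three candidate captions
)

for caption in description.captions:
    # e.g. "a group of people posing for a photo 96.0"
    print(f"{caption.text} {caption.confidence * 100:.1f}")
```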

Text analysis

Amazon

12659
65921
12659.
VT3RA2
65921 VT3RA2 - ИАМТА
- ИАМТА
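
The Amazon strings above (including the repeated and partial entries) match the output of the AWS Rekognition DetectText API, which returns both LINE and WORD detections. A boto3 sketch follows; the file path is illustrative.

```python
# Hypothetical sketch: detected text like the Amazon entries above via AWS Rekognition
# DetectText. The image path is illustrative.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_couple_dancing.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # Both LINE and WORD detections are returned, which is why strings repeat above.
    print(detection["Type"], detection["DetectedText"])
```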

Google

12659.
2659.
65921
12659. 2659. 65921
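
The Google strings above resemble Cloud Vision text detection, where the first annotation is the full detected block and the rest are individual strings. A sketch follows; the file path is illustrative.

```python
# Hypothetical sketch: OCR strings like the Google entries above via Cloud Vision
# text detection. The image path is illustrative.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_couple_dancing.jpg", "rb") as f:
    content = f.read()

response = client.text_detection(image=vision.Image(content=content))

for annotation in response.text_annotations:
    # The first annotation is the full detected block; the rest are individual strings.
    print(annotation.description)
```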