Human Generated Data

Title

Untitled (two women talking at wedding reception)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8409

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99.3
Person 99.3
Person 99.2
Clothing 98.7
Apparel 98.7
Sleeve 85.3
Meal 79.2
Food 79.2
Female 77.7
Face 77.1
Coat 75.3
Suit 75.3
Overcoat 75.3
Dish 68.7
Dessert 67.7
Cake 67.7
Creme 67.7
Icing 67.7
Cream 67.7
Long Sleeve 62
Furniture 61.6
Girl 60.8
Woman 59.9
Table 59.7
Shirt 56
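
The label list above has the shape of AWS Rekognition's DetectLabels output (label name plus a confidence score in percent). A minimal sketch of how such tags could be reproduced with boto3 follows; the filename and the confidence threshold are assumptions, not part of this record.

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("steinmetz_wedding_reception.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # assumed threshold; the lowest score above is 56
)

# Each label carries a name and a confidence in percent,
# matching the "Human 99.3" / "Person 99.3" rows above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```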

Imagga
created on 2022-01-09

man 37.6
male 26.9
sax 26.8
person 21.1
people 20.6
adult 19.3
men 18.9
world 17.2
professional 17.1
sky 15.9
business 14.6
work 13.3
businessman 13.2
active 12.6
sea 12.5
beach 12
worker 11.7
ocean 11.6
portrait 11
sport 10.8
job 10.6
outdoors 10.4
looking 10.4
wind instrument 10.1
lifestyle 10.1
happy 10
outdoor 9.9
sunset 9.9
travel 9.8
couple 9.6
guy 9.5
women 9.5
career 9.5
water 9.3
one 8.9
life 8.8
standing 8.7
corporate 8.6
executive 8.5
human 8.2
fun 8.2
industrial 8.2
suit 8.1
equipment 8
engineer 7.9
love 7.9
holiday 7.9
boy 7.8
summer 7.7
outside 7.7
old 7.7
winter 7.6
relax 7.6
leisure 7.5
manager 7.4
building 7.4
technology 7.4
vacation 7.4
office 7.3
success 7.2
smiling 7.2
black 7.2
coast 7.2
smile 7.1
modern 7
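
The tag list above matches the response shape of Imagga's v2 /tags endpoint. A minimal sketch, assuming placeholder credentials and a hypothetical image URL:

```python
import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz_8409.jpg"},  # hypothetical URL
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # Imagga uses HTTP Basic auth
)
response.raise_for_status()

# Tags arrive as {"confidence": float, "tag": {"en": str}}, which
# flattens to the "man 37.6" style rows shown above.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```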

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 94.3
text 94.1
black and white 93.3
clothing 92.1
human face 91.7
drawing 71.9
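
The tags above match the output of Azure Computer Vision's analyze endpoint. A minimal sketch, assuming a v3.2 deployment with a placeholder endpoint, key, and filename:

```python
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                     # placeholder

with open("steinmetz_wedding_reception.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

# Confidence is returned on a 0-1 scale, so scale by 100 to match
# the "person 94.3" style rows above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```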

Face analysis

AWS Rekognition

Age 48-56
Gender Male, 99.8%
Sad 57.5%
Calm 25.6%
Surprised 11.2%
Happy 3.1%
Confused 0.9%
Angry 0.7%
Fear 0.6%
Disgusted 0.5%

AWS Rekognition

Age 51-59
Gender Male, 99.6%
Calm 75.6%
Happy 6.7%
Surprised 5.6%
Confused 4.3%
Sad 2.7%
Angry 2.3%
Disgusted 2%
Fear 0.9%
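
The two per-face blocks above (age range, gender, ranked emotions) have the shape of AWS Rekognition's DetectFaces response with all facial attributes requested. A minimal sketch, with a placeholder image file:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_wedding_reception.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get age range, gender, and emotions
)

# One FaceDetails entry per detected face; the record above shows two.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```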

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
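
The likelihood buckets above ("Very unlikely" through "Possible") are Google Cloud Vision's face-detection enums. A minimal sketch using the google-cloud-vision client, assuming configured Google credentials and a placeholder filename:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_wedding_reception.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum (VERY_UNLIKELY .. VERY_LIKELY),
# rendered above as "Very unlikely", "Possible", and so on.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```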

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a person standing in front of a mirror posing for the camera 65.7%
a person standing in front of a mirror 65.6%
a group of people standing in front of a mirror posing for the camera 61.9%
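
The three ranked captions above match Azure Computer Vision's describe endpoint, which returns candidate captions with confidences on a 0-1 scale. A minimal sketch, again with placeholder endpoint, key, and filename:

```python
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                     # placeholder

with open("steinmetz_wedding_reception.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},  # the record lists three candidate captions
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```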

Text analysis

Amazon

12157
12157.
IS3
**************
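
The detected strings above have the shape of AWS Rekognition's DetectText output; on an archival print like this one they are mostly frame numbers and film edge markings. A minimal sketch, with a placeholder filename:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_wedding_reception.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# DetectText returns both LINE and WORD detections; printing lines
# only avoids listing every word twice.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```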

Google

Lal
12151.
12157.
12151. Lal 12157. YT37A2- NAMT
YT37A2-
NAMT
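
The Google rows above follow Cloud Vision's text_detection output, where one annotation is the full detected block ("12151. Lal 12157. YT37A2- NAMT") and the rest are the individual tokens. A minimal sketch, assuming configured credentials and a placeholder filename:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_wedding_reception.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# text_annotations[0].description is the full block; subsequent
# entries are the individual detected tokens.
for annotation in response.text_annotations:
    print(annotation.description)
```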