Human Generated Data

Title

Untitled (bird's-eye view of debutante reception)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8434

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.9
Human 98.9
Person 97.4
Person 97.3
Clothing 95.5
Apparel 95.5
Person 92.1
Collage 86.9
Advertisement 86.9
Poster 86.9
Outdoors 85.7
Nature 82.1
Chair 78
Furniture 78
Female 70.9
People 68.8
Crowd 68.7
Shorts 65.8
Face 65.6
Person 65.5
Girl 62.1
Person 60.8
Table 57
Painting 56.4
Art 56.4
Paper 55.7
Sand 55.2
Person 48.3
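The Amazon tags above resemble the name/confidence pairs returned by Rekognition's DetectLabels API. A minimal sketch, assuming a response in the documented DetectLabels shape (the sample fragment below is illustrative, echoing a few of the tags listed above), of how such a list might be extracted and filtered by confidence:

```python
def labels_above(response, min_confidence=55.0):
    """Flatten a Rekognition DetectLabels-style response into
    (name, confidence) pairs at or above a confidence threshold."""
    return [
        (label["Name"], round(label["Confidence"], 1))
        for label in response.get("Labels", [])
        if label["Confidence"] >= min_confidence
    ]

# Illustrative fragment shaped like Rekognition's DetectLabels output;
# the values mirror a few of the tags listed above.
sample = {
    "Labels": [
        {"Name": "Person", "Confidence": 98.9},
        {"Name": "Collage", "Confidence": 86.9},
        {"Name": "Sand", "Confidence": 55.2},
        {"Name": "Person", "Confidence": 48.3},
    ]
}

print(labels_above(sample))
```

Note that Rekognition can report the same label ("Person") once per detected instance, which is why it repeats at different confidences in the list above.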

Imagga
created on 2022-01-09

people 25.1
male 20.6
group 20.1
person 18.9
silhouette 18.2
sport 17.5
man 17.5
photographer 16.4
men 15.4
black 15.1
world 13.8
sexy 13.6
adult 12.5
player 12.4
fashion 12
girls 11.8
stage 11.6
party 11.2
women 11.1
city 10.8
team 10.7
athlete 10.7
crowd 10.6
style 10.4
portrait 10.3
clothing 10.3
art 10.2
music 9.9
business 9.7
happy 9.4
model 9.3
street 9.2
teen 9.2
teenager 9.1
attractive 9.1
pose 9
fun 9
posing 8.9
businessman 8.8
light 8.7
active 8.6
youth 8.5
dance 8.5
event 8.3
leisure 8.3
spectator 8
night 8
lifestyle 7.9
child 7.9
boy 7.8
play 7.7
travel 7.7
performance 7.6
studio 7.6
dark 7.5
friendship 7.5
toyshop 7.3
paint 7.2
color 7.2
body 7.2
sunset 7.2
musician 7.2
love 7.1
ball 7.1
together 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 97.9
outdoor 97
posing 92.8
person 92.2
black and white 80.1
group 77.3

Face analysis

Amazon

Google

AWS Rekognition

Age 54-64
Gender Male, 92.5%
Happy 93.5%
Sad 4%
Confused 0.8%
Calm 0.7%
Fear 0.3%
Surprised 0.3%
Angry 0.2%
Disgusted 0.2%

AWS Rekognition

Age 35-43
Gender Male, 86.2%
Sad 37.6%
Calm 22%
Happy 19.7%
Angry 6.5%
Fear 5.9%
Confused 4.8%
Surprised 1.8%
Disgusted 1.6%

AWS Rekognition

Age 40-48
Gender Male, 97.4%
Calm 48.6%
Sad 26.6%
Surprised 9%
Happy 6.2%
Angry 4.1%
Confused 2%
Disgusted 2%
Fear 1.5%

AWS Rekognition

Age 28-38
Gender Male, 53.3%
Happy 83.3%
Calm 7.5%
Sad 4.4%
Confused 2.7%
Surprised 0.8%
Fear 0.5%
Angry 0.4%
Disgusted 0.4%

AWS Rekognition

Age 34-42
Gender Male, 91.8%
Sad 38.9%
Calm 26.2%
Happy 25.3%
Confused 3.6%
Fear 3%
Disgusted 1.7%
Angry 0.8%
Surprised 0.5%
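Each AWS Rekognition block above reports an age range, a gender guess, and a per-emotion confidence distribution for one detected face. A small sketch, assuming a DetectFaces-style response (field names follow Rekognition's documented shape; the numbers below are illustrative, echoing the first two face blocks), of picking the dominant emotion per face:

```python
def dominant_emotions(response):
    """Return the highest-confidence emotion for each face in a
    Rekognition DetectFaces-style response."""
    results = []
    for face in response.get("FaceDetails", []):
        top = max(face["Emotions"], key=lambda e: e["Confidence"])
        results.append((top["Type"], top["Confidence"]))
    return results

# Illustrative fragment echoing the first two face blocks above.
faces = {
    "FaceDetails": [
        {"Emotions": [{"Type": "HAPPY", "Confidence": 93.5},
                      {"Type": "SAD", "Confidence": 4.0}]},
        {"Emotions": [{"Type": "SAD", "Confidence": 37.6},
                      {"Type": "CALM", "Confidence": 22.0}]},
    ]
}

print(dominant_emotions(faces))
```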

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
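Unlike Rekognition's percentages, Google Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely"). A brief sketch, using the likelihood ordering from the Vision API's documented enum, of mapping those strings to ordinal ranks so they can be compared:

```python
# Likelihood buckets in ascending order, per the Vision API enum.
LIKELIHOOD_ORDER = [
    "UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
    "POSSIBLE", "LIKELY", "VERY_LIKELY",
]

def likelihood_rank(value):
    """Ordinal rank of a Vision API likelihood string,
    normalizing display form ("Very unlikely") to enum form."""
    return LIKELIHOOD_ORDER.index(value.upper().replace(" ", "_"))

# "Blurred Unlikely" in the second face ranks above "Very unlikely".
print(likelihood_rank("Unlikely") > likelihood_rank("Very unlikely"))
```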

Feature analysis

Amazon

Person 98.9%
Painting 56.4%

Captions

Microsoft

a group of people posing for a photo 94.9%
a group of people posing for the camera 94.8%
a group of people posing for a picture 94.7%

Text analysis

Amazon

33
170. 33
170.
0

Google

77033
77033