Human Generated Data

Title

Untitled (two women in lobby of theater, Harvard Hasty Pudding Tour)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11715

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Clothing 99.9
Apparel 99.9
Person 98.8
Human 98.8
Person 98.6
Person 97.9
Person 97.8
Person 95.4
Suit 91.2
Overcoat 91.2
Coat 91.2
Robe 82.4
Fashion 82.4
Gown 79.6
Female 78.4
Face 74.3
Helmet 73.3
Wedding 73.1
Dress 71.6
Bridegroom 69
Sunglasses 66.8
Accessories 66.8
Accessory 66.8
Wedding Gown 66.4
Tuxedo 65
Woman 64.6
People 64.3
Advertisement 63.8
Portrait 62.5
Photography 62.5
Photo 62.5
Poster 59.7
Text 58.8
Nature 56.3

Clarifai
created on 2023-10-26

people 99.9
group 99
man 97.9
adult 97.8
group together 95.9
woman 95.1
many 93.8
wear 93
leader 91
monochrome 90.3
several 88.6
street 87.9
administration 87.2
three 85
veil 83
religion 82.3
four 80.5
child 78.9
actor 78.2
recreation 77.1

Imagga
created on 2022-01-15

man 31.6
male 29.8
people 29
person 27.7
businessman 20.3
business 20
adult 18.1
world 15.3
group 14.5
happy 14.4
life 14.2
newspaper 13.7
job 13.3
black 13.2
men 12.9
executive 12.8
human 12.7
work 12.6
corporate 12
happiness 11.7
portrait 11.6
team 11.6
professional 10.6
couple 10.4
office 10.4
looking 10.4
teacher 10.3
women 10.3
day 10.2
worker 10
outdoors 9.8
product 9.7
building 9.7
room 9.5
color 9.5
smiling 9.4
back 9.2
city 9.1
holding 9.1
suit 9
success 8.8
standing 8.7
lifestyle 8.7
face 8.5
student 8.5
old 8.4
hand 8.4
groom 8
to 8
clothing 7.8
architecture 7.8
creation 7.6
casual 7.6
career 7.6
guy 7.5
senior 7.5
tourist 7.4
board 7.2
dress 7.2
smile 7.1
photographer 7.1
employee 7.1
love 7.1
modern 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.7
person 97.5
standing 88.8
indoor 86.5
clothing 85.7
drawing 78
black and white 66.8
man 61.9
woman 58.2
street 57.3
clothes 17.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 31-41
Gender Male, 97.8%
Calm 99.9%
Sad 0%
Surprised 0%
Confused 0%
Angry 0%
Disgusted 0%
Fear 0%
Happy 0%

AWS Rekognition

Age 37-45
Gender Male, 98.7%
Sad 73.8%
Calm 22.2%
Surprised 1.3%
Angry 1%
Fear 0.6%
Confused 0.5%
Disgusted 0.5%
Happy 0.1%

AWS Rekognition

Age 30-40
Gender Male, 92%
Surprised 98.5%
Fear 0.5%
Angry 0.4%
Calm 0.3%
Disgusted 0.1%
Sad 0.1%
Happy 0.1%
Confused 0%

AWS Rekognition

Age 50-58
Gender Male, 99.7%
Calm 93%
Happy 6.7%
Disgusted 0.1%
Confused 0.1%
Sad 0.1%
Surprised 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 29-39
Gender Male, 100%
Calm 98.9%
Surprised 0.4%
Angry 0.3%
Sad 0.2%
Happy 0.1%
Disgusted 0.1%
Confused 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%
Helmet 73.3%
Sunglasses 66.8%

Categories

Text analysis

Amazon

+
.El
32A8
13,
13.
know
٢٤١+ + know .D.9.H.H ,El
,El
٢٤١+
Y199A2
32A8 Y199A2 830M3330
.D.9.H.H
830M3330