Human Generated Data

Title

Untitled (Princeton graduate of 1917 with family, Princeton University reunion, Princeton, NJ)

Date

1938

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7661

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Clothing 99.7
Apparel 99.7
Person 99.4
Human 99.4
Person 99.3
Person 99.2
Person 99.2
Person 93.5
Person 93.2
Person 92.1
Female 90.5
Shorts 89.9
Face 87
Dress 84.8
Nature 82.8
Outdoors 82.8
Countryside 82.8
Shelter 82.8
Rural 82.8
Building 82.8
People 78.7
Person 73.9
Overcoat 71.9
Coat 71.9
Person 71.9
Woman 71.3
Suit 69.8
Kid 68.9
Child 68.9
Girl 68.7
Person 67.5
Photo 64.9
Portrait 64.9
Photography 64.9
Floor 56.7
Shoe 56.7
Footwear 56.7
Flooring 55.6
Shoe 52.8
Person 46.8

Clarifai
created on 2023-10-25

people 99.9
group together 98.9
child 98
group 97.1
adult 95.1
woman 94.7
many 93.8
wear 92.5
several 90.5
recreation 89.8
man 88.5
sports equipment 87.4
administration 85.3
five 83
portrait 82.9
monochrome 81
boy 79.7
veil 77.4
leader 75
two 74.9

Imagga
created on 2022-01-08

kin 22.8
statue 21.9
sculpture 20.5
art 19.2
people 18.4
old 17.4
religion 17
clothing 14.2
ancient 13.8
man 13.8
monument 13.1
musical instrument 12.9
culture 12.8
travel 12
dress 11.7
traditional 11.6
male 11.4
architecture 10.9
adult 10.9
history 10.7
fashion 10.5
accordion 10.4
black 10.4
antique 10.4
style 10.4
outfit 10.3
world 10
city 10
portrait 9.7
marble 9.7
person 9.7
child 9.5
religious 9.4
stone 9.3
shop 9.1
tourism 9.1
lady 8.9
women 8.7
scene 8.7
god 8.6
holiday 8.6
walking 8.5
face 8.5
keyboard instrument 8.3
church 8.3
vintage 8.3
wind instrument 8.2
human 8.2
holy 7.7
uniform 7.5
tradition 7.4
street 7.4
figure 7.3
makeup 7.3
military uniform 7.3
girls 7.3
catholic 7.3
tourist 7.2
family 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

clothing 91.6
person 89.3
black and white 88.1
outdoor 86.1
text 84.3
footwear 82.6
dress 70.7
street 50.2
posing 47.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 26-36
Gender Male, 99.8%
Happy 42.2%
Calm 37.6%
Fear 14.4%
Sad 2.1%
Surprised 1.5%
Angry 0.8%
Disgusted 0.8%
Confused 0.5%

AWS Rekognition

Age 43-51
Gender Male, 89.3%
Happy 76.5%
Calm 17.4%
Surprised 3.3%
Sad 0.8%
Angry 0.8%
Confused 0.5%
Disgusted 0.4%
Fear 0.3%

AWS Rekognition

Age 42-50
Gender Male, 100%
Sad 91.5%
Angry 3.4%
Confused 2%
Calm 1.3%
Disgusted 0.9%
Happy 0.3%
Fear 0.3%
Surprised 0.2%

AWS Rekognition

Age 26-36
Gender Male, 99.8%
Happy 72.3%
Calm 19.8%
Fear 3.1%
Disgusted 1.5%
Sad 1.5%
Surprised 0.7%
Confused 0.7%
Angry 0.5%

AWS Rekognition

Age 34-42
Gender Female, 73.8%
Happy 94.5%
Calm 2.1%
Sad 2%
Angry 0.7%
Confused 0.3%
Surprised 0.3%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 27-37
Gender Female, 96.8%
Calm 97.9%
Angry 0.8%
Sad 0.6%
Disgusted 0.3%
Surprised 0.2%
Confused 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 23-33
Gender Male, 91.2%
Calm 56.1%
Sad 28.6%
Fear 4.5%
Confused 3%
Happy 2.8%
Surprised 1.8%
Angry 1.7%
Disgusted 1.5%

AWS Rekognition

Age 25-35
Gender Female, 99.8%
Happy 86.3%
Calm 7.1%
Fear 2.6%
Sad 1.6%
Surprised 1.3%
Disgusted 0.4%
Confused 0.3%
Angry 0.3%

AWS Rekognition

Age 23-31
Gender Male, 72.5%
Calm 51.2%
Happy 13.9%
Sad 13.3%
Disgusted 7.1%
Angry 6.1%
Fear 5%
Surprised 1.7%
Confused 1.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 99.4%
Shoe 56.7%

Categories

Text analysis

Amazon

17
D
1917
X
No33
70ww
1817

Google

191
17
191 17