Human Generated Data

Title

Untitled (two couples under awning, Hasty Pudding Tour)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7642

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.6
Human 99.6
Person 99.5
Apparel 98.2
Clothing 98.2
Person 98.2
Person 96.3
Tie 95.3
Accessories 95.3
Accessory 95.3
Tie 93.3
Helmet 84
Airplane 79.5
Transportation 79.5
Vehicle 79.5
Aircraft 79.5
Helicopter 76.5
Face 70.5
Coat 64.8
People 64.7
Text 63.9

Clarifai
created on 2023-10-25

people 99.8
group 98.7
group together 98.4
wear 98.3
man 98.1
adult 97.9
veil 94.6
three 94.6
four 94.5
outfit 94.4
uniform 93.8
two 93.5
many 92
several 91.9
five 91.3
military 88.9
outerwear 88.2
woman 84.4
coat 81
vehicle 79.1

Imagga
created on 2022-01-08

nurse 52.8
person 34.6
man 28.9
patient 28.1
male 25.5
adult 19.7
case 18.8
people 18.4
medical 17.6
coat 17.6
clothing 16.6
sick person 15.9
lab coat 15.9
professional 15
doctor 15
men 14.6
business 12.7
hospital 12.7
health 12.5
happy 11.9
portrait 11
smile 10.7
businessman 10.6
garment 10.4
work 10.2
care 9.9
job 9.7
new 9.7
medicine 9.7
black 9.6
face 9.2
dress 9
mask 8.8
full length 8.7
bride 8.6
happiness 8.6
serious 8.6
adults 8.5
worker 8.4
uniform 8.1
building 7.9
day 7.8
couple 7.8
corporate 7.7
modern 7.7
industry 7.7
profession 7.7
student 7.6
city 7.5
one 7.5
life 7.4
window 7.4
wedding 7.4
room 7.3
office 7.2
looking 7.2
art 7.2
groom 7.1
family 7.1
stethoscope 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.1
clothing 86.8
person 85.1
old 68.6
man 65.2
white 61.6
posing 44.7

Face analysis

Amazon

Google

AWS Rekognition

Age 31-41
Gender Male, 99.7%
Surprised 43.5%
Calm 35.8%
Happy 9.5%
Sad 4.1%
Confused 2.5%
Disgusted 2.2%
Fear 1.2%
Angry 1.1%

AWS Rekognition

Age 41-49
Gender Male, 100%
Calm 93.4%
Sad 3.1%
Confused 2.3%
Happy 0.5%
Surprised 0.2%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Confused 81.4%
Sad 7.3%
Happy 3.1%
Calm 2.5%
Angry 2.1%
Disgusted 1.7%
Surprised 1.2%
Fear 0.7%

AWS Rekognition

Age 20-28
Gender Male, 99.3%
Calm 90%
Happy 4.4%
Surprised 3%
Sad 2%
Confused 0.3%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Tie 95.3%
Helmet 84%
Airplane 79.5%
Helicopter 76.5%

Text analysis

Amazon

MAC
ف
32A8
٢٤/٣١٣
.D.P.H.H ٢٤/٣١٣
.D.P.H.H

Google

MAC 32A6 YT37A2
MAC
32A6
YT37A2