Human Generated Data

Title

Untitled (man standing at table holding up two ties while other guests laugh, Pennsylvania)

Date

1939

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11662

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Human 99.3
Person 99.3
Person 98.9
Person 98.8
Person 98.6
Person 98.4
Person 98.2
Person 85
Clinic 82.4
Sunglasses 64.6
Accessories 64.6
Accessory 64.6
Priest 56.3
Sitting 56.1
Hospital 56

Imagga
created on 2022-01-15

person 48.2
man 45
nurse 39.9
male 38.3
teacher 32.8
people 31.2
adult 31
professional 29.9
classroom 28.1
office 27.7
patient 25.9
senior 25.3
happy 25.1
business 24.9
smiling 24.6
room 24.6
businessman 23.8
sitting 23.2
indoors 22.8
marimba 22.3
home 21.5
men 20.6
percussion instrument 20.3
desk 19.8
education 19
group 18.5
computer 18.4
student 18
table 17.3
laptop 17.3
case 17.3
musical instrument 16.9
working 16.8
mature 16.7
sick person 16.5
colleagues 16.5
work 16.5
together 15.8
smile 15.7
class 15.4
lifestyle 15.2
horizontal 15.1
meeting 15.1
school 14.9
couple 14.8
educator 14.5
teaching 13.6
board 13.6
team 13.4
elderly 13.4
talking 13.3
teamwork 13
executive 13
corporate 12.9
technology 12.6
worker 12.6
businesspeople 12.3
modern 11.9
women 11.9
casual 11.9
communication 11.8
happiness 11.7
half length 11.7
portrait 11.6
blackboard 11
suit 10.8
40s 10.7
chair 10.6
standing 10.4
looking 10.4
day 10.2
camera 10.2
indoor 10
specialist 9.9
diverse 9.8
casual clothing 9.8
students 9.7
to 9.7
job 9.7
medical 9.7
retired 9.7
retirement 9.6
adults 9.5
manager 9.3
inside 9.2
hand 9.1
businesswoman 9.1
holding 9.1
color 8.9
success 8.8
interior 8.8
mathematics 8.8
hands 8.7
workplace 8.6
college 8.5
learning 8.5
black 8.4
hospital 8.3
human 8.2
cheerful 8.1
handsome 8
70s 7.9
two people 7.8
barbershop 7.7
30s 7.7
looking camera 7.7
diversity 7.7
studying 7.7
old 7.7
clinic 7.7
doctor 7.5
study 7.5
phone 7.4
kid 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

person 98.9
text 95.5
window 94.3
clothing 91.4
man 84.7
human face 75.8
old 51.8

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 93.7%
Happy 96.8%
Calm 2.2%
Confused 0.4%
Sad 0.3%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 42-50
Gender Male, 99.8%
Sad 93.7%
Happy 2.1%
Confused 2.1%
Calm 1.3%
Disgusted 0.3%
Fear 0.3%
Angry 0.2%
Surprised 0.1%

AWS Rekognition

Age 36-44
Gender Male, 99.9%
Sad 94.9%
Happy 2.2%
Confused 1%
Disgusted 0.5%
Angry 0.5%
Calm 0.5%
Fear 0.2%
Surprised 0.2%

AWS Rekognition

Age 49-57
Gender Male, 99.8%
Happy 88.6%
Surprised 4.8%
Calm 2.5%
Confused 1.8%
Disgusted 0.8%
Fear 0.8%
Angry 0.4%
Sad 0.3%

AWS Rekognition

Age 45-51
Gender Male, 100%
Sad 74.7%
Happy 15.5%
Confused 3.5%
Calm 2.2%
Angry 1.6%
Disgusted 1.5%
Surprised 0.6%
Fear 0.6%

AWS Rekognition

Age 26-36
Gender Male, 90.5%
Calm 91.2%
Sad 3.8%
Happy 2.8%
Confused 0.9%
Angry 0.5%
Surprised 0.4%
Disgusted 0.4%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Sunglasses 64.6%

Captions

Microsoft

a group of people sitting in front of a window 87.6%
a group of people sitting at a table in front of a window 87.2%
a group of people sitting at a table 87.1%

Text analysis

Amazon

9534
VCLV
NEW
VCLV EVLE1A NEW
9534.
EVLE1A

Google

9534.
YT37A2
9534. 534 YT37A2
534