Human Generated Data

Title

Untitled (two women standing)

Date

1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20250

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.5
Apparel 99.5
Human 98.9
Person 98.9
Person 96.3
Indoors 84.5
Room 84.5
Hat 69.3
Bathroom 67.1
Bonnet 59.1
Female 58
People 55.7
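
A minimal sketch of how tag/confidence pairs like the Amazon list above can be produced with Amazon Rekognition via boto3; the filename and MinConfidence threshold below are assumptions for illustration, not part of this record.

```python
import boto3

client = boto3.client("rekognition")  # credentials/region come from the environment

# Hypothetical local filename for the photograph in this record
with open("untitled_two_women_standing.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # assumed cutoff; the list above bottoms out near 55
)

for label in response["Labels"]:
    # Prints lines like "Clothing 99.5", matching the tag/score format above
    print(f"{label['Name']} {label['Confidence']:.1f}")
```

The repeated "Person" entries above likely reflect per-instance detections; each label's `Instances` list in the same response carries one bounding box per detected person.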

Imagga
created on 2022-03-05

lab coat 100
coat 86.2
nurse 57.1
garment 45.3
clothing 40.2
man 29.6
people 27.9
professional 25.8
person 23.6
adult 23.5
male 23.4
medical 21.2
work 18.8
doctor 18.8
worker 18
patient 16.4
business 16.4
portrait 15.5
hospital 15.3
health 15.3
happy 15
clothes 15
profession 14.4
job 14.1
businessman 14.1
men 13.7
clinic 13.7
medicine 13.2
indoors 13.2
corporate 12.9
occupation 12.8
women 12.6
day 11.8
team 11.6
bright 11.4
smile 11.4
20s 11
suit 11
happiness 11
full length 10.7
daytime 10.6
laboratory 10.6
adults 10.4
looking 10.4
building 10.3
black 10.2
focus 10.2
smiling 10.1
uniform 10
face 9.9
days 9.8
attractive 9.8
working 9.7
lab 9.7
emotion 9.2
window 9.2
student 9.1
care 9.1
human 9
scientist 8.8
research 8.6
commerce 8.4
pretty 8.4
success 8
mask 8
home 8
lifestyle 7.9
standing 7.8
education 7.8
emotions 7.8
chemistry 7.7
windows 7.7
blurred 7.7
casual 7.6
hand 7.6
room 7.5
surgeon 7.5
blur 7.4
holding 7.4
inside 7.4
office 7.2
bathrobe 7.2
family 7.1
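
The Imagga list above can be reproduced, in outline, with Imagga's v2 tags REST endpoint; the API key/secret placeholders and image URL below are assumptions.

```python
import requests

IMAGGA_KEY = "your_api_key"        # placeholder credential
IMAGGA_SECRET = "your_api_secret"  # placeholder credential

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # hypothetical image URL
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP basic auth, per the Imagga docs
)

for item in response.json()["result"]["tags"]:
    # Prints lines like "lab coat 100.0", matching the tag/score format above
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```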

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

wall 99.8
indoor 94.2
text 94
drawing 92
human face 91.6
bathroom 91.1
person 87.9
black and white 82.1
sketch 77.6
smile 74.3
clothing 70
man 53.4
posing 35.2
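
A hedged sketch of the Microsoft side, using the Azure Computer Vision Python SDK's `tag_image_in_stream`; the endpoint, key, and filename are placeholders. The SDK reports confidence on a 0-1 scale, so the values are scaled to match the list above.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<your-subscription-key>"),  # placeholder key
)

# Hypothetical local filename for the photograph in this record
with open("untitled_two_women_standing.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    # Prints lines like "wall 99.8", matching the tag/score format above
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```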

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 99.8%
Calm 39.7%
Surprised 37.2%
Happy 14.9%
Confused 2.9%
Sad 2.3%
Disgusted 1.2%
Angry 1%
Fear 0.8%

AWS Rekognition

Age 49-57
Gender Male, 88.6%
Happy 91.5%
Surprised 3.7%
Calm 3.6%
Fear 0.7%
Disgusted 0.2%
Sad 0.1%
Confused 0.1%
Angry 0.1%
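
The two AWS Rekognition face blocks above follow the shape of Rekognition's DetectFaces response with all attributes requested; a minimal boto3 sketch, with a hypothetical filename:

```python
import boto3

client = boto3.client("rekognition")

# Hypothetical local filename for the photograph in this record
with open("untitled_two_women_standing.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age range, gender, and emotions
    )

for face in response["FaceDetails"]:  # one entry per detected face, two here
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:  # e.g. "Calm 39.7%"
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```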

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
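
The Google Vision blocks above report face attributes as likelihood buckets rather than percentages; a sketch with the google-cloud-vision client, assuming credentials in the environment and a hypothetical filename:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # credentials come from the environment

# Hypothetical local filename for the photograph in this record
with open("untitled_two_women_standing.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:  # one entry per detected face
    # Enum names like VERY_UNLIKELY / POSSIBLE map to the labels above
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```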

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a person standing in front of a mirror posing for the camera 90%
a person standing in front of a mirror posing for the camera 78%
a man and woman standing in front of a mirror posing for the camera 64.1%
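
The three ranked captions follow the shape of Azure Computer Vision's Describe Image operation, which returns caption candidates with confidences; a sketch under the same placeholder endpoint/key assumptions as above:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<your-subscription-key>"),  # placeholder key
)

# Hypothetical local filename for the photograph in this record
with open("untitled_two_women_standing.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    # Prints lines like "a person standing in front of a mirror ... 90.0%"
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```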

Text analysis

Amazon

200

Google

NAGO
T3RA2=
T3RA2= NAGO
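
The text results above are OCR output: Amazon detected the string "200" and Google detected the strings listed. A minimal sketch of the Amazon side with Rekognition's DetectText, again with a hypothetical filename; the Google equivalent is `text_detection` on the same `ImageAnnotatorClient` shown earlier.

```python
import boto3

client = boto3.client("rekognition")

# Hypothetical local filename for the photograph in this record
with open("untitled_two_women_standing.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip WORD-level duplicates of each line
        print(detection["DetectedText"])  # e.g. "200"
```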