Human Generated Data

Title

Ex-voto Head

Date

People

Classification

Sculpture

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, Gift of Pfeiffer-Hartwell Collection, 1977.216.238

Machine Generated Data

Tags

Amazon
created on 2022-06-10

Head 100
Art 98.2
Sculpture 98.2
Statue 92.4
Person 84.1
Human 84.1
Photography 56.3
Photo 56.3
Portrait 56.3
Face 56.3
Archaeology 55.3
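The machine-generated tags above follow a simple "label confidence" layout, one pair per line. A minimal sketch for turning such lines into structured pairs (the `parse_tags` helper is hypothetical, not part of any museum or vision API; it assumes the confidence is always the last whitespace-separated token, since labels themselves may contain spaces, e.g. "plastic bag 37.3"):

```python
def parse_tags(lines):
    """Parse 'label confidence' lines into (label, float) pairs.

    Labels may contain spaces, so only the last token is treated
    as the confidence score.
    """
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines between sections
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

sample = ["Head 100", "Art 98.2", "plastic bag 37.3"]
print(parse_tags(sample))
# → [('Head', 100.0), ('Art', 98.2), ('plastic bag', 37.3)]
```

The same parsing works for the Imagga and Microsoft tag lists below, which use the identical layout.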

Imagga
created on 2022-06-10

cemetery 38.1
plastic bag 37.3
face 30.5
child 30
bag 29.4
portrait 27.2
head 23.5
container 22.2
person 19.9
hair 19.8
eyes 18.9
old 18.1
people 17.3
man 16.1
mask 15.8
black 14.5
adult 14.2
senior 14.1
cadaver 13.8
human 13.5
male 13.5
health 13.2
attractive 12.6
smile 12.1
skin 12.1
covering 12
sculpture 12
clothing 11.8
body 11.2
expression 11.1
model 10.9
spa 10.8
statue 10.4
close 10.3
blond 10.2
water 10
horror 9.7
scary 9.7
happy 9.4
care 9.1
one 9
evil 8.8
sand 8.7
elderly 8.6
disguise 8.5
art 8.3
stone 8.2
eye 8
earth 8
looking 8
cap 8
spooky 7.8
bust 7.8
wisdom 7.8
retired 7.8
death 7.7
pretty 7.7
attire 7.7
retirement 7.7
serious 7.6
teeth 7.6
fashion 7.5
mature 7.4
purity 7.4
shower cap 7.4
emotion 7.4
lady 7.3
aged 7.2
religion 7.2
women 7.1
medical 7.1
look 7

Google
created on 2022-06-10

Microsoft
created on 2022-06-10

sketch 96.6
human face 95.6
drawing 94.9
text 94.5
statue 84.8
white 79
art 77.4
old 75.7
person 71.1
black and white 69.4
sculpture 52.6

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 89.5%
Calm 53.8%
Confused 33.4%
Surprised 10.8%
Fear 5.9%
Sad 3.7%
Disgusted 1%
Angry 0.5%
Happy 0.2%

Microsoft Cognitive Services

Age 29
Gender Female

Feature analysis

Amazon

Person 84.1%

Captions

Microsoft

a vintage photo of a person 73.3%
a vintage photo of a person 71.6%
an old photo of a person 71.5%