Human Generated Data

Title

HEAD

Date

People

Classification

Sculpture

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, David M. Robinson Fund through the Estate of Therese Kuhn Straus, 1979.400

Machine Generated Data

Tags

Amazon
created on 2022-06-10

Head 100
Sculpture 98.6
Art 98.6
Statue 98.3
Person 93.8
Human 93.8

Imagga
created on 2022-06-10

bust 100
plastic art 100
sculpture 100
art 68.2
figure 44.6
statue 36.1
head 28.6
face 28.4
man 25.6
portrait 24
ancient 23.4
old 20.9
male 19.2
close 18.9
human 17.3
culture 17.1
person 16.6
marble 16.1
stone 16.1
history 15.2
religion 14.4
monument 14
detail 13.7
adult 12.9
historic 12.9
people 12.8
expression 12.8
one 12.7
antique 12.1
black 12
eye 11.6
god 11.5
closeup 11.5
serious 11.5
museum 11.4
famous 11.2
style 11.1
eyes 10.3
travel 9.9
roman 9.8
looking 9.6
faith 9.6
hair 9.5
men 9.5
decoration 9.4
architecture 9.4
model 9.3
success 8.9
nose 8.7
meditation 8.6
vintage 8.3
gold 8.2
symbol 8.1
handsome 8
boy 7.8
pray 7.8
elegant 7.7
attractive 7.7
skin 7.6
oriental 7.6
religious 7.5
mature 7.4
tourism 7.4
artwork 7.3
peace 7.3
look 7

Google
created on 2022-06-10

Forehead 98.5
Face 98.4
Hair 98.3
Cheek 97.9
Eyebrow 94.1
Eye 93.8
Sculpture 88.6
Statue 88.4
Jaw 88.2
Temple 87.1
Gesture 85.3
Artifact 84.2
Style 83.8
Black-and-white 83.4
Art 82.1
Classical sculpture 79.8
Monument 79.3
Monochrome photography 73.5
Monochrome 71.1
Visual arts 69.4

Microsoft
created on 2022-06-10

text 95.3
statue 92.7
sketch 86.5
human face 86.4
sculpture 85.8
art 76.6
drawing 70.5
face 67.7
bust 52.2
posing 35.2
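Each machine-generated tag list above is a flat sequence of "label score" pairs, where the score is a confidence value out of 100. As a minimal sketch (using data copied from the Amazon list above; the `parse_tags` helper and the 95 threshold are illustrative assumptions, not part of any vendor API), such a list can be parsed into structured records and filtered by confidence:

```python
# Sample data copied from the Amazon Rekognition tag list in this record.
raw = """Head 100
Sculpture 98.6
Art 98.6
Statue 98.3
Person 93.8
Human 93.8"""

def parse_tags(text):
    """Parse flat 'label score' lines into (label, confidence) pairs."""
    tags = []
    for line in text.splitlines():
        # A label may itself contain spaces (e.g. Imagga's "plastic art 100"),
        # so split off only the last token as the score.
        label, score = line.rsplit(" ", 1)
        tags.append((label, float(score)))
    return tags

tags = parse_tags(raw)
high_confidence = [label for label, score in tags if score >= 95]
print(high_confidence)  # ['Head', 'Sculpture', 'Art', 'Statue']
```

The same helper works for the Imagga, Google, and Microsoft lists, since all four share the trailing-score layout.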

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 99.8%
Angry 36.1%
Disgusted 22.7%
Confused 14.2%
Fear 12.3%
Surprised 7.1%
Sad 5.6%
Calm 5.3%
Happy 1%

Microsoft Cognitive Services

Age 29
Gender Male

Feature analysis

Amazon

Person 93.8%

Captions

Microsoft

an old photo of a man 82.3%
a man posing for a photo 77.5%
old photo of a man 77.4%