Human Generated Data

Title

Head of a Woman

Date

People

Classification

Sculpture

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, Gift of Miss Harriet S. Amory, 1920, 1977.216.3448

Machine Generated Data

Tags (label and confidence score)

Amazon
created on 2022-01-29

Figurine 99.2
Head 98.9
Sculpture 95.5
Art 95.5
Archaeology 91.1
Bread 85.8
Food 85.8
Statue 73.3
Bronze 71.7
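
These labels match the output shape of Amazon Rekognition's DetectLabels operation. A minimal sketch of how a tag list like this could be produced, assuming configured AWS credentials and a local copy of the image (the filename is a placeholder):

    import boto3

    # Placeholder path to a local copy of the object photograph.
    rekognition = boto3.client("rekognition")
    with open("head_of_a_woman.jpg", "rb") as f:
        response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=70)

    # Each label carries a name and a 0-100 confidence score, as listed above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))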

Imagga
created on 2022-01-29

sculpture 100
plastic art 100
bust 100
art 67.7
statue 54.8
figure 40.3
ancient 37.2
culture 32.5
stone 30.4
monument 29.9
face 29.1
old 28.6
head 28.6
history 25.1
religion 24.2
architecture 20.3
travel 19.7
tourism 16.5
temple 16.3
antique 15.6
carving 15.2
human 15
traditional 15
famous 14.9
historical 14.1
historic 13.8
tourist 13.6
marble 13.6
landmark 13.5
civilization 12.8
past 12.6
god 12.4
portrait 12.3
man 12.2
close 12
tomb 11.8
ruin 11.7
oriental 11.3
peace 11
sky 10.8
memorial 10.5
religious 10.3
east 10.3
meditation 9.6
decoration 9.5
megalith 9.3
place 9.3
male 9.2
park 9.1
grave 8.8
archeology 8.8
warrior 8.8
worship 8.7
china 8.5
body 8
nose 8
carved 7.8
roman 7.8
spiritual 7.7
great 7.7
classical 7.7
desert 7.5
symbol 7.4
structure 7.3
black 7.2
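
Imagga's tagger is exposed as a REST endpoint; a sketch of a request that could return a tag list in this shape, assuming a placeholder API key/secret pair and a publicly reachable image URL:

    import requests

    API_KEY, API_SECRET = "key", "secret"                      # placeholders
    IMAGE_URL = "https://example.org/head_of_a_woman.jpg"      # placeholder

    # Imagga v2 tagging endpoint; the key/secret pair is sent as HTTP Basic auth.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )

    # Tags come back with an English label and a 0-100 confidence score.
    for tag in resp.json()["result"]["tags"]:
        print(tag["tag"]["en"], round(tag["confidence"], 1))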

Google
created on 2022-01-29

Forehead 98.5
Nose 98.4
Cheek 97.9
Head 97.5
Statue 90.6
Sculpture 89.5
Jaw 88.3
Temple 87.3
Artifact 86
Art 84.5
Classical sculpture 74.3
Chest 70.7
Carving 70.4
Ancient history 68.6
Plaster 67.7
Rock 67
Visual arts 66.8
History 66.2
Nonbuilding structure 65.2
Stone carving 64.1
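
These labels correspond to Google Cloud Vision label detection; a minimal sketch using the google-cloud-vision client library, assuming application default credentials and a placeholder image path (the API returns scores on a 0-1 scale, shown above as percentages):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("head_of_a_woman.jpg", "rb") as f:               # placeholder path
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(label.description, round(label.score * 100, 1))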

Microsoft
created on 2022-01-29

sculpture 98.4
statue 97.7
bust 81.8
artifact 71.2
human face 63.7
megalith 19.5
stone 6
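
These tags correspond to Azure Computer Vision image tagging; a sketch using the azure-cognitiveservices-vision-computervision SDK, with the endpoint, key, and image URL all placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<resource>.cognitiveservices.azure.com/"   # placeholder
    KEY = "key"                                                    # placeholder
    IMAGE_URL = "https://example.org/head_of_a_woman.jpg"          # placeholder

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # Tag confidences are returned on a 0-1 scale, shown above as percentages.
    result = client.tag_image(IMAGE_URL)
    for tag in result.tags:
        print(tag.name, round(tag.confidence * 100, 1))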

Face analysis

Amazon

AWS Rekognition

Age 4-10
Gender Male, 79.4%
Angry 74.9%
Disgusted 12.1%
Confused 4.9%
Calm 3.3%
Sad 2.4%
Surprised 1%
Happy 0.9%
Fear 0.5%
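
The age range, gender, and emotion scores follow the shape of Rekognition's DetectFaces response; a sketch under the same assumptions as the label-detection example (configured AWS credentials, placeholder image path):

    import boto3

    rekognition = boto3.client("rekognition")
    with open("head_of_a_woman.jpg", "rb") as f:               # placeholder path
        response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print("Age", f"{age['Low']}-{age['High']}")
        print("Gender", face["Gender"]["Value"], round(face["Gender"]["Confidence"], 1))
        # Emotion scores are 0-100 confidences, as listed above.
        for emotion in face["Emotions"]:
            print(emotion["Type"].capitalize(), round(emotion["Confidence"], 1))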

Feature analysis

Amazon

Bread 85.8%

Captions

Microsoft

the face of a person 57.7%
a person looking at the camera 54%
a close up of a person 53.9%
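
The captions follow the shape of Azure Computer Vision's describe-image output; a self-contained sketch with the same placeholder endpoint, key, and image URL as the tagging example:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<resource>.cognitiveservices.azure.com/"   # placeholder
    KEY = "key"                                                    # placeholder
    IMAGE_URL = "https://example.org/head_of_a_woman.jpg"          # placeholder

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # describe_image returns ranked caption candidates with 0-1 confidences.
    description = client.describe_image(IMAGE_URL)
    for caption in description.captions:
        print(caption.text, round(caption.confidence * 100, 1))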