Human Generated Data

Title

Unfired Clay Plaque in the form of a Head

Date

-

People

-

Classification

Plaques

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, First Fogg Expedition to China (1923-1924), 1924.65.32.H

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Figurine 98.4
Head 94.4
Archaeology 89.6
Sculpture 88.2
Art 88.2
Statue 75.1
Person 57.4
Human 57.4
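
The Amazon tags above are label/confidence pairs of the kind AWS Rekognition returns. A minimal sketch with boto3, assuming a local image file and an illustrative confidence threshold (not the museums' actual pipeline):

```python
import boto3

def rekognition_labels(image_path, min_confidence=55.0):
    """Return (label, confidence-in-percent) pairs for an image."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    # Each label carries a Name and a Confidence score in percent,
    # matching the "tag  score" pairs listed above.
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

# Example call with a hypothetical file name:
# rekognition_labels("plaque_head.jpg")
```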

Clarifai
created on 2019-07-07

no person 98.9
one 97.4
sculpture 96
clay 94.4
face 86.9
old 86.9
stone 85.7
nature 85.6
rock 85.6
art 84.1
portrait 82.4
ancient 81
desktop 80.4
cold 79.5
winter 79.4
mammal 76.9
snow 75.7
man 74.4
decoration 74.2
closeup 73.8
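
A hedged sketch of how general-model concepts like the Clarifai list above can be requested from Clarifai's v2 REST API; the API key, model identifier, and image URL are placeholders, not values from this record:

```python
import requests

def clarifai_tags(image_url, api_key, model_id="general-image-recognition"):
    response = requests.post(
        f"https://api.clarifai.com/v2/models/{model_id}/outputs",
        headers={"Authorization": f"Key {api_key}"},
        json={"inputs": [{"data": {"image": {"url": image_url}}}]},
    )
    response.raise_for_status()
    concepts = response.json()["outputs"][0]["data"]["concepts"]
    # Concept values are 0-1 probabilities; the list above shows them as percentages.
    return [(c["name"], round(c["value"] * 100, 1)) for c in concepts]
```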

Imagga
created on 2019-07-07

corbel 46.7
bracket 37.3
device 36
sculpture 33.8
statue 28.5
support 26.9
knocker 23.2
ancient 21.6
culture 21.4
art 20
head 18.5
old 18.1
religion 17.9
close 17.7
face 16.3
stone 16
temple 14.2
gold 14
history 13.4
god 13.4
human 12.7
cone 12.2
east 12.1
famous 12.1
mask 12
bust 11.8
traditional 11.6
antique 11.5
brown 11
travel 10.6
china 10
covering 10
ruler 10
anatomy 9.7
meditation 9.6
decoration 9.5
oriental 9.4
food 9.3
historic 9.2
peace 9.1
tourism 9.1
symbol 8.7
wisdom 8.7
plastic art 8.5
harmony 8.4
black 8.4
monument 8.4
wood 8.3
closeup 8.1
detail 8
body 8
architecture 7.8
carving 7.8
animal 7.8
spirituality 7.7
spiritual 7.7
dark 7.5
bronze 7.5
religious 7.5
invertebrate 7.3
object 7.3
wooden 7
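
The Imagga tags above follow the shape of Imagga's /v2/tags endpoint. A sketch using HTTP basic authentication, with placeholder credentials and image URL:

```python
import requests

def imagga_tags(image_url, api_key, api_secret):
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(api_key, api_secret),
    )
    response.raise_for_status()
    # Each entry pairs a confidence in percent with a tag keyed by language code.
    return [(t["tag"]["en"], t["confidence"])
            for t in response.json()["result"]["tags"]]
```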

Google
created on 2019-07-07

Sculpture 96.4
Face 95.8
Head 92.7
Stone carving 88.1
Carving 86.6
Forehead 86.3
Figurine 84.7
Nose 84.2
Chin 82.8
Statue 80.8
Art 76.1
Artifact 75.4
Jaw 75
Ceramic 61.8
Rock 54.2
Clay 52.9
Nonbuilding structure 51.4
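
A minimal sketch of label detection with the google-cloud-vision client (assuming a recent release of the library); the file path is a placeholder, and the 0-1 scores are scaled to match the percentages above:

```python
from google.cloud import vision

def google_labels(image_path):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    return [(label.description, round(label.score * 100, 1))
            for label in response.label_annotations]
```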

Microsoft
created on 2019-07-07

statue 98.6
sculpture 98
human face 96.4
bust 80.2
artifact 69
old 54.7
ancient 53.6
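
A hedged sketch of image tagging with the Azure Computer Vision SDK; the endpoint, key, and file path are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

def azure_tags(image_path, endpoint, key):
    client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))
    with open(image_path, "rb") as stream:
        result = client.tag_image_in_stream(stream)
    # Confidences are 0-1; the list above shows them as percentages.
    return [(tag.name, round(tag.confidence * 100, 1)) for tag in result.tags]
```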

Face analysis

AWS Rekognition

Age 45-66
Gender Female, 98.5%
Angry 4.7%
Calm 64.7%
Disgusted 1.7%
Happy 18.5%
Sad 3.4%
Surprised 2.9%
Confused 4.2%
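
The age range, gender, and emotion scores above match the face attributes AWS Rekognition returns. A sketch with a placeholder file path:

```python
import boto3

def rekognition_face_attributes(image_path):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )
    return [{
        "age_range": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
        "gender": (face["Gender"]["Value"], face["Gender"]["Confidence"]),
        "emotions": {e["Type"]: e["Confidence"] for e in face["Emotions"]},
    } for face in response["FaceDetails"]]
```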

Microsoft Cognitive Services

Age 59
Gender Female
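
A loose sketch of the age/gender estimate as the 2019-era Azure Face SDK exposed it; the endpoint, key, file path, and attribute names are assumptions here:

```python
from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

def azure_face_attributes(image_path, endpoint, key):
    client = FaceClient(endpoint, CognitiveServicesCredentials(key))
    with open(image_path, "rb") as stream:
        faces = client.face.detect_with_stream(
            stream, return_face_attributes=["age", "gender"]
        )
    return [(face.face_attributes.age, face.face_attributes.gender)
            for face in faces]
```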

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
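
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is what the rows above reflect. A sketch with a placeholder file path:

```python
from google.cloud import vision

def google_face_likelihoods(image_path):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    return [{
        "joy": vision.Likelihood(face.joy_likelihood).name,
        "sorrow": vision.Likelihood(face.sorrow_likelihood).name,
        "anger": vision.Likelihood(face.anger_likelihood).name,
        "surprise": vision.Likelihood(face.surprise_likelihood).name,
        "headwear": vision.Likelihood(face.headwear_likelihood).name,
        "blurred": vision.Likelihood(face.blurred_likelihood).name,
    } for face in response.face_annotations]
```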

Feature analysis

Amazon

Person 57.4%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2019-07-07

a close up of a mans face 60.2%
the face of a person 48.7%
a close up of a person 48.6%
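
The caption candidates above look like the output of Azure Computer Vision's describe-image operation. A hedged sketch; the endpoint, key, and file path are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

def azure_captions(image_path, endpoint, key, max_candidates=3):
    client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))
    with open(image_path, "rb") as stream:
        result = client.describe_image_in_stream(
            stream, max_candidates=max_candidates
        )
    # Each candidate pairs a caption string with a 0-1 confidence,
    # shown above as a percentage.
    return [(c.text, round(c.confidence * 100, 1)) for c in result.captions]
```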