Human Generated Data

Title

DARUMA AND OKAME

Date

-

People

-

Classification

Sculpture

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Helene K. Suermondt, 1973.15

Machine Generated Data

Tags

Amazon
created on 2019-07-08

Ivory 90.6
Figurine 82.8
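
The label/score pairs above are confidence percentages of the kind returned by Amazon Rekognition's DetectLabels API. A minimal sketch of how such tags could be produced, assuming the object photograph is available as a local file; the filename, region, and thresholds below are illustrative, not part of the record:

    # Hedged sketch: Rekognition DetectLabels via boto3. The image filename,
    # region, and confidence threshold are assumptions for illustration only.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("daruma_and_okame.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=80.0,
        )

    # Each label carries a Name and a Confidence score on a 0-100 scale,
    # which is how pairs such as "Ivory 90.6" are listed above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")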

Clarifai
created on 2019-07-08

sculpture 97.6
no person 96.1
religion 94.6
group 92.7
desktop 92.6
art 91
people 88.7
still life 88.7
decoration 88.3
one 87.1
adult 86.8
statue 86.1
shape 85.9
figurine 85.4
face 84.9
food 83.6
color 83
daylight 82.7
two 82.3
cutout 82

Imagga
created on 2019-07-08

sand 49.4
soil 40.2
sculpture 39.7
carving 35.6
earth 31.9
art 22.7
statue 19
plastic art 17.4
religion 17
face 16.3
figure 16.1
brown 14
old 13.9
saltshaker 13.9
mask 13.2
culture 12.8
traditional 12.5
wood 11.7
shaker 11.1
decoration 11
head 10.9
close 10.8
wooden 10.6
container 10.4
corbel 10.3
religious 10.3
stone 10.1
portrait 9.7
covering 9.5
ancient 9.5
temple 9.5
object 8.8
shoes 8.6
leather 8.5
bracket 8.3
human 8.2
cute 7.9
studio 7.6
pair 7.6
east 7.5
monument 7.5
clothing 7.3
body 7.2
disguise 7.1
shoe 7

Google
created on 2019-07-08

Figurine 90.3
Carving 88.9
Stone carving 61.7
Nativity scene 61.6
Art 58.1
Clay 57.5
Statue 52.8

Microsoft
created on 2019-07-08

statue 90
sculpture 86.3
human face 78.8

Color Analysis

Face analysis

AWS Rekognition

Age 4-7
Gender Female, 97.3%
Happy 90.9%
Sad 1.2%
Disgusted 0.5%
Angry 0.9%
Surprised 1.9%
Calm 2.6%
Confused 2.1%
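
A minimal sketch of how face attributes like the age range, gender, and emotion confidences above could be requested from Amazon Rekognition's DetectFaces API; the filename and region are again illustrative assumptions:

    # Hedged sketch: Rekognition DetectFaces via boto3; filename/region assumed.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("daruma_and_okame.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # Emotion types come back upper-case (e.g. "HAPPY") with 0-100 confidences.
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")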

Microsoft Cognitive Services

Age 3
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
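
The Google Vision rows above are likelihood ratings rather than percentages. A minimal sketch of retrieving them with the google-cloud-vision client, assuming the 2.x Python client and an illustrative local filename:

    # Hedged sketch: Google Cloud Vision face detection; assumes the 2.x
    # Python client and an illustrative local filename.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("daruma_and_okame.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY),
        # rendered above as "Very unlikely", etc.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)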

Categories

Imagga

paintings art 96.7%
pets animals 2.8%

Captions