Human Generated Data

Title

Ewer in the Form of a Dancing Lady Holding a Fruiting Peach Branch, the Stopper in the Form of a Bun of Hair

Date

probably mid 16th century

People

-

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of Samuel C. Davis, 1940.228.A-B

Machine Generated Data

Tags

Amazon
created on 2019-07-06

Figurine 99.8
Clothing 96.9
Helmet 96.9
Apparel 96.9
Porcelain 79.7
Pottery 79.7
Art 79.7
Human 75.1
Person 75.1
Worship 59.6
Buddha 56.7
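
The Amazon tags above are label-detection results of the kind returned by AWS Rekognition's DetectLabels API, which reports label names with confidence scores on a 0-100 scale. A minimal Python sketch of such a call using boto3 (with a hypothetical local image file; not necessarily the pipeline used to generate this record):

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("object_image.jpg", "rb") as f:  # hypothetical photo of the ewer
    image_bytes = f.read()

# DetectLabels returns label names and confidences, matching the
# "Figurine 99.8" style of the list above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")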

Clarifai
created on 2019-07-06

figurine 97.8
traditional 96.8
one 95.9
wear 95.8
art 95.6
sculpture 94.7
people 94.6
doll 94.6
Christmas 94.1
no person 93.6
adult 93.5
costume 93.2
winter 92.7
woman 92.3
lid 92.1
toy 91.1
veil 88.9
man 88.1
celebration 86.1
decoration 85.4

Imagga
created on 2019-07-06

dancer 48.3
performer 46.2
entertainer 35.6
sculpture 29.6
costume 27
statue 26.6
art 23.5
religion 23.3
culture 23.1
person 21.5
traditional 18.3
temple 18.1
oriental 17.3
ancient 16.4
automaton 15.7
gold 15.6
tradition 14.8
god 14.4
peace 13.7
plaything 13.2
armor 12.3
man 12.2
golden 12
figure 12
fun 12
spirituality 11.5
doll 11.5
colorful 11.5
face 11.4
toy 11
decoration 10.9
holiday 10.8
male 10.6
worship 10.6
color 10.6
bust 10.6
human 10.5
east 10.3
head 10.1
hat 10
cartoon 9.8
warrior 9.8
old 9.8
spiritual 9.6
religious 9.4
travel 9.2
suit 9
history 8.9
people 8.9
celebration 8.8
meditation 8.6
present 8.2
symbol 8.1
metal 8
clothing 7.9
comedian 7.8
body armor 7.8
portrait 7.8
winter 7.7
card 7.7
power 7.6
happy 7.5
monument 7.5
tourism 7.4
mask 7.2
body 7.2
season 7

Google
created on 2019-07-06

Figurine 96.2
Statue 93.3
Porcelain 90.8
Toy 84.6
Ceramic 83.7
Sculpture 76.9
Geisha 67.5
Kimono 64.7
Art 58.1
Costume 50.3

Microsoft
created on 2019-07-06

statue 82.7

Color Analysis

Face analysis

AWS Rekognition

Age 1-5
Gender Female, 64.4%
Surprised 17%
Disgusted 4%
Calm 40.1%
Angry 8.8%
Happy 6.8%
Sad 5.6%
Confused 17.7%
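
The age range, gender, and emotion values above follow the shape of Amazon Rekognition's DetectFaces output when all face attributes are requested. A minimal boto3 sketch (again with a hypothetical local image; shown only to illustrate where such values come from):

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("object_image.jpg", "rb") as f:  # hypothetical photo of the ewer
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:  # e.g. CALM, HAPPY, SAD, ANGRY, CONFUSED
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")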

Microsoft Cognitive Services

Age 30
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
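
Google Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than numeric scores, which is why the values above are verbal. A minimal sketch with the google-cloud-vision client (hypothetical image file; not necessarily how this record was produced):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("object_image.jpg", "rb") as f:  # hypothetical photo of the ewer
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enum values index into this tuple, as in the Cloud Vision samples.
likelihood_names = (
    "Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"
)

for face in response.face_annotations:
    print("Surprise", likelihood_names[face.surprise_likelihood])
    print("Anger", likelihood_names[face.anger_likelihood])
    print("Sorrow", likelihood_names[face.sorrow_likelihood])
    print("Joy", likelihood_names[face.joy_likelihood])
    print("Headwear", likelihood_names[face.headwear_likelihood])
    print("Blurred", likelihood_names[face.blurred_likelihood])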

Feature analysis

Amazon

Helmet 96.9%
Person 75.1%

Categories

Imagga

paintings art 69.1%
food drinks 27.2%
macro flowers 3%

Captions

Microsoft
created on 2019-07-06

a close up of a vase 33.9%

Text analysis

Google

a
a