Human Generated Data

Title

Bird-shaped jar

Date

Banshan phase, c. 2650-2300 BCE

People

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Partial gift of the Walter C. Sedgwick Foundation and partial purchase through the Ernest B. and Helen Pratt Dane Fund for Asian Art, 2006.170.34

Machine Generated Data

Tags

Amazon
created on 2020-01-23

Jar 99
Pottery 99
Vase 99
Animal 91.3
Sea Life 91.3
Turtle 91.3
Reptile 91.3
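
The Amazon tags above are the kind of output returned by AWS Rekognition label detection. A minimal sketch with boto3, assuming AWS credentials are configured in the environment; the region and local image file name are hypothetical placeholders:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("bird_shaped_jar.jpg", "rb") as f:  # hypothetical local image file
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=80.0,  # only return labels at or above 80% confidence
)

# Each label carries a name and a 0-100 confidence, e.g. "Jar 99.0".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')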

Clarifai
created on 2020-01-23

pottery 100
vase 99.9
clay 99.8
no person 99.4
art 99.1
ceramic 99
one 99
sculpture 98.9
earthenware 98.8
container 98.6
still life 98.2
ancient 97.6
porcelain 96.6
pot 96.6
urn 95.9
jug 95.9
museum 95.7
isolate 95.5
old 95.5
jar 95.3
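
The Clarifai tags above correspond to concept predictions from Clarifai's general model. A minimal sketch against the v2 REST predict endpoint using requests; the endpoint path, model id, and response shape are assumptions to verify against Clarifai's current documentation, and the API key and image URL are hypothetical placeholders:

import requests

API_KEY = "<clarifai-api-key>"  # hypothetical credential
IMAGE_URL = "https://example.org/bird_shaped_jar.jpg"  # hypothetical image URL

# Model id and endpoint path are assumptions; check Clarifai's documentation.
response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts come back with a 0-1 value; scale to percent, e.g. "pottery 100.0".
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')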

Imagga
created on 2020-01-23

vase 100
jar 100
vessel 95
container 67.8
decoration 36.2
ornament 30.2
ball 27.1
holiday 25.8
celebration 24.7
china 24.4
decorative 23.4
ceramic ware 22.3
winter 22.1
earthenware 22
shiny 21.4
season 21
seasonal 21
glass 21
object 20.5
tradition 20.3
sphere 20.2
traditional 20
round 19.8
gold 18.9
decorate 18.1
festive 16.6
utensil 16.6
porcelain 16.4
celebrate 16.3
merry 16.2
balls 15.3
bauble 14.7
glitter 14.5
bright 12.9
ornate 12.8
snow 12.5
decor 12.4
ornamental 12.3
pot 11.8
brown 11.8
art 11.7
ornaments 11.6
design 11.3
shine 11
antique 10.4
golden 10.3
pottery 9.8
new 9.7
close 9.7
craft 9.5
gift 9.5
symbol 9.4
ribbon 9.3
tree 9.2
teapot 9.2
year 9.1
food 9.1
one 9
color 8.9
closeup 8.8
decorations 8.7
yellow 8.6
tea 8.5
flower 8.5
old 8.4
globe 8.3
event 8.3
nobody 7.8
party 7.7
culture 7.7
focus 7.4
single 7.4
present 7.3
world 7.1
vibrant 7
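
The Imagga tags above are the kind of output returned by Imagga's tagging endpoint. A minimal sketch with requests, assuming the v2 /tags endpoint and its basic-auth scheme (worth checking against Imagga's current documentation); the key, secret, and image URL are hypothetical placeholders:

import requests

API_KEY = "<imagga-api-key>"        # hypothetical credential
API_SECRET = "<imagga-api-secret>"  # hypothetical credential
IMAGE_URL = "https://example.org/bird_shaped_jar.jpg"  # hypothetical image URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # basic auth with key and secret
)
response.raise_for_status()

# Each entry pairs a tag with a 0-100 confidence, e.g. "vase 100.0".
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')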

Google
created on 2020-01-23

earthenware 97.9
Ceramic 92.9
Pottery 91.8
Artifact 88.4
Vase 86.9
Art 65.5
Urn 55.2
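
The Google tags above are the kind of output returned by Cloud Vision label detection. A minimal sketch with the google-cloud-vision client library, assuming application-default credentials; the local image file name is a hypothetical placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("bird_shaped_jar.jpg", "rb") as f:  # hypothetical local image file
    content = f.read()

response = client.label_detection(image=vision.Image(content=content))

# Each annotation carries a description and a 0-1 score; scale to percent,
# e.g. "earthenware 97.9".
for annotation in response.label_annotations:
    print(f"{annotation.description} {annotation.score * 100:.1f}")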

Microsoft
created on 2020-01-23

vase 98.3
indoor 89.6
art 85.6
pottery 84.8
ceramic ware 84.4
earthenware 79
still life photography 76.1
teapot 73.9
still life 68.6
ceramic 67.8
urn 67.4
design 61.4
plant 45.8
stoneware 31.9
porcelain 14.4
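
The Microsoft tags above are the kind of output returned by the Azure Computer Vision tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image file are hypothetical placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # hypothetical endpoint
    CognitiveServicesCredentials("<subscription-key>"),      # hypothetical key
)

with open("bird_shaped_jar.jpg", "rb") as f:  # hypothetical local image file
    result = client.tag_image_in_stream(f)

# Each tag carries a name and a 0-1 confidence; scale to percent, e.g. "vase 98.3".
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")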

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-42
Gender Female, 98.5%
Fear 0.1%
Happy 10.2%
Sad 0.6%
Angry 1.8%
Disgusted 0.1%
Calm 86.9%
Confused 0.1%
Surprised 0.2%
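
The age, gender, and emotion estimates above are the kind of output returned by AWS Rekognition face detection; on an object photograph such as this jar, any detected face is a false positive. A minimal sketch with boto3, assuming configured credentials; the region and image file name are hypothetical placeholders:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("bird_shaped_jar.jpg", "rb") as f:  # hypothetical local image file
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')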

Feature analysis

Amazon

Turtle 91.3%

Captions

Microsoft

a vase sitting on a table 47%
a vase sitting on top of a table 43.9%
a vase is sitting on a table 36.5%
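
The ranked captions above are the kind of output returned by the Azure Computer Vision describe operation. A minimal sketch with the same SDK as the tagging example; endpoint, key, and image file are hypothetical placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # hypothetical endpoint
    CognitiveServicesCredentials("<subscription-key>"),      # hypothetical key
)

with open("bird_shaped_jar.jpg", "rb") as f:  # hypothetical local image file
    description = client.describe_image_in_stream(f, max_candidates=3)

# Each candidate caption carries text and a 0-1 confidence, e.g.
# "a vase sitting on a table 47.0%".
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")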