Human Generated Data

Title

Bottle

Date

Majiayao phase, c. 3300-2650 BCE

People
Classification

Vessels

Machine Generated Data

Tags

Amazon

Vase 97.8
Jar 97.8
Pottery 97.8
Urn 68.4

Clarifai

pottery 100
clay 99.7
vase 99.7
no person 99.5
earthenware 99.4
container 99.4
jug 99.2
ceramic 99
porcelain 98.3
art 97.2
handmade 96.8
carafe 96.8
one 96.8
traditional 95.5
pot 94.8
isolate 94.5
ancient 94
souvenir 93.4
decoration 93.4
jar 92.8

Imagga

vase 100
jar 100
vessel 92.5
container 76.8
earthenware 31
ceramic ware 28.2
bottle 22.7
glass 22.3
utensil 21.3
object 18.3
china 17.4
liquid 17.4
porcelain 16.9
drink 15.9
traditional 15.8
craft 15.3
decoration 15.2
decorative 15
jug 13.9
ancient 13.8
culture 13.7
art 13.7
clay 13.7
pot 13.5
brown 13.2
pottery 13.1
pitcher 12.3
old 11.8
food 11.5
beverage 10.8
ceramics 10.8
transparent 10.7
water 10.7
human 10.5
antique 10.4
anatomy 9.7
black 9.6
ornament 9.5
color 9.4
closeup 9.4
yellow 9.3
alcohol 9.2
vintage 9.1
terracotta 8.9
restaurant 8.6
classical 8.6
nobody 8.5
fresh 8.5
wine 8.3
ornate 8.2
science 8
close 8
body 8
medical 7.9
medicine 7.9
design 7.9
3d 7.7
handle 7.6
biology 7.6
style 7.4
single 7.4
plastic 7.3
history 7.2
wooden 7

Google

earthenware 97.6
Vase 96.7
Ceramic 94.2
Pottery 88.8
Artifact 85.4
Art 62.5
Urn 52.8
Interior design 52.6

Microsoft

vase 93.7
ceramic ware 87.2
statue 83.5
sculpture 79.3
vessel 72.7
plant 71.5
pottery 65.6
earthenware 55.4
stoneware 37.4
jar 30
porcelain 25.7
cup 23.4
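The five services above disagree on specifics, but a few core labels recur across all of them. A minimal sketch of finding the shared tags, using a hand-copied, case-normalized subset of the labels and confidence scores listed above (truncated to a few per service for brevity):

```python
# Top tags (label -> confidence %) per service, copied from the lists above
# and lowercased; this is an illustrative subset, not the full output.
tags = {
    "Amazon": {"vase": 97.8, "jar": 97.8, "pottery": 97.8, "urn": 68.4},
    "Clarifai": {"pottery": 100, "clay": 99.7, "vase": 99.7, "jar": 92.8},
    "Imagga": {"vase": 100, "jar": 100, "vessel": 92.5, "pottery": 13.1},
    "Google": {"earthenware": 97.6, "vase": 96.7, "ceramic": 94.2, "pottery": 88.8},
    "Microsoft": {"vase": 93.7, "pottery": 65.6, "jar": 30},
}

# A tag is "shared" when every service reports it for this object.
shared = set.intersection(*(set(t) for t in tags.values()))
print(sorted(shared))  # -> ['pottery', 'vase']
```

Note that agreement on the label name says nothing about agreement on confidence: Imagga scores "pottery" at 13.1 while Clarifai scores it at 100, so any aggregation across services would also need to reconcile their very different scoring scales.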

Face analysis

Amazon

AWS Rekognition

Age 14-25
Gender Female, 88.2%
Angry 3.5%
Disgusted 0.3%
Happy 0.8%
Confused 2.4%
Sad 13%
Calm 77.9%
Surprised 1.9%

Captions

Microsoft

a vase sitting on a table 38.3%
a vase sitting on top of a table 36.1%
a close up of a vase 36%

Text analysis

Amazon

O