Human Generated Data

Title

Tripod Pyxis: Sirens

Date

6th century BCE

People

-

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of David M. Robinson, 1960.289

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Pottery 94.8
Bowl 84
Pot 77.4

Clarifai
created on 2019-07-07

container 99.8
lid 97.8
clay 97.6
pot 97
old 96.6
art 95.7
pottery 95.5
no person 95.2
vintage 94.7
ancient 93.1
box 90.9
wood 90.8
desktop 90.1
decoration 89.7
retro 89.3
craft 89.2
vase 89
antique 88.7
isolate 88.7
handmade 87.9

Imagga
created on 2019-07-07

earthenware 99.7
ceramic ware 77.6
utensil 54.8
majolica 35.5
cup 35
container 30.6
pottery 20.2
brown 19.1
pot 17.8
drink 17.5
old 16.7
vessel 16.4
vase 16.2
object 16.1
ceramic 15.5
kitchen 15.2
bowl 14.6
decoration 14.5
beverage 13.8
money 13.6
wood 13.3
clay 12.7
close 12.6
art 12.5
traditional 12.5
tea 12.3
closeup 12.1
food 12.1
metal 12.1
single 11.5
antique 10.8
coins 10.6
wooden 10.5
empty 10.3
culture 10.3
china 10.1
cash 10.1
jug 9.9
saucer 9.7
nobody 9.3
finance 9.3
wealth 9
currency 9
financial 8.9
color 8.9
mug 8.9
breakfast 8.8
seat 8.8
objects 8.7
furniture 8.6
craft 8.6
coffee 8.3
banking 8.3
vintage 8.3
footstool 8.2
style 8.2
business 7.9
ceramics 7.8
handmade 7.8
pound 7.7
luxury 7.7
loan 7.7
tableware 7.6
economy 7.4
bread 7.4
investment 7.3
diet 7.3
bank 7.2
life 7

Google
created on 2019-07-07

earthenware 95.3
Pottery 80.1
Ceramic 76.3
Flowerpot 63.4
Art 58.1

Microsoft
created on 2019-07-07

vase 93.9
earthenware 93.7
bowl 92.9
pottery 91.5
ceramic 90.1
tableware 85.6
porcelain 76.4
ceramic ware 67.5
museum 67
mixing bowl 65.7
jug 65.6
art 62.6
kettle 59.5
plant 55.6
archaeology 52.4
pitcher 52.3
stoneware 18.4
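The provider tag lists above are label/confidence pairs (confidences in percent). A minimal sketch of merging two of the lists and keeping only high-confidence labels — the `tags` dict and `high_confidence` helper are illustrative, assuming the scores have been parsed into floats:

```python
# Hypothetical parsed form of two of the tag lists above: provider -> [(label, confidence %)]
tags = {
    "Amazon": [("Pottery", 94.8), ("Bowl", 84.0), ("Pot", 77.4)],
    "Google": [("earthenware", 95.3), ("Pottery", 80.1), ("Ceramic", 76.3),
               ("Flowerpot", 63.4), ("Art", 58.1)],
}

def high_confidence(tags, threshold=80.0):
    """Collect labels at or above the threshold, case-folded to dedupe across providers."""
    best = {}
    for provider, pairs in tags.items():
        for label, conf in pairs:
            key = label.lower()
            best[key] = max(best.get(key, 0.0), conf)
    return sorted(label for label, conf in best.items() if conf >= threshold)

print(high_confidence(tags))  # → ['bowl', 'earthenware', 'pottery']
```

Case-folding matters here because providers disagree on capitalization (Amazon's "Pottery" vs Imagga's "pottery").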

Face Analysis

Amazon

AWS Rekognition

Age 9-14
Gender Male, 53.7%
Happy 5.6%
Confused 10.1%
Angry 9.1%
Disgusted 3.5%
Surprised 7%
Calm 19.9%
Sad 44.7%
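The Rekognition block above reports a confidence score per emotion; the dominant emotion is simply the maximum. A sketch assuming the scores have been parsed into a dict:

```python
# Emotion confidence scores (%) from the AWS Rekognition face analysis above.
emotions = {
    "Happy": 5.6, "Confused": 10.1, "Angry": 9.1, "Disgusted": 3.5,
    "Surprised": 7.0, "Calm": 19.9, "Sad": 44.7,
}

# Pick the emotion with the highest confidence.
dominant = max(emotions, key=emotions.get)
print(dominant, emotions[dominant])  # → Sad 44.7
```

Note the scores are independent confidences, not a probability distribution, so they need not sum to 100.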

Categories

Imagga

food drinks 94.1%
interior objects 5.2%

Captions

Microsoft
created on 2019-07-07

a cup of coffee 34.1%
a cup of coffee on a table 25.9%