Human Generated Data

Title

Pelike depicting Helen and Paris

Date

420-400 BCE

People

Artist: The Painter of Louvre G 539

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of Joseph C. Hoppin, 1925.30.46

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Jar 99.1
Pottery 98.4
Vase 97.2
Urn 91.8
Plant 56.8
Potted Plant 56.8

Clarifai
created on 2018-02-19

pottery 99.9
no person 99.9
vase 99.7
container 99.7
antique 99.6
ancient 99.6
retro 99.6
art 99.5
clay 99.5
jug 99.3
old 98.8
vintage 98.6
arts and crafts 97.7
traditional 97.7
one 97.5
decoration 97.4
handmade 97.2
military 97.1
isolate 96.3
painting 95

Imagga
created on 2018-02-19

vessel 100
pitcher 100
container 100
vase 39.6
jug 37.3
traditional 31.6
tea 28.2
pot 28.1
drink 27.6
object 26.4
teapot 26.1
handle 25.8
cup 23.6
china 23
jar 22.6
decoration 22.4
old 22.3
culture 22.2
art 20.8
pottery 20.8
antique 19.9
ceramic 19.4
glass 16.3
decorative 15.9
bottle 15.8
ceramics 15.7
ornate 15.6
ancient 14.7
clay 14.6
beverage 14.5
craft 14.3
kitchen 14.3
brown 14
single 14
retro 13.9
earthenware 12.9
utensil 12.9
vintage 12.4
black 12
yellow 11.9
pattern 11.6
gold 11.5
breakfast 11.5
food 11.5
close 11.4
water 11.3
oriental 11.3
ornament 11.2
coffee 11.1
lid 10.8
porcelain 10.6
color 10.6
classical 10.5
design 10.1
flower 10
kettle 9.9
decor 9.7
ceremony 9.7
handmade 9.7
mug 9.6
liquid 9.6
closeup 9.4
hot 9.2
history 9
metal 8.8
brew 8.8
empty 8.6
leaf 8.6
tradition 8.3
style 8.2
transparent 8.1
celebration 8
texture 7.6
herbal 7.6
old fashioned 7.6
healthy 7.6
east 7.5
one 7.5
classic 7.4
heat 7.4

Google
created on 2018-02-19

pottery 76.5
artifact 74.5
ceramic 70.5
vase 69.6
jug 62.9
serveware 55.9

Microsoft
created on 2018-02-19

wall 99
vessel 97
indoor 91.4
plant 83.7
black 71.6
painted 38.7
jar 31.9
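
The machine-generated sections above are flat lists of label/confidence pairs. As an illustrative sketch (not part of the museum record), such lines can be parsed and filtered by a confidence threshold; the sample data is copied from the Google section of this record, and the function name and the 70.0 threshold are assumptions chosen for demonstration.

```python
def parse_tags(lines, threshold=70.0):
    """Parse 'label confidence' lines into (label, score) pairs,
    keeping only scores at or above the threshold. Labels may
    contain spaces (e.g. 'no person'), so split on the last space."""
    tags = []
    for line in lines:
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return [(label, score) for label, score in tags if score >= threshold]

# Sample data copied from the Google tag section above.
google_tags = [
    "pottery 76.5",
    "artifact 74.5",
    "ceramic 70.5",
    "vase 69.6",
    "jug 62.9",
    "serveware 55.9",
]

print(parse_tags(google_tags))
# keeps only pottery, artifact, and ceramic
```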

Face Analysis

Amazon

AWS Rekognition

Age 20-38
Gender Male, 50.3%
Confused 45.3%
Surprised 45.4%
Disgusted 45.4%
Calm 47.6%
Sad 46.8%
Happy 45.2%
Angry 49.3%

AWS Rekognition

Age 15-25
Gender Male, 50.9%
Angry 47%
Happy 45.1%
Calm 46.7%
Sad 45.2%
Confused 45.4%
Surprised 45.3%
Disgusted 50.3%

Captions

Microsoft

a vase sitting on a table 68.7%
a vase is sitting on a table 59.1%
a black vase sitting on a table 58.3%

Text Analysis

Amazon

UUUUUUUVUUU