Human Generated Data

Title

Nestoris (two-handled jar) with Mythological Scenes

Date

340-320 BCE

People

Artist: Choephoroi Painter, Greek, 360-330 BCE

Classification

Vessels

Machine Generated Data

Tags

Amazon

Jar 99.6
Pottery 99.2
Urn 96.9
Vase 95.5
Bag 92.4
Handbag 92.4
Accessories 92.4
Accessory 92.4

Clarifai

no person 99.9
vase 99.5
pottery 99.5
ancient 99.4
container 99.3
art 99.3
urn 98.3
arts and crafts 98.2
antique 98.2
metalwork 98.1
interior design 98.1
decoration 97.3
isolate 97
one 96.7
traditional 96.1
ornate 95.9
clay 95.8
handmade 95
sculpture 94.8
jug 94.1

Imagga

china 83
ceramic ware 72.8
teapot 59.8
porcelain 59.2
utensil 58.1
pot 50.4
vessel 48.7
container 47.7
vase 44.1
tea 41.4
earthenware 39
drink 34.2
pottery 33.5
traditional 32.4
cup 31.9
object 29.3
jar 25.8
pitcher 25.5
old 24.4
antique 22.7
majolica 22.2
handle 21
beverage 20.9
ceramic 20.3
art 19.5
decoration 18.9
jug 18.1
retro 18
culture 17.9
water 16.7
ceramics 16.6
gold 16.4
kettle 15.8
single 15.6
ancient 15.6
glass 15.3
decorative 15
lid 14.7
brown 14.7
ornament 14.6
cooking utensil 14.4
close 14.3
hot 14.2
oriental 14.2
vintage 14.1
coffee 13.9
liquid 13
metal 12.9
ornate 12.8
clay 12.7
ceremony 12.6
craft 12.4
pattern 12.3
healthy 12
kitchen 11.6
decor 11.5
herbal 11.5
east 11.2
food 10.9
black 10.8
brew 10.8
flower 10.8
transparent 10.7
pour 10.7
yellow 10.6
mug 10.5
style 10.4
tradition 10.2
closeup 10.1
refreshment 10
boil 9.9
reflection 9.7
kitchen utensil 9.6
golden 9.5
color 9.5
design 9
cups 8.8
classical 8.6
elegant 8.6
old fashioned 8.6
cover 8.3
history 8
celebration 8
breakfast 8
copper 7.8
drinks 7.8
handmade 7.8
obsolete 7.7

Google

vase 90.1
artifact 84.9
pottery 82.6
ceramic 78.1
urn 71.2
porcelain 62.2
tableware 60.3
serveware 54.3

Microsoft

wall 98.7
indoor 87
plant 53.5
ceramic ware 31.9
decorative 30.5
painting 22.2
painted 20.2
porcelain 9.9
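The four services above tag the same image independently, so one way to read the lists is to look for labels that several services agree on. A minimal sketch, not part of the original record, using abbreviated tag sets copied from the Amazon, Clarifai, Google, and Microsoft sections:

```python
from collections import Counter

# Abbreviated, lowercased tag sets from the sections above (illustrative only)
amazon = {"jar", "pottery", "urn", "vase", "bag", "handbag"}
clarifai = {"vase", "pottery", "ancient", "container", "art", "urn", "jug"}
google = {"vase", "artifact", "pottery", "ceramic", "urn", "porcelain"}
microsoft = {"wall", "indoor", "plant", "ceramic ware", "decorative"}

# Count how many services emitted each tag, then keep those seen by 3+
counts = Counter(tag for service in (amazon, clarifai, google, microsoft)
                 for tag in service)
consensus = sorted(tag for tag, n in counts.items() if n >= 3)
print(consensus)  # -> ['pottery', 'urn', 'vase']
```

Here the three-service consensus matches the human classification ("Vessels") far better than any single service's top tag does.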

Face analysis

AWS Rekognition

Age 15-25
Gender Female, 53.5%
Disgusted 45.3%
Sad 45.1%
Happy 45.2%
Confused 45%
Angry 45.1%
Calm 54.1%
Surprised 45.2%

AWS Rekognition

Age 26-43
Gender Female, 54.2%
Happy 45.1%
Calm 50.3%
Sad 46.4%
Surprised 45.4%
Angry 46.1%
Disgusted 46.6%
Confused 45.2%

AWS Rekognition

Age 26-43
Gender Male, 51.3%
Disgusted 45%
Sad 45.1%
Calm 54.7%
Angry 45%
Surprised 45%
Confused 45%
Happy 45.1%

AWS Rekognition

Age 26-43
Gender Female, 53%
Confused 45.1%
Sad 45.3%
Disgusted 45.7%
Happy 45.5%
Calm 51.6%
Angry 45.5%
Surprised 46.3%

AWS Rekognition

Age 20-38
Gender Male, 54.5%
Disgusted 53.7%
Confused 45%
Angry 45.1%
Calm 46.1%
Happy 45%
Sad 45%
Surprised 45.1%

AWS Rekognition

Age 26-43
Gender Male, 54.3%
Sad 45.3%
Calm 46.1%
Disgusted 45.1%
Confused 45.1%
Surprised 45.3%
Happy 52.8%
Angry 45.3%

AWS Rekognition

Age 35-52
Gender Male, 54.8%
Sad 45.4%
Disgusted 45.7%
Angry 46.3%
Happy 49.9%
Calm 46.9%
Surprised 45.4%
Confused 45.3%
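The per-face emotion scores above are independent confidence values, not probabilities that sum to 100%, so a consumer of this data would typically just take the highest-scoring emotion per face. A minimal sketch, not the site's actual pipeline, using the scores from the first AWS Rekognition face entry in this record:

```python
# Emotion confidences for one detected face, copied from the first
# AWS Rekognition entry above (illustrative only)
scores = {
    "Disgusted": 45.3,
    "Sad": 45.1,
    "Happy": 45.2,
    "Confused": 45.0,
    "Angry": 45.1,
    "Calm": 54.1,
    "Surprised": 45.2,
}

# The dominant emotion is simply the argmax over the confidence values
dominant = max(scores, key=scores.get)
print(dominant)  # -> Calm
```

Note that most scores cluster near 45%, so only the margin above that baseline (here, "Calm" at 54.1%) is really informative.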

Microsoft Cognitive Services

Age 49
Gender Male

Microsoft Cognitive Services

Age 16
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Handbag 92.4%

Captions

Microsoft

a painting of a vase 63.4%
a painting of a vase on a table 59.6%
a vase sitting on a table 44.1%

Text analysis

Amazon

St