Human Generated Data

Title

Eye Cup

Date

c. 530 BCE

People
Classification

Vessels

Machine Generated Data

Tags

Amazon

Plant 89.4
Pottery 73.3
Food 66.8
Vegetable 66.8
Piggy Bank 62.7
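The numbers beside each tag are confidence scores (0–100). A common way to consume such output is to keep only tags above a threshold. Below is a minimal sketch using the Amazon labels listed above; the list-of-pairs structure and the `confident_tags` helper are illustrative assumptions, not part of any vendor API.

```python
# Hypothetical structure mirroring the Amazon tag list above:
# (label, confidence) pairs on a 0-100 scale.
amazon_tags = [
    ("Plant", 89.4),
    ("Pottery", 73.3),
    ("Food", 66.8),
    ("Vegetable", 66.8),
    ("Piggy Bank", 62.7),
]

def confident_tags(tags, threshold=70.0):
    """Return tags at or above the threshold, highest confidence first."""
    return sorted(
        [(label, score) for label, score in tags if score >= threshold],
        key=lambda pair: pair[1],
        reverse=True,
    )

print(confident_tags(amazon_tags))  # → [('Plant', 89.4), ('Pottery', 73.3)]
```

At a 70% threshold only "Plant" and "Pottery" survive, which matches the human-assigned classification ("Vessels") far better than the low-confidence "Piggy Bank" guess.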

Clarifai

one 97.2
face 97.1
no person 92.1
portrait 91.4
head 89.7
eye 89.2
mammal 88.3
art 88
desktop 87.9
vintage 83.7
cutout 81.7
color 80.2
animal 79.6
mouth 78.9
decoration 78.6
pottery 76.6
still life 74.8
lid 74.7
old 74.7
studio 74.6

Imagga

mask 52.8
covering 44.9
disguise 40.3
container 38.7
teapot 36.1
pot 35.3
attire 27.1
vessel 25.2
spindle 23
cup 20.4
stick 17.9
decoration 17.5
clothing 17.2
traditional 16.6
object 16.1
celebration 13.5
utensil 13
close 12.5
ornament 12.1
bank 11.6
holiday 11.5
face 11.4
earthenware 11.1
clip 10.6
piggy 10.6
art 10.5
old 10.4
brown 10.3
symbol 10.1
wealth 9.9
fastener 9.9
antique 9.8
design 9.6
objects 9.6
purse 9.5
color 9.4
gold 9
ancient 8.6
culture 8.5
money 8.5
food 8.4
vase 8.4
tradition 8.3
toy 8.3
animal 7.9
ball 7.9
black 7.8
season 7.8
jewel 7.7
bag 7.6
savings 7.4
closeup 7.4
pottery 7.2
shiny 7.1

Google

product design 63.4
ceramic 59.3
teapot 56.3
pottery 56.3
artifact 55.8
copper 55.6
masque 54.3
mask 52.6

Microsoft

cup 61
kylix 50.2
tableware 48.8

Face analysis

Amazon

AWS Rekognition

Age 26-44
Gender Male, 76.6%
Happy 2.5%
Surprised 1.8%
Angry 2.3%
Sad 2.6%
Confused 4.3%
Disgusted 0.4%
Calm 86.1%
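Rekognition reports a confidence per emotion; consumers usually reduce this to the single highest-scoring one. A minimal sketch using the values above; the dict layout and `dominant_emotion` helper are assumptions for illustration, not a Rekognition API call.

```python
# Emotion confidences (in %) copied from the face-analysis block above.
emotions = {
    "Happy": 2.5, "Surprised": 1.8, "Angry": 2.3, "Sad": 2.6,
    "Confused": 4.3, "Disgusted": 0.4, "Calm": 86.1,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(emotions))  # → ('Calm', 86.1)
```

Here the face painted on the cup reads as overwhelmingly "Calm" (86.1%), with every other emotion below 5%.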

Captions

Microsoft

a close up of a painted wall 49.8%
a cup of coffee 19.2%
a close up of a cup 19.1%