Human Generated Data

Title

Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Dr. George C. Scanlon, 1970.157.178
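
The credit line records accession number 1970.157.178, which can be used to pull this record programmatically. Below is a minimal sketch against the Harvard Art Museums public API; the API key is a placeholder and the accessionnumber query field is an assumption to be checked against the API documentation.

    # Sketch: fetch this object's catalog record from the Harvard Art Museums API.
    # The API key is a placeholder; the accessionnumber field name is an assumption.
    import requests

    API_KEY = "YOUR_API_KEY"
    resp = requests.get(
        "https://api.harvardartmuseums.org/object",
        params={"apikey": API_KEY, "q": "accessionnumber:1970.157.178"},
        timeout=30,
    )
    resp.raise_for_status()
    for record in resp.json().get("records", []):
        print(record.get("title"), "-", record.get("classification"))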

Machine Generated Data

Tags

Amazon
created on 2019-07-08

Animal 94
Reptile 94
Snake 94
Food 84.1
Bread 83.6
Arrowhead 71.6
Sweets 68.5
Confectionery 68.5
Cookie 61.1
Biscuit 61.1
Cracker 58.9
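
Labels of this kind, each paired with a confidence score, are what Amazon Rekognition's detect_labels operation returns. A minimal sketch with boto3, assuming configured AWS credentials; the image filename is a placeholder.

    # Sketch: label detection with Amazon Rekognition via boto3.
    # Assumes configured AWS credentials; "sherd.jpg" is a placeholder path.
    import boto3

    client = boto3.client("rekognition")
    with open("sherd.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=50,
        )
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')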

Clarifai
created on 2019-07-08

no person 98.7
one 97.6
food 96.9
jewelry 93.6
still life 87.7
grow 86.9
luxury 85.7
desktop 85
luck 84.1
cutout 82
two 80.6
color 79.6
gambling 78.7
indoors 76.9
simplicity 76.6
little 75
broken 74.5
leisure 73.7
stone 73.3
blur 72.2
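
Concept scores like these come from Clarifai's general image-recognition model, reported on a 0-1 scale (shown here as percentages). A rough sketch against Clarifai's v2 predict endpoint; the API key, model ID, and image URL are placeholders to be filled in from Clarifai's documentation.

    # Sketch: concept prediction with the Clarifai v2 REST API.
    # API key, model ID, and image URL are placeholders.
    import requests

    API_KEY = "YOUR_CLARIFAI_KEY"
    MODEL_ID = "GENERAL_MODEL_ID"  # assumption: take the general model ID from Clarifai's docs
    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/sherd.jpg"}}}]},
        timeout=30,
    )
    resp.raise_for_status()
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')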

Imagga
created on 2019-07-08

hole 64.8
ocarina 61.2
wind instrument 48.9
pencil sharpener 40.9
musical instrument 36.7
sharpener 33.7
close 18.8
game 16
dice 15.7
play 15.5
gamble 13.7
risk 13.4
leisure 13.3
gambling 12.6
luck 12.5
closeup 12.1
black 11.4
yellow 11.3
fun 11.2
old 11.1
objects 10.4
food 10.3
brown 10.3
two 10.2
wood 10
die 9.8
chance 9.8
cube 9.7
win 9.6
roll 9.5
snack 9.4
one 9
casino 8.8
color 8.3
texture 8.3
detail 8
holiday 7.9
money 7.7
health 7.6
pair 7.6
number 7.5
candy 7.3
wooden 7
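
Tags with confidence scores in this form correspond to Imagga's /v2/tags endpoint. A minimal sketch; the API key/secret pair and the image URL are placeholders.

    # Sketch: auto-tagging with the Imagga REST API.
    # API key, secret, and image URL are placeholders.
    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/sherd.jpg"},
        auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),
        timeout=30,
    )
    resp.raise_for_status()
    for tag in resp.json()["result"]["tags"]:
        print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')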

Google
created on 2019-07-08

Beige 70.8
Finger food 64
Snack 56.4
Fashion accessory 54.6
Baked goods 53.8
Cookie 52.9
Food 51.7
Art 50.2
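
These labels match the output of Google Cloud Vision's label detection, whose scores are returned on a 0-1 scale. A minimal sketch with the google-cloud-vision client, assuming application credentials are configured; the image path is a placeholder.

    # Sketch: label detection with Google Cloud Vision.
    # Assumes GOOGLE_APPLICATION_CREDENTIALS is set; "sherd.jpg" is a placeholder path.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("sherd.jpg", "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")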

Color Analysis

Face Analysis

Amazon

AWS Rekognition

Age 23-38
Gender Female, 88.7%
Surprised 5.5%
Calm 13.5%
Sad 66.2%
Disgusted 1.5%
Confused 6.1%
Angry 3%
Happy 4.2%
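
The age range, gender estimate, and emotion scores above are the fields Amazon Rekognition's detect_faces operation returns when all attributes are requested (here evidently a false-positive face found on the sherd). A minimal sketch with boto3; credentials are assumed and the image path is a placeholder.

    # Sketch: face attribute analysis with Amazon Rekognition.
    # Assumes configured AWS credentials; "sherd.jpg" is a placeholder path.
    import boto3

    client = boto3.client("rekognition")
    with open("sherd.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')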

Feature Analysis

Amazon

Snake 94%

Categories

Imagga

pets animals 91.9%
food drinks 6.3%
paintings art 1.1%
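
Category scores such as "pets animals" and "food drinks" come from Imagga's categorization endpoint rather than its tagging endpoint. A sketch against /v2/categories; the personal_photos categorizer is an assumption based on the category names, and the credentials and image URL are placeholders.

    # Sketch: image categorization with the Imagga REST API.
    # The categorizer name, credentials, and image URL are assumptions/placeholders.
    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/categories/personal_photos",
        params={"image_url": "https://example.com/sherd.jpg"},
        auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),
        timeout=30,
    )
    resp.raise_for_status()
    for category in resp.json()["result"]["categories"]:
        print(f'{category["name"]["en"]} {category["confidence"]:.1f}%')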

Captions

Microsoft
created on 2019-07-08

a piece of wood 66%
a close up of a beach 65.9%
a piece of paper 49.4%
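
Captions paired with confidence scores like these are produced by Azure Computer Vision's Describe Image operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders.

    # Sketch: image captioning with Azure Computer Vision (Describe Image).
    # Endpoint, key, and image URL are placeholders.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_AZURE_KEY"),
    )
    result = client.describe_image("https://example.com/sherd.jpg", max_candidates=3)
    for caption in result.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")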