Human Generated Data

Title

Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Dr. George C. Scanlon, 1970.157.132

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2019-07-08

Fossil 96.9
Bread 89.6
Food 89.6
Plant 79.7
Biscuit 59.9
Cookie 59.9
Wood 58.8
Rock 56.3

Clarifai
created on 2019-07-08

food 98.1
no person 96.9
sweet 96.7
desktop 96.5
closeup 93.3
pastry 90.7
traditional 90
delicious 89.7
stranded 88
refreshment 87.5
bakery 87.3
chocolate 87.3
breakfast 86.3
tasty 85.4
wood 85.1
old 84.4
sugar 84.2
gold 84.1
color 83.5
cake 81.9

Imagga
created on 2019-07-08

stamp 73.4
die 60.2
shaping tool 43.8
nut and bolt 36.7
fastener 32.4
tool 28.5
birdhouse 24.7
restraint 24.4
device 20.8
shelter 19.7
food 16.9
decoration 16.9
protective covering 16.5
brown 15.4
object 15.4
sweet 12.6
gold 12.3
close 12
corbel 11.9
old 11.8
closeup 11.4
dessert 10.6
slice 10
bracket 10
amulet 9.9
texture 9.7
pastry 9.5
paper 9.4
snack 9.4
grunge 9.4
season 9.3
nobody 9.3
holiday 9.3
delicious 9.1
metal 8.8
natural 8.7
ornament 8.6
golden 8.6
bakery 8.6
piece 8.5
cake 8.5
dark 8.3
bread 8.3
single 8.2
symbol 8.1
celebration 8
wooden 7.9
seasonal 7.9
charm 7.8
black 7.8
ancient 7.8
support 7.7
culture 7.7
eating 7.6
nutrition 7.5
eat 7.5
wood 7.5
traditional 7.5
festive 7.4
light 7.3
spice 7.3
color 7.2
cut 7.2
antique 7.1
steel 7.1
design 7

Google
created on 2019-07-08

Wood 69.2
Tree 68.7
Cuisine 59.8
Plant 58.5
Food 58.4
Baked goods 51.1

Microsoft
created on 2019-07-08

bread 32.7
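
Each service above pairs a label with a confidence score on a 0–100 scale. As an illustration only, the sketch below shows how a response shaped like AWS Rekognition's documented DetectLabels output (`{"Labels": [{"Name": ..., "Confidence": ...}]}`) could be flattened into the "label score" lines used in these lists; the sample values are taken from the Amazon tags recorded for this sherd, and the threshold is a hypothetical choice, not the museum's pipeline.

```python
# Sketch: flatten object-detection output into "label score" lines.
# The response shape mirrors AWS Rekognition's DetectLabels; the sample
# data echoes the Amazon tags above. Illustrative, not the museum's code.

def format_labels(response, min_confidence=50.0):
    """Return 'Name score' lines, highest confidence first."""
    labels = [
        (lab["Name"], lab["Confidence"])
        for lab in response["Labels"]
        if lab["Confidence"] >= min_confidence
    ]
    labels.sort(key=lambda pair: pair[1], reverse=True)
    return [f"{name} {score:.1f}" for name, score in labels]

# Sample data echoing the Amazon tags created on 2019-07-08.
sample = {"Labels": [
    {"Name": "Fossil", "Confidence": 96.9},
    {"Name": "Bread", "Confidence": 89.6},
    {"Name": "Rock", "Confidence": 56.3},
    {"Name": "Moss", "Confidence": 42.0},  # below threshold, dropped
]}

print("\n".join(format_labels(sample)))
```

The same formatting applies to any of the services above once their responses are normalized to a name/confidence pair.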

Face analysis

Amazon

AWS Rekognition

Age 49-69
Gender Male, 95.8%
Sad 5.5%
Disgusted 0.4%
Confused 4.6%
Angry 9.5%
Happy 2.2%
Surprised 3.4%
Calm 74.4%
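
The emotion percentages above sum to roughly 100, with Calm dominant. As a hedged sketch, the snippet below reads the dominant emotion from a record shaped like AWS Rekognition's documented DetectFaces output (`FaceDetails` → `AgeRange`, `Gender`, `Emotions`); the structure and values mirror the listing above, and the parsing is illustrative rather than the museum's actual code.

```python
# Sketch: pick the dominant emotion from a face-analysis record shaped
# like AWS Rekognition's DetectFaces output. Values come from the
# analysis listed above; this is an illustrative parse only.

face = {
    "AgeRange": {"Low": 49, "High": 69},
    "Gender": {"Value": "Male", "Confidence": 95.8},
    "Emotions": [
        {"Type": "SAD", "Confidence": 5.5},
        {"Type": "DISGUSTED", "Confidence": 0.4},
        {"Type": "CONFUSED", "Confidence": 4.6},
        {"Type": "ANGRY", "Confidence": 9.5},
        {"Type": "HAPPY", "Confidence": 2.2},
        {"Type": "SURPRISED", "Confidence": 3.4},
        {"Type": "CALM", "Confidence": 74.4},
    ],
}

def dominant_emotion(face_detail):
    """Return the (type, confidence) pair with the highest confidence."""
    top = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face))  # ('CALM', 74.4)
```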

Feature analysis

Amazon

Bread 89.6%

Categories

Imagga

paintings art 99.9%
