Human Generated Data

Title

Fragment of Megarian ware

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, 1977.216.2528

Machine Generated Data

Tags

Amazon
created on 2019-03-26

Bronze 99.5
Armor 79.6
Coin 70.2
Money 70.2
Fossil 60

Clarifai
created on 2019-03-26

no person 98.6
art 98.1
old 98
one 97.7
ancient 96.7
antique 95.8
round out 95.1
container 94.4
wear 94.1
vintage 94.1
retro 92.4
print 90.2
weapon 90
painting 89.2
museum 87.9
illustration 87.6
cutout 87.2
furniture 87
people 86.3
still life 86.3

Imagga
created on 2019-03-26

shield 37.2
chest 35.4
box 31
brown 30.9
armor 26.1
food 26.1
protective covering 23.8
container 22.5
breastplate 17.9
close 17.1
old 16
wood 15.8
diet 15.3
bread 14.8
loaf 14.6
bakery 14.4
armor plate 14.3
nutrition 14.3
fresh 13.7
antique 13.7
covering 13.3
wooden 13.2
slice 12.7
wheat 12.4
gold 12.3
healthy 12
plate 11.7
tasty 11.7
decoration 11.7
treasure 11.6
delicious 11.6
closeup 11.4
baked 11.2
seat 11.1
object 11
eat 10.9
furniture 10.8
chocolate 10.3
sweet 10.3
ottoman 10.2
money 10.2
fastener 10.1
earthenware 10.1
eating 10.1
organic 10.1
device 10
meal 9.7
piece 9.5
golden 9.5
natural 9.4
dark 9.2
cut 9
breakfast 8.8
ancient 8.6
whole 8.6
snack 8.5
pastry 8.5
vintage 8.3
full 8.2
wealth 8.1
restraint 8
hay 8
rural 7.9
shell 7.9
agriculture 7.9
trunk 7.7
crust 7.7
jewelry 7.7
finance 7.6
field 7.5
rich 7.4
dry 7.4
symbol 7.4
grain 7.4
countryside 7.3
yellow 7.3
metal 7.2
ceramic ware 7.2
currency 7.2

Google
created on 2019-03-26

Geology 70.2
Rock 54.2
Artifact 53.2

Microsoft
created on 2019-03-26

ceramic ware 23.2
museum 23.2
art 19.2
ancient 16.2
rock 5.2
ceramic 4.9

Color Analysis

Face Analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 55.2%
Happy 11.4%
Surprised 1.9%
Angry 2.9%
Disgusted 0.7%
Confused 1%
Sad 19.6%
Calm 62.4%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2019-03-26

a close up of a box 50.6%
close up of a box 45.8%