Human Generated Data

Title

Body Sherd: Hoof and Grape Vine

Date

530-500 BCE

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, David M. Robinson Fund, 2002.230.15

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Pottery 99%
Accessories 93.2%
Formal Wear 93.2%
Tie 93.2%
Bulldozer 92.9%
Machine 92.9%
Blackboard 70.5%
Arrow 57.5%
Weapon 57.5%
Arrowhead 56.8%

Clarifai
created on 2018-05-08

no person 98.5%
one 97.3%
wear 92.3%
people 88.8%
vintage 84.8%
art 84.5%
color 84.4%
desktop 84%
painting 81.6%
container 81.2%
old 79.4%
triangle 77.7%
building 77.2%
paper 76.1%
text 74.6%
sign 74.5%
cutout 74.1%
lid 74.1%
print 73.4%
adult 72.2%

Imagga
created on 2023-10-05

book 24%
lampshade 17.9%
container 17.2%
paper 17.2%
brown 16.9%
gift 15.5%
decoration 15%
shade 14.9%
food 14.4%
brush 14.4%
product 14.3%
close 14.3%
box 13.4%
texture 13.2%
covering 13%
delicious 12.4%
chocolate 12.3%
protective covering 12%
home 12%
design 11.9%
present 11.8%
nobody 11.7%
business 11.5%
object 11%
wooden 10.5%
gourmet 10.2%
closeup 10.1%
decorative 10%
tool 9.8%
old 9.7%
creation 9.6%
pattern 9.6%
tea 9.5%
sweet 9.5%
wallet 9.5%
piece 9.5%
sugar 9.4%
finance 9.3%
page 9.3%
tasty 9.2%
wood 9.2%
new 8.9%
binding 8.9%
color 8.9%
stamp 8.7%
yellow 8.6%
empty 8.6%
paintbrush 8.5%
card 8.5%
pen 8.5%
jigsaw puzzle 8.4%
fat 8.4%
gold 8.2%
candy 8.1%
open 8.1%
detail 8%
office 8%
dessert 7.9%
board 7.9%
holiday 7.9%
grunge 7.7%
money 7.6%
puzzle 7.6%
bookmark 7.6%
hand 7.6%
dark 7.5%
ribbon 7.4%
retro 7.4%
meal 7.3%
slice 7.3%
paint 7.2%
colorful 7.2%
bag 7%

Google
created on 2023-10-30

Color Analysis

Feature analysis

Amazon

Bulldozer 92.9%
Blackboard 70.5%

Captions

Microsoft
created on 2018-05-08

a close up of a piece of paper 39.1%
a close up of an object 39%
a close up of a logo 38.9%