Human Generated Data

Title

Red-figure Cup Fragment: feet of two dancers

Date

c. 470 BCE

People
Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Purchase through the generosity of Jonathan H. Kagan, Mr. and Mrs. Evangelos Karvounis, Ian M. Watson McLaughlin, Nicholas S. Zoullas and the Florence Gould Foundation, and the Alpheus Hyatt Purchasing, Director's Discretionary and Marian H. Phinney Funds, 1995.18.91

Machine Generated Data

Tags

Amazon
created on 2022-05-28

Apparel 83.9
Clothing 83.9
Soil 78.1
Cushion 71.4
Slate 70
Weapon 59.3
Weaponry 59.3
Fossil 58.9

Imagga
created on 2022-05-28

chocolate 100
food 85.4
brown 34.6
wallet 26.4
nobody 22.5
delicious 22.3
snack 22.2
sweet 22.1
dessert 22.1
tasty 21.7
confectionery 21
sugar 20.6
close 20.5
candy 20.1
object 19
gourmet 18.7
money 17.9
leather 17.5
dark 16.7
meal 16.2
piece 16.1
eat 15.1
closeup 14.8
bakery 14.3
purse 14
black 14
fat 14
bread 13.9
cash 13.7
slice 13.6
currency 13.5
milk 13.3
healthy 13.2
pastry 13
diet 12.9
eating 12.6
cake 12.2
baked 12.2
finance 11.8
paper 11.8
container 11.6
objects 10.4
ingredient 10.3
tea 10
treat 9.5
wheat 9.5
bar 9.2
nutrition 9.2
fresh 9.1
toast 9.1
studio 9.1
business 9.1
wealth 9
color 8.9
swiss 8.8
loaf 8.7
personal 8.6
plate 8.5
product 8.4
horizontal 8.4
case 8.4
traditional 8.3
detail 8
financial 8
breakfast 8
paying 7.8
dairy 7.7
gift 7.7
old 7.7
whole 7.6
texture 7.6
savings 7.5
dollar 7.4
taste 7.4
present 7.3
open 7.2
wooden 7
cocoa 7

Google
created on 2022-05-28

Sleeve 87
Rectangle 80
Triangle 77.7
Wood 77
Food 74.4
Fashion accessory 67.2
Tie 64.3
Leather 62.7
Rock 60.9
Slope 60.2
Font 60.1
Carmine 59.9
Electric blue 59.8
Eyewear 58.9
Art 54.2

Microsoft
created on 2022-05-28

drawing 92.3
painting 76.1
sweet 74.8
art 71.3