Human Generated Data

Title: Earring

Date: -

People: -

Classification: Jewelry

Credit Line: Harvard Art Museums/Arthur M. Sackler Museum, Bequest of Joseph C. Hoppin, 1925.30.144

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Arrowhead 82.5%
Art 60.9%
Painting 60.9%
Gold 58.4%
Money 55.3%

Clarifai
created on 2019-07-07

no person 96.6%
symbol 95.1%
one 94.2%
disjunct 91.7%
people 89.5%
shining 88.7%
luxury 86.2%
nature 85.4%
desktop 84.2%
wear 81.1%
jewelry 79.9%
abstract 78.6%
wealth 78.4%
money 78%
education 74.9%
gold 73.6%
religion 73.4%
art 72.6%
shape 70.4%
woman 70.3%

Imagga
created on 2019-07-07

shoe 42%
footwear 35.6%
clothing 34.3%
covering 33.4%
mask 31.7%
shoes 27.8%
pair 24.5%
foot 22.8%
lace 21.7%
black 20.6%
snowshoe 19.4%
leather 19%
device 18%
finger 17.4%
sport 16.6%
boot 15.8%
disguise 15.5%
rubber 15.4%
running shoe 14.7%
boots 13.6%
close 13.1%
old 12.5%
wear 12.5%
shoelace 11.8%
face 11.4%
eyes 11.2%
knee pad 10.9%
attire 10.8%
laces 10.8%
sole 10.8%
new 10.5%
walking 10.4%
dirty 9.9%
religion 9.8%
object 9.5%
equipment 9.3%
head 9.2%
dark 9.2%
fashion 9%
human 9%
protective garment 8.7%
feet 8.7%
fear 8.7%
water 8.7%
men 8.6%
culture 8.5%
two 8.5%
sports 8.3%
consumer goods 8.3%
sneakers 7.9%
jogging 7.9%
work 7.8%
industry 7.7%
athletic 7.6%
wood 7.5%
style 7.4%
brown 7.4%
fitness 7.2%
sandal 7.2%

Google
created on 2019-07-07

Artifact 68.4%
Carving 68.3%
Space 56.6%
Fashion accessory 54.6%
Pendant 52%
Metal 51.1%
Art 50.2%

Microsoft
created on 2019-07-07

dark 34.2%

Face analysis

Microsoft Cognitive Services

Age 54
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 60.9%

Categories

Imagga

paintings art 86.1%
food drinks 12.6%

Captions

Microsoft
created on 2019-07-07

a close up of a light 26.3%