Human Generated Data

Title

FRAGMENT OF VASE

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of James Loeb, 1910.17

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Accessories 98%
Bronze 90.6%
Person 67.7%
Jewelry 60.8%
Gemstone 56.4%
Arrow 56.2%
Weapon 56.2%
Arrowhead 55.8%
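
The Amazon scores above are label-detection confidences expressed as percentages. For context, a minimal sketch of how comparable labels could be requested with AWS Rekognition's DetectLabels operation via boto3; the specific service call, file name, and thresholds are assumptions and are not part of the original record.

# Sketch only: assumes AWS credentials are configured and a local image file exists.
import boto3

def detect_labels(image_path, min_confidence=55.0, max_labels=10):
    """Return (label, confidence) pairs comparable to the Amazon tags listed above."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=max_labels,
        MinConfidence=min_confidence,
    )
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("fragment_of_vase.jpg"):  # hypothetical file name
        print(f"{name} {confidence:.1f}%")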

Clarifai
created on 2018-05-09

art 98.8%
no person 97.9%
one 97.7%
invertebrate 95.8%
jewelry 95%
metalwork 94.9%
still life 94.4%
gold 93.7%
texture 92.3%
vintage 90%
container 89.9%
wear 89.8%
decoration 88.7%
sculpture 88.3%
luxury 88.3%
people 88.3%
desktop 87.6%
medicine 86.8%
cutout 85.5%
grow 85.2%

Imagga
created on 2023-10-07

grenade 91.8%
bomb 73.3%
explosive device 55.6%
weaponry 36.4%
device 25.6%
container 21.9%
gold 15.6%
close 15.4%
metal 13.7%
jewelry 13.6%
ring 13.3%
old 13.2%
object 12.5%
decoration 12.4%
box 11.9%
food 11.6%
antique 11.5%
golden 11.2%
money 11.1%
jewel 10.6%
black 10.2%
earthenware 9.9%
gem 9.7%
chest 9.5%
gemstone 8.8%
gift 8.6%
shell 8.2%
brown 8.1%
closeup 8.1%
celebration 8%
day 7.8%
precious 7.8%
ornament 7.8%
stone 7.7%
treasure 7.7%
luxury 7.7%
chocolate 7.7%
finance 7.6%
vase 7.5%
ceramic ware 7.5%
vintage 7.4%
tradition 7.4%
shiny 7.1%

Google
created on 2018-05-09

artifact 57.5%
jewellery 52.1%
metal 52%

Microsoft
created on 2018-05-09

Color Analysis

Feature analysis

Amazon

Person 67.7%

Categories

Imagga

nature landscape 48.3%
food drinks 25.4%
paintings art 14.8%
pets animals 8.8%

Captions

Microsoft
created on 2018-05-09

a piece of wood 54.1%
a piece of paper 39.3%
a gold and black 39.2%
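
The Microsoft captions are ranked candidate descriptions with confidence scores. As a rough illustration only, a minimal sketch of how such caption candidates might be requested with the Azure Computer Vision SDK (azure-cognitiveservices-vision-computervision); the endpoint, key, and file name below are placeholders and do not appear in the original record.

# Sketch only: endpoint, key, and file name are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

def describe_image(image_path, endpoint, key, max_candidates=3):
    """Return (caption text, confidence %) pairs comparable to the captions above."""
    client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))
    with open(image_path, "rb") as image_stream:
        analysis = client.describe_image_in_stream(image_stream, max_candidates=max_candidates)
    # The SDK reports confidence on a 0-1 scale; convert to percent for display.
    return [(caption.text, caption.confidence * 100) for caption in analysis.captions]

if __name__ == "__main__":
    results = describe_image(
        "fragment_of_vase.jpg",                               # placeholder file name
        "https://<resource>.cognitiveservices.azure.com/",    # placeholder endpoint
        "<key>",                                              # placeholder key
    )
    for text, confidence in results:
        print(f"{text} {confidence:.1f}%")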