Human Generated Data

Title

Draped Woman

Date

n.d.

People

-

Classification

Sculpture

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of Joseph C. Hoppin, 1925.30.134

Machine Generated Data

Tags (label, confidence 0–100)

Amazon
created on 2023-10-07

Bronze 100
Adult 99.4
Bride 99.4
Female 99.4
Person 99.4
Wedding 99.4
Woman 99.4
Figurine 92.8
Face 90.4
Head 90.4
Art 89.9
Sculpture 68.4
Statue 66.8
Archaeology 55.8
Armor 55.3
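The number beside each tag is the service's confidence score on a 0–100 scale. A minimal sketch of thresholding such label/score pairs, assuming the data shape shown above (the 90.0 cutoff is illustrative, not part of the record):

```python
# Machine-generated tags paired with confidence scores (0-100),
# taken from the Amazon list above.
tags = [
    ("Bronze", 100.0), ("Adult", 99.4), ("Bride", 99.4), ("Female", 99.4),
    ("Person", 99.4), ("Wedding", 99.4), ("Woman", 99.4), ("Figurine", 92.8),
    ("Face", 90.4), ("Head", 90.4), ("Art", 89.9), ("Sculpture", 68.4),
    ("Statue", 66.8), ("Archaeology", 55.8), ("Armor", 55.3),
]

def confident_tags(pairs, threshold=90.0):
    """Keep only labels whose confidence meets the threshold."""
    return [label for label, score in pairs if score >= threshold]

print(confident_tags(tags))
```

Lowering the threshold admits weaker guesses such as "Archaeology" and "Armor", which is why downstream displays typically filter aggressively.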

Clarifai
created on 2018-05-09

no person 97.8
art 97.6
one 95.5
sculpture 93.4
reptile 93.4
metalwork 90.8
illustration 90.2
wear 83
symbol 82.9
invertebrate 82
old 80.9
museum 80.6
science 80.5
side view 79.9
ancient 77.2
texture 77.1
retro 75.5
mammal 74
woman 73.8
amphibian 72.5

Imagga
created on 2023-10-07

knocker 100
device 96.2
old 28.6
antique 27.7
decoration 26
metal 24.9
ancient 21.6
gold 21.4
bolo tie 21.1
door 19
art 18.6
wood 17.5
vintage 17.4
ornament 17.2
bronze 16.8
necktie 16.7
sculpture 14.6
handle 14.3
brass 13.7
design 13.5
religion 13.4
decorative 13.4
wooden 13.2
amulet 12.5
retro 12.3
culture 12
head 11.8
architecture 11.7
traditional 11.6
statue 11.5
temple 11.4
golden 11.2
stone 11
face 10.7
entrance 10.6
style 10.4
religious 10.3
charm 10.1
aged 9.9
history 9.8
iron 9.3
holiday 9.3
detail 8.8
object 8.8
jewelry 8.6
garment 8.4
house 8.4
security 8.3
shield 8.1
steel 8
enter 7.8
travel 7.7
luxury 7.7
lock 7.7
god 7.7
ornamental 7.6
tooth 7.5
china 7.5
symbol 7.4
ornate 7.3
home 7.2

Google
created on 2018-05-09

Color Analysis

Feature Analysis

Amazon

Adult 99.4%
Bride 99.4%
Female 99.4%
Person 99.4%
Woman 99.4%

Captions

Microsoft
created on 2018-05-09

a piece of wood 34.1%