Human Generated Data

Title

Torso of a Female Statuette

Date

c. 2500 BCE-2300 BCE

People

-

Classification

Sculpture

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, The Lois Orswell Collection, 1998.246

Machine Generated Data

Tags (label, confidence in %)

Amazon
created on 2023-08-30

Body Part 100
Torso 100
Person 91.2
Adult 91.2
Male 91.2
Man 91.2
Electronics 76.9
Hardware 76.9
Figurine 73.6
Hand 65.8
Back 57.5
Pottery 56.5
Weapon 55.5

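The Amazon labels above are name/confidence pairs of the kind returned by Amazon Rekognition's DetectLabels operation. The record does not document the request that produced them, so the boto3 sketch below is only an illustration; the region, file name, and MinConfidence threshold are assumptions, not values taken from this record.

```python
import boto3

# Illustrative sketch only: produce label/confidence pairs like those above
# with Amazon Rekognition's DetectLabels. The region, file name, and
# threshold are placeholders, not details recorded for this object.
client = boto3.client("rekognition", region_name="us-east-1")

with open("torso_of_a_female_statuette.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the lowest score shown above is 55.5
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```
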
Clarifai
created on 2023-11-01

one 99.5
no person 99.4
art 99
sculpture 98.9
people 98.7
monochrome 97.6
man 95.4
ancient 94.7
marble 93.4
portrait 92.1
stone 91.9
adult 91.8
old 91.5
still life 90
statue 89.6
woman 89.6
architecture 89.2
dirty 88.9
antique 87.8
eerie 87.7

Imagga
created on 2018-12-27

plug 17
fastener 16.5
hand 16.1
finger 16
ancient 15.6
sculpture 14.9
man 13.4
close 13.1
hands 13
statue 12.9
stone 12.6
old 12.5
baby 12.1
skin 12.1
restraint 11.7
device 10.8
art 10.6
latch 9.8
health 9.7
food 9.6
black 9.6
body 9.6
brown 9.6
love 9.5
natural 9.4
model 9.3
antique 8.8
healthy 8.8
catch 8.7
water 8.7
lifestyle 8.7
fingers 8.6
male 8.5
adult 8.4
head 8.4
rubber eraser 8.2
hole 8.2
dirty 8.1
person 8.1
dark 7.5
holding 7.4
wet 7.1
face 7.1
sand 7.1
architecture 7

Google
created on 2018-12-27

Microsoft
created on 2018-12-27

Feature analysis

Amazon

Person 91.2%
Adult 91.2%
Male 91.2%
Man 91.2%

Categories

Imagga

paintings art 98.7%

Captions

Microsoft
created on 2018-12-27

a close up of a white wall 34.4%
a close up of a white background 34.3%
a close up of food 25.1%
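
The ranked captions above resemble output from Microsoft's image-description service (Azure Computer Vision). The record does not state which endpoint or API version produced them, so the sketch below is a hedged illustration only; the endpoint, key, and image URL are placeholders, not details from this record.

```python
import requests

# Hedged sketch: request ranked captions like those above from Azure
# Computer Vision's Describe Image endpoint. Endpoint, key, image URL,
# and API version are placeholder assumptions.
endpoint = "https://<your-resource>.cognitiveservices.azure.com"
subscription_key = "<subscription-key>"
image_url = "https://example.org/torso_of_a_female_statuette.jpg"

response = requests.post(
    f"{endpoint}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": subscription_key},
    json={"url": image_url},
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```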