Human Generated Data

Title

Standing Woman

Date

400-300 BCE

People

-

Classification

Sculpture

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of Joseph C. Hoppin, 1925.30.104

Machine Generated Data

Tags (confidence scores, 0-100)

Amazon
created on 2023-10-07

Archaeology 100
Art 99.9
Painting 99.9
Figurine 97.8
Person 95.7
Person 91.6
Adult 91.6
Bride 91.6
Female 91.6
Wedding 91.6
Woman 91.6
Head 80.7
Face 73.6
Architecture 55.6
Building 55.6
Monastery 55.6
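
A minimal sketch of how label data like this can be retrieved from Amazon Rekognition's DetectLabels API with boto3; the bucket, object key, and region below are placeholders, and nothing here documents the museum's actual tagging pipeline.

    import boto3

    # Placeholder image location; the museum's actual storage is not documented here.
    rekognition = boto3.client("rekognition", region_name="us-east-1")
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "1925.30.104.jpg"}},
        MaxLabels=20,
        MinConfidence=50,
    )
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")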

Clarifai
created on 2018-05-09

one 99.4
no person 99.3
sculpture 98.1
people 96.1
art 95.7
adult 95.3
ancient 92.6
painting 91
biology 89.4
side view 89
man 87.9
two 87.7
wear 87.3
woman 87
print 86.6
illustration 84.4
antique 83.4
museum 81.4
statue 80.9
old 80.6
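
A sketch using the legacy Clarifai 2.x Python client, which was current around the 2018-05-09 tag date; the API key and image URL are placeholders, and this is an illustration rather than the pipeline that produced the tags above.

    from clarifai.rest import ClarifaiApp

    # Placeholder credentials and image URL.
    app = ClarifaiApp(api_key="YOUR_CLARIFAI_API_KEY")
    result = app.public_models.general_model.predict_by_url(
        "https://example.org/standing-woman.jpg"
    )
    for concept in result["outputs"][0]["data"]["concepts"]:
        # Concept values are 0-1; the page prints them as 0-100 scores.
        print(concept["name"], round(concept["value"] * 100, 1))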

Imagga
created on 2023-10-07

sketch 100
drawing 100
representation 100
water 20
close 14.3
transparent 13.4
texture 13.2
clear 13.1
clean 12.5
splash 12.2
closeup 12.1
natural 12
old 11.8
fresh 11.8
cool 11.5
liquid 11.3
bubble 11.3
detail 11.2
drop 10.9
wet 10.7
backdrop 9.1
religion 9
pattern 8.9
splashing 8.7
wash 8.7
cold 8.6
art 8.6
nobody 8.5
black 8.4
plant 8.2
object 8.1
light 8
decoration 7.9
tree 7.7
grunge 7.7
health 7.6
frozen 7.6
ripple 7.6
bath 7.6
ice 7.6
drops 7.5
drink 7.5
religious 7.5
flowing 7.5
color 7.2
material 7.1
sculpture 7.1
surface 7
textured 7
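
A sketch of a call to Imagga's v2 tagging endpoint, which uses HTTP Basic auth; the key, secret, and image URL are placeholders, shown only to illustrate where scores like those above come from.

    import requests

    # Placeholder credentials and image URL.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/standing-woman.jpg"},
        auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
    )
    for item in resp.json()["result"]["tags"]:
        print(item["tag"]["en"], item["confidence"])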

Google
created on 2018-05-09

Color Analysis

Feature analysis

Amazon

Person 95.7%
Adult 91.6%
Bride 91.6%
Female 91.6%
Woman 91.6%

Categories

Imagga

paintings art 99.3%

Captions

Microsoft
created on 2018-05-09

a pair of white shoes 39%
a pair of shoes 38.9%
a close up of a pair of white shoes 33.4%
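
A sketch of an Azure Computer Vision Describe call that returns captions with confidences like those above; the resource endpoint, key, image URL, and API version are placeholders (captions dated 2018 would have come from an earlier API version).

    import requests

    # Placeholder endpoint, key, and image URL.
    endpoint = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
    resp = requests.post(
        f"{endpoint}/vision/v3.2/describe",
        headers={"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"},
        json={"url": "https://example.org/standing-woman.jpg"},
    )
    for caption in resp.json()["description"]["captions"]:
        # Confidence is 0-1; the page prints it as a percentage.
        print(caption["text"], f"{caption['confidence'] * 100:.1f}%")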