Human Generated Data

Title

HEAD

Date

People

Classification

Sculpture

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Mrs. Langdon Warner, 1958.172

Machine Generated Data

Tags

Amazon
created on 2022-06-11

Figurine 96.7
Sculpture 93.4
Art 93.4
Head 92.5
Archaeology 78.8
Mammal 65.2
Wildlife 65.2
Bear 65.2
Animal 65.2
Wood 60.2
Statue 55.8

Imagga
created on 2022-06-11

hippopotamus 90.5
ungulate 56.6
mammal 56.1
wildlife 31.2
wild 22.6
piggy bank 22.3
water 21.3
safari 18.3
savings bank 18
container 15.5
sea 14.8
animals 13.9
statue 13.9
bear 13.7
lion 13.7
sculpture 13.4
ice bear 13.4
head 12.6
farm 12.5
ocean 12.4
park 12.3
sand 12.3
travel 12
ears 11.6
conservation 11.3
rhinoceros 11.3
reserve 10.7
stone 10.2
mammals 9.7
zoo 9.7
pig 9.6
rock 9.5
nose 9.5
grass 9.5
rocks 9.4
beach 9.3
outdoor 9.2
ecology 9.2
religion 9
outdoors 9
herbivore 8.8
cute 8.6
dangerous 8.6
elephant 8.4
fur 8.4
art 8.1
eye 8
body 8
close 8
polar 7.9
snout 7.9
mud 7.9
horn 7.8
endangered 7.8
life 7.8
ancient 7.8
eyes 7.7
pork 7.7
winter 7.7
marine 7.6
temple 7.6
menagerie 7.6
swine 7.5
east 7.5
natural 7.4
arctic 7.3
island 7.3
danger 7.3
dirty 7.2
wet 7.2
portrait 7.1
river 7.1
face 7.1

Google
created on 2022-06-11

Microsoft
created on 2022-06-11

statue 95.3
sculpture 91.2
water 90.3
text 82.3
mammal 71.5
black and white 59.7

Feature analysis

Amazon

Bear 65.2%

Captions

Microsoft

a cow looking at the camera 77.3%
a cow standing on top of a field 64%
a cow standing in a field 63.9%