Human Generated Data

Title

Untitled (yak)

Date

c. 1950

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19761

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2022-03-05

Animal 96.6
Mammal 96.6
Cow 96.6
Cattle 96.6
Bull 79.8
Antelope 79.3
Wildlife 79.3
Horse 60.2
Bongo 59.3
Impala 55.3
Gazelle 55.2

Imagga
created on 2022-03-05

cattle 68.9
ox 63.3
horse 55.4
farm 45.5
bovine 43.4
ranch 33.3
pasture 32.6
rural 30.9
equine 28.4
horses 28.3
field 27.6
animals 26.9
cow 26.7
grass 26.1
meadow 22.4
ruminant 21.6
stallion 21.5
livestock 21.4
brown 21.4
wild 20.1
bull 19.8
mane 19.6
wildlife 16.1
mare 15.7
equestrian 15.7
agriculture 14.9
horn 13.7
mammals 13.7
countryside 12.8
outdoors 12.7
domestic 12.7
farming 12.3
water buffalo 12.1
pony 11.8
stable 11.8
grazing 11.8
head 11.8
outdoor 11.5
outside 11.1
pet 11
riding 10.7
standing 10.4
trees 9.8
portrait 9.7
old world buffalo 9.7
breed 9.7
looking 9.6
two 9.3
antelope 9.1
gallop 8.9
horns 8.9
forest 8.7
run 8.7
summer 8.4
park 8.2
rope 8.2
eye 8
country 7.9
cute 7.9
beef 7.9
face 7.8
eyes 7.8
farmland 7.8
menagerie 7.7
milk 7.6
one 7.5
water 7.3
deer 7.3
fall 7.2
sky 7

Google
created on 2022-03-05

(no tags returned)

Microsoft
created on 2022-03-05

outdoor 97.5
text 96.8
animal 94.8
horn 84.8
deer 74.8
black and white 71.6
cattle 70.9
antelope 59.5
antler 52.8

Feature analysis

Amazon

Cow 96.6%
Horse 60.2%
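
As an illustrative aside (not part of the museum record): each service's output above is a list of (label, confidence) pairs, and a feature-analysis view like Amazon's can be thought of as filtering that list by a confidence threshold. The sketch below uses the Amazon tag values listed above; the `top_tags` helper and the 90% threshold are assumptions for illustration, not the museum's actual selection rule.

```python
# Hypothetical sketch: filter machine-generated tags by confidence.
# Tag values are copied from the Amazon list above; the threshold is
# an assumption, not the actual rule used to build "Feature analysis".
amazon_tags = [
    ("Animal", 96.6), ("Mammal", 96.6), ("Cow", 96.6), ("Cattle", 96.6),
    ("Bull", 79.8), ("Antelope", 79.3), ("Wildlife", 79.3),
    ("Horse", 60.2), ("Bongo", 59.3), ("Impala", 55.3), ("Gazelle", 55.2),
]

def top_tags(tags, threshold=50.0):
    """Return tags at or above the confidence threshold, highest first."""
    return sorted((t for t in tags if t[1] >= threshold),
                  key=lambda t: -t[1])

for label, score in top_tags(amazon_tags, threshold=90.0):
    print(f"{label} {score}%")
```

With a 90% threshold only the four top-scoring labels survive; Python's sort is stable, so ties keep their original order.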

Captions

Microsoft

a cow standing next to a body of water 69.8%
a cow is standing in front of a body of water 68.1%
a cow standing in a body of water 68%