Human Generated Data

Title

Untitled (schnauzer and woman)

Date

c. 1950

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19753

Machine Generated Data

Tags (model confidence, %)

Amazon
created on 2022-03-05

Poodle 91.7
Canine 91.7
Animal 91.7
Mammal 91.7
Pet 91.7
Chair 79.5
Furniture 79.5
Person 79
Human 79
Terrier 64.5
Cat 63.7
Dog 63.7
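
These labels are the kind returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags can be retrieved with boto3, assuming AWS credentials are already configured in the environment; the file name photo.jpg is a placeholder for the catalog image:

```python
import boto3

client = boto3.client("rekognition")

# "photo.jpg" stands in for the image file being tagged.
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # skip low-confidence labels
    )

# Rekognition reports confidence as a percentage, matching the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```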

Imagga
created on 2022-03-05

terrier 100
hunting dog 100
dog 84.3
canine 44.4
domestic animal 39.5
cute 12.2
person 11.8
old 11.1
toy 11
fur 10.2
wildlife 9.8
animals 9.2
face 9.2
adult 9
wild 8.7
sculpture 8.6
stone 8.4
people 8.4
fashion 8.3
sky 8.3
pet 8.3
outdoors 8.2
domestic 8.1
sexy 8
water 8
little 7.9
black 7.8
portrait 7.8
sitting 7.7
ears 7.7
head 7.5
silhouette 7.4
man 7.4
brown 7.3
male 7.1
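
Imagga exposes tagging through a REST endpoint rather than an SDK. A minimal sketch using the requests library, assuming placeholder API credentials and the same hypothetical photo.jpg:

```python
import requests

# Placeholder credentials; Imagga authenticates via HTTP Basic auth.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"

with open("photo.jpg", "rb") as f:  # placeholder file name
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Imagga returns confidence values already expressed as percentages.
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))
```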

Google
created on 2022-03-05

Black 89.6
Water dog 87.1
Carnivore 85.3
Dog breed 82.8
Dog 82.2
Rectangle 81.1
Companion dog 76.8
Art 75.8
Snout 74.3
Working animal 71.5
Sporting Group 69.2
Font 67.8
Fur 67.8
Terrestrial animal 67.1
Artifact 65.8
Metal 65.4
Canidae 64.3
Livestock 62.9
Visual arts 61.8
Table 61.4
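
These entries correspond to Google Cloud Vision label detection. A minimal sketch with the google-cloud-vision client library, assuming GOOGLE_APPLICATION_CREDENTIALS points at a service-account key; photo.jpg is again a placeholder:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Vision API scores are in [0, 1]; scale by 100 to match the list above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```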

Microsoft
created on 2022-03-05

text 96.7
statue 90.5
animal 90.2
standing 87.1
black and white 70.3
black 65.5
mammal 62.4
carnivore 58.7
dog 40.5
cat 18.2
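
These tags match the output of the Azure Computer Vision tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK, assuming a placeholder endpoint and subscription key:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder values for an Azure Computer Vision resource.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "your_subscription_key"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("photo.jpg", "rb") as f:  # placeholder file name
    analysis = client.tag_image_in_stream(f)

# Confidence is returned in [0, 1]; scale by 100 to match the list above.
for tag in analysis.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```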

Feature analysis

Amazon

Person 79%
Cat 63.7%
Dog 63.7%
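
A hedged guess at how these rows differ from the full tag list above: they look like the subset of Rekognition labels that carry per-instance bounding boxes (Person, Cat, Dog). A sketch of that filter, reusing the same DetectLabels call and placeholder file name:

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder file name
    response = client.detect_labels(Image={"Bytes": f.read()})

# Keep only labels that come with detected object instances (bounding boxes).
for label in response["Labels"]:
    if label["Instances"]:
        print(f"{label['Name']} {label['Confidence']:.1f}%")
```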

Captions

Microsoft

a bear that is standing in front of a window 58.9%
a cat that is standing in front of a window 49.1%
a dog standing in front of a window 49%
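
Ranked captions like these come from the Azure Computer Vision describe operation. A minimal, self-contained sketch, again with placeholder endpoint, key, and file name:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder values for an Azure Computer Vision resource.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "your_subscription_key"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("photo.jpg", "rb") as f:  # placeholder file name
    description = client.describe_image_in_stream(f, max_candidates=3)

# Candidate captions are ranked; confidence is in [0, 1].
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}")
```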