Human Generated Data

Title

Untitled (woman on couch with dog)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17282

Machine Generated Data

Tags (label followed by confidence score out of 100)

Amazon
created on 2022-02-26

Clothing 84
Apparel 84
Furniture 82.9
Human 78.6
Lamp 74.6
Table Lamp 71
Person 69
Indoors 66.7
Female 66.5
Couch 65.6
Animal 65.3
Canine 60.6
Mammal 60.6
Living Room 59.9
Room 59.9
Pet 59.3
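
The label list above is credited to Amazon, which corresponds to Amazon Rekognition's label detection. As a hedged illustration only, the minimal sketch below shows how comparable name and confidence pairs can be retrieved with boto3; the file name, label cap, and confidence threshold are assumptions, not details taken from this record.

    import boto3

    # Hypothetical local copy of the photograph; Rekognition also accepts S3 objects.
    with open("untitled_woman_on_couch_with_dog.jpg", "rb") as f:
        image_bytes = f.read()

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,       # assumed cap; the record does not state one
        MinConfidence=50,   # assumed threshold
    )

    # Print each label with its confidence score, matching the format above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")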

Clarifai
created on 2023-10-29

people 99.6
cat 98.9
dog 98.3
one 97.7
portrait 97.6
pet 97.5
mammal 97
canine 96.9
monochrome 96.4
room 95.9
adult 94.7
two 94.6
home 94.4
kitten 92.9
animal 92.8
furniture 91.9
woman 88.9
music 86.6
group 85.1
street 84.9
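
The Clarifai concepts above resemble output from Clarifai's general image-recognition model. The sketch below is a rough illustration against Clarifai's v2 REST API; the endpoint path, model id, and authorization scheme are assumptions and may differ from the account setup actually used, so treat it as a starting point rather than a recipe.

    import base64
    import requests

    API_KEY = "YOUR_CLARIFAI_KEY"            # placeholder credential
    MODEL_ID = "general-image-recognition"   # assumed public general model id

    # Hypothetical local copy of the photograph, sent as base64.
    with open("untitled_woman_on_couch_with_dog.jpg", "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
        timeout=30,
    )
    resp.raise_for_status()

    # Concepts carry a 0-1 value; scale to match the percentages shown above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")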

Imagga
created on 2022-02-26

man 30.9
male 22
person 21.6
people 19
adult 18.9
black 16.2
lifestyle 15.2
microphone 14.1
couple 13.9
portrait 13.6
device 13.5
musical instrument 12.8
face 12.8
hair 12.7
sexy 12
holding 11.6
human 11.2
two 11
model 10.9
music 10.9
dress 10.8
lady 10.5
stringed instrument 10.4
sitting 10.3
love 10.3
chair 10
hand 9.9
hands 9.6
men 9.4
romantic 8.9
home 8.8
smiling 8.7
bride 8.6
drinking 8.6
husband 8.6
room 8.6
wife 8.5
youth 8.5
stage 8.5
wind instrument 8.4
relationship 8.4
sound 8.4
head 8.4
pretty 8.4
relaxation 8.4
dark 8.4
leisure 8.3
fashion 8.3
entertainment 8.3
handsome 8
guitar 8
women 7.9
child 7.9
look 7.9
play 7.8
musician 7.7
marriage 7.6
player 7.5
grandfather 7.5
close 7.4
style 7.4
glasses 7.4
guy 7.4
emotion 7.4
wedding 7.4
alone 7.3
indoor 7.3
sensual 7.3
blond 7.2
romance 7.1
posing 7.1
night 7.1
singer 7
indoors 7
performer 7
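
The Imagga tags above follow the tag-and-confidence format returned by Imagga's tagging endpoint. A minimal sketch using HTTP basic auth against the public /v2/tags endpoint follows; the image URL and credentials are placeholders, and the response shape is based on Imagga's published API rather than anything stated in this record.

    import requests

    API_KEY = "YOUR_IMAGGA_KEY"        # placeholder credential
    API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder credential

    # Hypothetical public URL of the photograph; local files would first need
    # Imagga's upload endpoint to obtain an upload id.
    image_url = "https://example.org/untitled_woman_on_couch_with_dog.jpg"

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(API_KEY, API_SECRET),
        timeout=30,
    )
    resp.raise_for_status()

    # Each result carries an English tag and a 0-100 confidence, as listed above.
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")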

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 97.2
indoor 93.8
cat 83.4
black and white 79.9
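
The Microsoft tags above match the kind of output Azure's Computer Vision service returns from its tagging operation. The sketch below uses the azure-cognitiveservices-vision-computervision SDK with placeholder endpoint and key values; it illustrates the API shape, not how these particular tags were produced.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com/"  # placeholder
    KEY = "YOUR_AZURE_KEY"                                           # placeholder

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # Hypothetical local copy of the photograph, streamed to the tagging operation.
    with open("untitled_woman_on_couch_with_dog.jpg", "rb") as f:
        result = client.tag_image_in_stream(f)

    # Confidence is returned on a 0-1 scale; scale to match the values shown above.
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")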

Color Analysis

Feature analysis

Amazon

Person 69%

Categories

Captions

Microsoft
created on 2022-02-26

a person holding a cat 50.3%
a person holding a cat 49.8%
a person holding a dog 49.7%
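
The candidate captions above, each with its own confidence, are the kind of output Azure Computer Vision returns when asked to describe an image with several candidates. Under the same assumed Azure setup as the tagging sketch earlier, a describe call might look like this:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com/"  # placeholder
    KEY = "YOUR_AZURE_KEY"                                           # placeholder

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # Request up to three candidate captions for the hypothetical local image file.
    with open("untitled_woman_on_couch_with_dog.jpg", "rb") as f:
        description = client.describe_image_in_stream(f, max_candidates=3)

    # Each caption carries text plus a 0-1 confidence; scale to match the list above.
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}")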

Text analysis

Amazon

KODMK
KODMK OVELLA
OVELLA
P.O
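
The detected strings above appear to be OCR of printed markings on the photograph, possibly a Kodak paper stamp. As a hedged sketch only, Amazon Rekognition's text detection returns comparable strings with confidences; the file name and client setup below are assumptions rather than details from this record.

    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    # Hypothetical local scan of the print showing the printed markings.
    with open("untitled_woman_on_couch_with_dog.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # LINE entries give whole detected strings; WORD entries are also returned.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(f"{detection['DetectedText']} {detection['Confidence']:.1f}")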