Human Generated Data

Title

Untitled (Berkeley)

Date

1982

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5239

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Human 99.7
Person 99.7
Apparel 98.8
Clothing 98.8
Person 97.6
Pants 96.7
Shoe 70.1
Footwear 70.1
Denim 68.6
Curtain 63.3
Door 58.9
Jeans 58.1
Home Decor 57.8

Clarifai
created on 2019-11-15

people 99.2
adult 96.7
portrait 96
one 95.7
man 94.8
woman 94.5
indoors 88.5
fashion 87
wear 84.2
street 83.8
two 83.7
model 82.9
girl 82.8
music 81.8
business 77.1
sexy 73.5
door 73.2
room 73
actor 72.7
person 72.4

Imagga
created on 2019-11-15

person 33.5
adult 33
people 31.2
portrait 27.2
happy 25.7
model 25.7
clothing 25.1
fashion 24.9
attractive 23.8
standing 22.6
casual 22
lady 21.1
home 20.7
man 20.2
jeans 20.1
hair 19.8
pretty 19.6
cute 19.4
garment 19.3
male 18.7
posing 18.7
sexy 18.5
lifestyle 18.1
human 18
indoors 17.6
couple 17.4
smiling 16.6
miniskirt 16.2
face 15.6
happiness 14.9
skirt 14.8
blond 14.2
women 14.2
house 14.2
one 14.2
black 14.1
sensuality 13.6
covering 13.3
brunette 13.1
device 13
smile 12.8
looking 12.8
elegant 12
body 12
youth 11.9
sensual 11.8
door 11.7
look 11.4
cheerful 11.4
inside 11
indoor 11
elegance 10.9
interior 10.6
modern 10.5
urban 10.5
wall 10.4
child 10.3
pose 10
locker 9.9
style 9.6
together 9.6
clothes 9.4
two 9.3
head 9.2
joy 9.2
makeup 9.1
gorgeous 9.1
dress 9
sliding door 9
handsome 8.9
new 8.9
lovely 8.9
boy 8.7
love 8.7
fashionable 8.5
expression 8.5
adults 8.5
elevator 8.5
dark 8.4
city 8.3
holding 8.3
nice 8.2
fastener 8.2
consumer goods 8.2
vertical 7.9
flirt 7.8
hug 7.7
outfit 7.6
stand 7.6
leisure 7.5
shirt 7.4
brown 7.4
make 7.3
domestic 7.2
family 7.1

Google
created on 2019-11-15

(no tags returned)

Microsoft
created on 2019-11-15

person 99
clothing 92.7
text 92.2
jeans 64.7
trousers 64.5
black and white 61.4
man 60.6

Feature analysis

Amazon

Person 99.7%
Jeans 58.1%

Captions

Microsoft

a person standing in front of a window 90.2%
a man and a woman standing in front of a window 67.4%
a person standing next to a window 67.3%