Human Generated Data

Title

Untitled (Hollywood)

Date

1979

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5186

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Person 99.9
Human 99.9
Person 99
Advertisement 96.2
Billboard 81
Path 73.8
Poster 73.4
Outdoors 73.3
Sunglasses 73.1
Accessories 73.1
Accessory 73.1
Text 72.9
Apparel 72.5
Clothing 72.5
Person 64.8
Shorts 59.2
Skin 55.9
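
The tag list above is the kind of flat "label + confidence" output that AWS Rekognition's DetectLabels operation returns. A minimal sketch of how such a list can be derived from a response, assuming a sample response dict in Rekognition's documented shape stands in for a real API call (which would need credentials):

```python
# Hypothetical sketch: flattening a Rekognition DetectLabels-style
# response into 'Name Confidence' lines like the listing above.
# The sample response is illustrative, not the record's actual data.

def format_labels(response):
    """Return 'Name Confidence' strings, confidence rounded to one decimal."""
    return [
        f"{label['Name']} {round(label['Confidence'], 1)}"
        for label in response.get("Labels", [])
    ]

# Sample shaped like Rekognition's DetectLabels output.
sample = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.91},
        {"Name": "Billboard", "Confidence": 81.04},
    ]
}

print(format_labels(sample))  # ['Person 99.9', 'Billboard 81.0']
```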

Clarifai
created on 2019-11-15

people 99.6
street 98.8
monochrome 96.2
one 96.2
adult 94.4
woman 92.7
group 92.6
man 92.4
group together 89.2
two 86.6
wear 86.5
administration 85.6
child 85.6
portrait 82.9
actress 82.2
pavement 81.9
three 81.4
road 81.3
vehicle 81
city 80.5

Imagga
created on 2019-11-15

sexy 20.1
person 19.9
model 19.4
hair 19
pretty 18.9
attractive 17.5
body 16.8
people 16.7
black 16.5
portrait 15.5
fashion 14.3
human 14.2
adult 14.2
posing 14.2
lifestyle 13.7
skin 13.5
outdoor 13
sensuality 12.7
dress 12.6
pole 12.6
man 12.1
city 11.6
wet 11.6
bikini 11.5
lady 11.4
water 11.3
happy 11.3
women 11.1
sketch 11
relaxation 10.9
one 10.4
shower 10.2
rod 10.2
dark 10
face 9.9
world 9.7
wall 9.7
summer 9.6
happiness 9.4
drawing 9.3
cute 9.3
head 9.2
power 9.2
snow 9.2
street 9.2
gorgeous 9.1
old 9.1
swimsuit 9
healthy 8.8
urban 8.7
slim 8.3
girls 8.2
fitness 8.1
clothing 8
structure 7.8
sport 7.8
telephone 7.7
garment 7.7
health 7.6
equipment 7.6
bath 7.6
heat 7.4
male 7.4
alone 7.3
pay-phone 7.3
sensual 7.3
looking 7.2
light 7.1
smile 7.1
look 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

outdoor 99.4
text 99.1
black and white 95.7
billboard 94.3
street 89.6
person 85.8
clothing 80.4
posing 79.3
sign 70.5
woman 63.4

Face analysis

AWS Rekognition

Age 23-35
Gender Male, 54.9%
Confused 45.2%
Surprised 48%
Calm 46.4%
Fear 46.6%
Disgusted 45%
Happy 45.1%
Sad 45.1%
Angry 48.6%

AWS Rekognition

Age 20-32
Gender Male, 54.8%
Angry 45.3%
Fear 45.1%
Confused 45.1%
Calm 54.3%
Sad 45.1%
Disgusted 45%
Surprised 45.1%
Happy 45.1%

AWS Rekognition

Age 34-50
Gender Male, 50.1%
Fear 45%
Angry 45%
Happy 45%
Disgusted 45%
Confused 45%
Calm 45%
Sad 54.9%
Surprised 45%
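
Each face block above reports an age range, a gender estimate, and per-emotion confidences, which matches the FaceDetail structure returned by Rekognition's DetectFaces operation (emotion scores require the full attribute set). A hedged sketch of picking the dominant emotion from one such face, using an illustrative FaceDetail dict modeled on the third face above, where Sad scored highest:

```python
# Hypothetical sketch: selecting the highest-confidence emotion from a
# Rekognition DetectFaces FaceDetail. The sample dict is illustrative.

def top_emotion(face_detail):
    """Return (emotion type, confidence) for the highest-scoring emotion."""
    best = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

# Values mirror the third detected face above.
face = {
    "AgeRange": {"Low": 34, "High": 50},
    "Emotions": [
        {"Type": "SAD", "Confidence": 54.9},
        {"Type": "CALM", "Confidence": 45.0},
    ],
}

print(top_emotion(face))  # ('SAD', 54.9)
```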

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.9%
Sunglasses 73.1%

Text analysis

Amazon

the
where the
come
where
RED
AEAVEN
ANTED
avor
AEAVEN RED WAIT
SCOTT
Marhom avor is.
WAIT
is.
Marhom
come 10
PLAYNGFA
nal-Westwnod
OW PLAYNGFA nal-Westwnod
10
OW
PCIFIC
PCIFIC oUDoc
oUDoc
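
The repeated fragments in the list above ("where the" alongside "where" and "the") are characteristic of Rekognition's DetectText operation, which returns both LINE and WORD detections in one response. A sketch of separating the two granularities, assuming a sample response dict in that documented shape:

```python
# Hypothetical sketch: grouping Rekognition DetectText results by
# DetectionType, which explains the mixed-granularity list above.
# The sample response is illustrative.

def split_detections(response):
    """Group DetectedText strings into LINE and WORD detections."""
    groups = {"LINE": [], "WORD": []}
    for det in response.get("TextDetections", []):
        groups[det["Type"]].append(det["DetectedText"])
    return groups

sample = {
    "TextDetections": [
        {"DetectedText": "where the", "Type": "LINE"},
        {"DetectedText": "where", "Type": "WORD"},
        {"DetectedText": "the", "Type": "WORD"},
    ]
}

print(split_detections(sample))
# {'LINE': ['where the'], 'WORD': ['where', 'the']}
```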

Google

HEAVEN
WAI
RED
OW
FA
PACIFIC
nal-West
and
Come to where the ANTED HEAVEN WAI RED SCOTT OW PLAYNG FA iHollyw PACIFIC OUTD0 nal-West wood and
Come
to
where
the
ANTED
SCOTT
PLAYNG
iHollyw
OUTD0
wood