Human Generated Data

Title

Untitled (Beverly Hills)

Date

1979

People

Artist: Bill Dane, American (born 1938)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5175

Copyright

© Bill Dane

Machine Generated Data

Tags (values are confidence scores, in percent)

Amazon
created on 2019-11-15

Person 99.7
Human 99.7
Person 99.4
Person 99
Person 98.8
Clothing 87.9
Apparel 87.9
Fire Hydrant 86.3
Hydrant 86.3
Hair 63.5
Room 58.9
Indoors 58.9
Face 57.4
Sleeve 55.6

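The Amazon tags above have the shape of an AWS Rekognition DetectLabels response: a label name paired with a confidence score in percent. A minimal sketch of how such tags could be produced with the boto3 client follows; the bucket and object names are hypothetical placeholders, not the museum's actual storage.

    # Sketch: label detection with AWS Rekognition via boto3.
    # "example-bucket" and the object key are hypothetical.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket",
                            "Name": "untitled-beverly-hills.jpg"}},
        MaxLabels=20,
        MinConfidence=50,
    )

    # Each label carries a name and a percent confidence, which is
    # why the list above pairs terms like "Person" with 99.7.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")
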
Clarifai
created on 2019-11-15

people 99.8
adult 98.3
woman 98.1
two 97.5
group 97.3
man 95
child 92.4
wear 92.2
three 92
street 91.2
group together 90.7
monochrome 90.5
one 89.9
actress 89.6
portrait 89.6
four 86.6
music 81.9
movie 80.9
several 80.6
administration 79.4

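The Clarifai tags match the output of Clarifai's general prediction model, which scores concepts from 0 to 1. A sketch using the legacy clarifai 2.x Python client, which was current around the 2019 date stamped above (the present SDK differs); the API key and image URL are placeholders.

    # Sketch: concept prediction with the legacy clarifai 2.x client.
    # API key and image URL are hypothetical placeholders.
    from clarifai.rest import ClarifaiApp

    app = ClarifaiApp(api_key="YOUR_API_KEY")
    model = app.public_models.general_model

    response = model.predict_by_url(
        url="https://example.org/untitled-beverly-hills.jpg")

    # Concepts are scored 0-1; scale to percent to match the
    # "people 99.8" style values above.
    for concept in response["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")
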
Imagga
created on 2019-11-15

adult 31.8
sexy 28.1
people 27.3
attractive 27.3
portrait 27.2
person 27
domestic 24.5
fashion 24.1
groom 23.8
dress 23.5
hair 23
pretty 21.7
man 21.5
happy 20.1
model 19.5
face 19.2
women 19
hairdresser 18.9
clothing 18.6
brunette 18.3
couple 18.3
male 18.1
elegance 17.6
black 17.3
sensual 17.3
style 17.1
two 16.1
love 15.8
bride 15.4
interior 15
salon 14.5
happiness 14.1
gorgeous 13.6
mother 13.4
cute 12.9
wedding 12.9
home 12.8
human 12.8
hand blower 12.5
passion 12.2
lady 12.2
smile 12.1
room 11.9
lingerie 11.6
romantic 11.6
lifestyle 11.6
posing 11.6
luxury 11.2
casual 11
skin 11
sensuality 10.9
smiling 10.9
modern 10.5
one 10.5
elegant 10.3
youth 10.2
dryer 10
blower 10
make 10
device 9.9
family 9.8
bedroom 9.7
together 9.6
looking 9.6
body 9.6
makeup 9.2
garment 9.1
studio 9.1
girls 9.1
indoors 8.8
look 8.8
desire 8.7
parent 8.5
consumer goods 8.4
house 8.4
appliance 8.1
urban 7.9
professional 7.9
feather boa 7.9
sepia 7.8
eyes 7.8
sitting 7.7
expression 7.7
outdoor 7.6
husband 7.6
hand 7.6
wife 7.6
window 7.6
head 7.6
legs 7.6
city 7.5
vintage 7.4
lips 7.4
alone 7.3
business 7.3
pose 7.3
romance 7.1
handsome 7.1
lovely 7.1

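Imagga exposes auto-tagging as a REST endpoint authenticated with an API key and secret. A sketch against the v2 tags endpoint; the credentials and image URL are hypothetical placeholders.

    # Sketch: auto-tagging with the Imagga v2 REST API.
    # Credentials and image URL are hypothetical placeholders.
    import requests

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/untitled-beverly-hills.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    )
    response.raise_for_status()

    # Confidence is already in percent; tag names are keyed by language.
    for tag in response.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
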
Google
created on 2019-11-15

Microsoft
created on 2019-11-15

person 99.1
human face 98.6
clothing 98.1
text 97.8
smile 94.5
woman 80.9
girl 78.1
black and white 66.4
dress 63.3
posing 51.4

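The Microsoft tags resemble the output of the Azure Computer Vision tag endpoint, which scores tags from 0 to 1. A sketch against the v3.2 REST API; the endpoint, subscription key, and image URL are hypothetical placeholders.

    # Sketch: image tagging with the Azure Computer Vision REST API.
    # Endpoint, subscription key, and image URL are hypothetical.
    import requests

    endpoint = "https://example.cognitiveservices.azure.com"

    response = requests.post(
        f"{endpoint}/vision/v3.2/tag",
        headers={"Ocp-Apim-Subscription-Key": "YOUR_SUBSCRIPTION_KEY"},
        json={"url": "https://example.org/untitled-beverly-hills.jpg"},
    )
    response.raise_for_status()

    # Confidence comes back in [0, 1]; scale to percent to match
    # the "person 99.1" style values above.
    for tag in response.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
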
Color Analysis

Face analysis

AWS Rekognition

Age 26-40
Gender Female, 99.4%
Calm 31.6%
Sad 3.4%
Fear 1.1%
Disgusted 9.9%
Confused 10%
Surprised 1.6%
Happy 2.2%
Angry 40.2%

AWS Rekognition

Age 40-58
Gender Male, 50.4%
Disgusted 49.5%
Happy 49.5%
Angry 49.5%
Fear 49.5%
Calm 49.5%
Confused 49.5%
Surprised 49.5%
Sad 50.4%

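The two blocks above have the shape of AWS Rekognition DetectFaces output: one entry per detected face, with an estimated age range, a gender guess with its confidence, and a confidence score for each emotion. A sketch with boto3; the image location is a hypothetical placeholder.

    # Sketch: face analysis with AWS Rekognition DetectFaces.
    # Bucket and object key are hypothetical placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket",
                            "Name": "untitled-beverly-hills.jpg"}},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Each emotion is scored independently with its own confidence.
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
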
Microsoft Cognitive Services

Age 33
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

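Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the block above reads "Very unlikely" instead of a score. A sketch with the google-cloud-vision client; the local image path is a hypothetical placeholder.

    # Sketch: face detection with the google-cloud-vision client.
    # The image path is a hypothetical placeholder.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("untitled-beverly-hills.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Emotion and headwear attributes are Likelihood enum values,
    # matching the "Very unlikely" buckets above.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)
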
Feature analysis

Amazon

Person 99.7%
Fire Hydrant 86.3%
