Human Generated Data

Title

Untitled (Berkeley)

Date

1982

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, On deposit from the Carpenter Center for the Visual Arts, gift of the artist, P1982.414

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Human 99.5
Person 99.5
Person 98.7
Furniture 98
Jeans 94.5
Pants 94.5
Denim 94.5
Clothing 94.5
Apparel 94.5
Hug 90.6
Chair 77.3
Make Out 62
Bar Stool 61.9
Pub 57.8

Clarifai
created on 2019-11-15

people 99.7
two 96.9
wear 96.8
adult 95.5
woman 94.9
man 93.4
group 89.9
actress 87.9
dress 87.6
three 86.1
fashion 85.7
outfit 85.4
one 84.9
movie 84.1
portrait 83.8
monochrome 83.5
music 82.2
group together 80.6
street 80.4
actor 76.1

Imagga
created on 2019-11-15

sexy 50.6
model 37.4
adult 35
attractive 33.6
fashion 30.9
hair 27.8
pretty 27.3
person 27.1
body 26.4
people 25.1
black 24.6
sensual 23.7
dark 23.4
portrait 23.3
device 22.6
pay-phone 21.6
posing 20.4
style 20
face 19.9
passion 19.8
telephone 19.3
sensuality 19.1
women 19
brunette 17.4
one 17.2
erotic 16.5
human 16.5
electronic equipment 16.2
studio 16
standing 15.7
lady 14.6
skin 14.4
slim 13.8
elegance 13.4
equipment 13.2
happy 13.2
make 12.7
gorgeous 12.7
lingerie 12.5
seductive 12.4
cute 12.2
couple 12.2
makeup 11.9
two 11.9
dress 11.8
blond 11.3
looking 11.2
clothing 10.8
smile 10.7
male 10.6
interior 10.6
desire 10.6
clothes 10.3
youth 10.2
lips 10.2
man 10.1
stockings 9.8
lovely 9.8
vogue 9.7
sexual 9.6
restraint 9.6
elegant 9.4
call 9.4
lifestyle 9.4
girls 9.1
wet 8.9
passionate 8.8
urban 8.7
underwear 8.7
love 8.7
handcuff 8.5
inside 8.3
romantic 8
indoors 7.9
splashes 7.8
shower 7.8
bedroom 7.7
sitting 7.7
shoes 7.7
expression 7.7
jeans 7.6
legs 7.5
drops 7.5
rain 7.5
water 7.3
20s 7.3
pose 7.3
stylish 7.2
home 7.2
costume 7.1
together 7
modern 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

person 100
text 99.5
kiss 92
indoor 86.9
standing 81.3
clothing 79.5
dance 78.2
posing 37.7
crowd 0.7

Face analysis

AWS Rekognition

Age 16-28
Gender Female, 52.1%
Calm 49.7%
Fear 45.1%
Sad 50.2%
Happy 45%
Disgusted 45%
Surprised 45%
Confused 45%
Angry 45%

AWS Rekognition

Age 32-48
Gender Male, 72.2%
Fear 1.8%
Sad 28.4%
Disgusted 0.4%
Happy 10.6%
Angry 1.9%
Calm 53.9%
Surprised 1.6%
Confused 1.6%

AWS Rekognition

Age 29-45
Gender Female, 57.9%
Angry 10.7%
Sad 1.5%
Confused 0.5%
Surprised 1.4%
Happy 65%
Disgusted 0.3%
Calm 20.4%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Jeans 94.5%
Chair 77.3%

Text analysis

Amazon

RIME

Google

RIME
RIME