Human Generated Data

Title

Laura and Dylan, October 1984

Date

1984

People

Artist: Judith Black, American, born 1945

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.326

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99.3
Person 99.3
Clothing 97.3
Apparel 97.3
Furniture 84.4
Couch 74.2
Female 72.3
Bed 58.9
Woman 58.2
Building 58.1
Clock Tower 58.1
Architecture 58.1
Tower 58.1
Sleeve 58
Pub 56.4
Finger 55.3
Indoors 55.1
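
The Amazon tags above are the kind of label list returned by AWS Rekognition's label-detection operation, with a confidence score per label. The following is a minimal sketch using boto3, assuming a local image file and default AWS credentials; the file name and confidence threshold are placeholders, not values taken from this record.

    import boto3

    # Minimal sketch: label detection with AWS Rekognition.
    # "photo.jpg" and the 55% threshold are placeholders for illustration.
    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55.0,
        )
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')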

Imagga
created on 2022-01-09

sexy 46.6
fashion 39.2
attractive 39.2
model 36.6
adult 33.3
person 33.2
pretty 31.5
brunette 31.4
lady 29.2
portrait 29.1
black 26.4
sensual 25.5
hair 25.4
style 22.3
people 21.8
posing 21.3
lingerie 20.8
studio 20.5
gorgeous 19
dress 18.1
cute 17.2
one 17.2
smile 17.1
elegance 16
clothing 15.7
face 15.6
sitting 14.6
youth 14.5
looking 14.4
body 14.4
human 14.3
lovely 14.2
happy 13.8
sensuality 13.6
women 13.4
seductive 13.4
dark 13.4
erotic 13.3
blond 13.1
pose 12.7
stylish 12.7
fashionable 12.3
lifestyle 12.3
expression 11.9
long 11.9
interior 11.5
legs 11.3
clothes 11.2
elegant 11.1
garment 11.1
emotion 11.1
makeup 11
modern 10.5
hairstyle 10.5
casual 10.2
look 9.6
feminine 9.3
skirt 9.3
skin 9.3
make 9.1
music 9
indoors 8.8
long hair 8.7
vogue 8.7
standing 8.7
passion 8.5
hot 8.4
lips 8.3
20s 8.3
miniskirt 8.1
smiling 8
happiness 7.8
cover girl 7.8
eyes 7.7
party 7.7
desire 7.7
glamor 7.7
slim 7.4
dance 7.3
love 7.1
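
The Imagga tags above come from Imagga's tagging service, which is exposed as a REST API. The sketch below is a minimal illustration using the requests library, assuming Imagga's v2 tags endpoint with HTTP basic authentication; the API key, secret, and image URL are placeholders, and the response structure shown should be checked against Imagga's documentation.

    import requests

    # Minimal sketch of an Imagga v2 tagging request
    # (API key, secret, and image URL are placeholders).
    API_KEY = "your_api_key"
    API_SECRET = "your_api_secret"
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=(API_KEY, API_SECRET),
    )
    for item in response.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')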

Google
created on 2022-01-09

Black 89.5
Style 83.8
Black-and-white 82.1
Lap 76.1
Thigh 75.7
Flash photography 74.7
Monochrome photography 71.8
Vintage clothing 70.5
Knee 70.1
Event 69.2
Classic 66.3
Chair 65.6
Sitting 65.4
Monochrome 64.5
Stock photography 63.6
Room 63.5
Pattern 61.9
Font 61.2
Fashion design 57.4
Fun 56.4
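
The Google tags above correspond to label annotations from the Cloud Vision API, which reports a score between 0 and 1 per label. A minimal sketch using the google-cloud-vision client library follows; the local file name is a placeholder.

    from google.cloud import vision

    # Minimal sketch: label detection with the Cloud Vision API
    # ("photo.jpg" is a placeholder).
    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")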

Microsoft
created on 2022-01-09

text 99.1
person 98.1
human face 91.5
clothing 87.2
smile 63.7
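
The Microsoft tags above match the output of the Azure Computer Vision tagging operation. A minimal sketch using the Azure SDK for Python follows; the endpoint, key, and image URL are placeholders.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Minimal sketch: image tagging with Azure Computer Vision
    # (endpoint, key, and image URL are placeholders).
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<your-key>"),
    )
    result = client.tag_image("https://example.com/photo.jpg")
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")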

Face analysis

AWS Rekognition

Age 11-19
Gender Female, 57.1%
Calm 96.9%
Sad 1.4%
Angry 0.6%
Fear 0.6%
Disgusted 0.2%
Confused 0.2%
Surprised 0.1%
Happy 0%

AWS Rekognition

Age 18-26
Gender Female, 100%
Angry 67.6%
Calm 20.9%
Surprised 4.1%
Confused 3.2%
Fear 1.7%
Sad 1.2%
Disgusted 0.8%
Happy 0.5%
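
The two face records above have the shape of AWS Rekognition's face-detection output: an estimated age range, a gender value with a confidence, and a confidence score for each emotion. A minimal sketch using boto3 follows; the local file name is a placeholder.

    import boto3

    # Minimal sketch: face analysis with AWS Rekognition
    # ("photo.jpg" is a placeholder).
    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')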

Microsoft Cognitive Services

Age 24
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
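
The Google Vision rows above correspond to the likelihood fields of a Cloud Vision face-detection response, which are reported as enum values (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores. A minimal sketch follows; the local file name is a placeholder.

    from google.cloud import vision

    # Minimal sketch: face-detection likelihoods with the Cloud Vision API
    # ("photo.jpg" is a placeholder).
    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)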

Feature analysis

Amazon

Person 99.3%
Clock Tower 58.1%

Captions

Microsoft

a woman sitting in a room 88.5%
a woman sitting on a bed 55.9%
a woman sitting in a chair 55.8%
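
The captions above match the output of the Azure Computer Vision image-description operation, which returns candidate captions, each with a confidence. A minimal, self-contained sketch follows, with the same placeholder endpoint, key, and image URL as the tagging sketch above.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Minimal sketch: caption generation with Azure Computer Vision
    # (endpoint, key, and image URL are placeholders).
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<your-key>"),
    )
    description = client.describe_image(
        "https://example.com/photo.jpg", max_candidates=3
    )
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}")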