Human Generated Data

Title

Laura, August 1983

Date

1983

People

Artist: Judith Black, American, born 1945

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.327

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.5
Human 99.5
Poster 98.1
Advertisement 98.1
Person 97.2
Collage 77
Face 75.9
Clothing 75.3
Apparel 75.3

Clarifai
created on 2023-10-25

people 99.8
monochrome 98.7
adult 98.6
group 97.4
woman 97.4
portrait 96.5
wear 94.9
man 93.7
movie 92.5
fashion 86.1
two 85.4
actress 85.3
retro 84.6
actor 84.5
facial expression 83.9
three 83.8
music 81
art 79
nostalgia 78.4
mirror 78.2

Imagga
created on 2022-01-09

portrait 36.2
person 26.2
sexy 24.9
fashion 24.9
hair 24.6
adult 23.3
black 22.4
pretty 21
people 20.6
face 20.6
attractive 20.3
hair spray 19.3
model 17.1
brunette 16.6
toiletry 16.5
happy 16.3
sensual 15.5
style 14.8
sensuality 14.5
make 14.5
women 14.2
male 14.2
world 14
body 13.6
smile 13.5
man 13.4
lady 13
makeup 13
cute 12.9
painter 12.8
human 12.7
skin 12.7
posing 12.4
hairstyle 12.4
lifestyle 12.3
eyes 12.1
love 11.8
romantic 11.6
studio 11.4
two 11
happiness 11
couple 10.5
groom 10.3
hairdresser 10.2
gorgeous 10
dress 9.9
look 9.6
head 9.2
elegance 9.2
vintage 9.1
one 9
youth 8.5
blond 8.4
relaxation 8.4
joy 8.4
dark 8.4
fun 8.2
care 8.2
cheerful 8.1
smiling 8
vogue 7.7
luxury 7.7
elegant 7.7
clothing 7.6
hand 7.6
healthy 7.6
cosmetics 7.5
lips 7.4
20s 7.3
pose 7.2
looking 7.2
together 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

human face 97.7
person 95.8
clothing 94.3
text 92.6
black and white 83.6
gallery 79.9
woman 76.6
drawing 72.4
man 71.8
room 43.1
posing 41
picture frame 9.9

Color Analysis

Face analysis
AWS Rekognition

Age 23-33
Gender Female, 99.6%
Sad 81.8%
Calm 8%
Fear 7.4%
Angry 2.1%
Surprised 0.3%
Disgusted 0.2%
Confused 0.1%
Happy 0.1%

AWS Rekognition

Age 41-49
Gender Female, 90.1%
Calm 96.2%
Angry 1.3%
Sad 0.9%
Confused 0.7%
Disgusted 0.3%
Happy 0.2%
Surprised 0.2%
Fear 0.2%

Microsoft Cognitive Services

Age 21
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Poster 98.1%

Categories

Text analysis

Amazon

VIOLATION

Google

HALE
VIOLATION HALE
VIOLATION