Human Generated Data

Title

Untitled (mother and daughter)

Date

1970s

People

Artist: Christopher, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1654

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 96
Face 96
Person 91.8
Finger 86.6
Hug 81.1
Portrait 72.2
Photography 72.2
Photo 72.2
Leisure Activities 68
Female 66.2
Person 45.3

Imagga
created on 2022-01-09

male 41.1
black 40.3
adolescent 38.4
man 38.3
beard 35.6
portrait 35.6
attractive 34.3
adult 33.7
person 31.7
juvenile 30.9
sexy 29
face 28.4
dark 28.4
couple 27
expression 25.6
love 25.3
people 23.5
model 23.4
eyes 21.5
guy 21.4
lips 20.4
fashion 19.6
sensual 19.1
handsome 18.7
baby 17.3
serious 17.2
hair 16.7
brother 16.4
skin 16.4
close 15.4
youth 15.4
casual 15.3
body 15.2
happy 15.1
human 15
brunette 14.8
emotion 14.8
masculine 14.6
lifestyle 14.5
studio 14.5
head 13.5
pretty 13.3
macho 13.2
passion 13.2
look 13.2
looking 12.8
erotic 12.7
romance 12.5
style 11.9
women 11.9
two 11.9
sensuality 11.8
dad 11.6
men 11.2
happiness 11
fetus 10.8
torso 10.7
smile 10.7
girlfriend 10.6
boy 10.4
father 10.1
cute 10.1
husband 9.8
parent 9.8
nose 9.8
kiss 9.8
world 9.7
hug 9.7
boyfriend 9.7
hands 9.6
pair 9.5
child 9.3
teenager 9.1
girls 9.1
confident 9.1
sibling 9.1
posing 8.9
sex 8.8
sexual 8.7
married 8.6
muscular 8.6
loving 8.6
wife 8.5
relationship 8.4
hand 8.4
valentine 8.2
lovely 8
vertebrate 8
embracing 7.8
pensive 7.8
lovers 7.8
naked 7.7
mouth 7.5
lady 7.3
eye 7.2
romantic 7.1
together 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 97.7
indoor 96.6
human face 94.2
black and white 92.4
person 92.1
girl 90.6
woman 69
kiss 66.2
nude 60.7
portrait 59.3
face 55.5
monochrome 54.2

Face analysis

AWS Rekognition

Age 26-36
Gender Female, 100%
Sad 81.1%
Calm 15.7%
Confused 1.3%
Fear 0.6%
Surprised 0.6%
Disgusted 0.3%
Angry 0.3%
Happy 0.1%

AWS Rekognition

Age 6-14
Gender Female, 100%
Calm 71%
Sad 25%
Fear 2.6%
Angry 0.4%
Surprised 0.4%
Confused 0.2%
Disgusted 0.2%
Happy 0.2%

Microsoft Cognitive Services

Age 36
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 91.8%

Captions

Microsoft

a man and a woman taking a selfie 38.9%
a man and woman taking a selfie 30.8%
a close up of a man and a woman taking a selfie 30.7%

Text analysis

Amazon

Myrea,
Squt Myrea, Co.
Co.
Christopher
Squt