Human Generated Data

Title

"Harvard Men"

Date

c. 1969

People

Artist: Steven Liebman, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Robert M. Sedgwick II Fund, 2.2002.792

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.5
Human 99.5
Person 99.4
Person 99.4
Finger 90
Face 89.1
Crowd 71.9
People 71.7
Smoke 69
Hair 61.3
Portrait 57.2
Photo 57.2
Photography 57.2
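
The Amazon tags above pair a label with a confidence score, which is the shape of output produced by AWS Rekognition's DetectLabels operation. Below is a minimal sketch in Python using boto3; the region, file path, and confidence threshold are placeholders, not values taken from this record.

import boto3

# Placeholders: configure real AWS credentials, region, and image path.
client = boto3.client("rekognition", region_name="us-east-1")

with open("harvard_men.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # only return labels scored at 50% or higher
    )

# Print "label confidence" pairs in the same format as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")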

Imagga
created on 2022-01-22

couple 42.7
love 40.3
man 37
adult 36.3
people 35.7
portrait 35
male 34.9
brother 32.4
happy 28.2
attractive 27.3
happiness 24.3
black 24
face 23.5
together 22.8
person 22
sibling 21.7
romance 21.5
smile 21.4
husband 21
lifestyle 19.5
two 19.5
sexy 19.3
close 18.9
pretty 17.5
lovers 17.4
eyes 17.2
romantic 16.9
fashion 16.6
girlfriend 16.4
casual 16.1
smiling 15.9
hair 15.9
cute 15.8
kiss 15.6
looking 15.2
wife 15.2
handsome 15.2
passion 15.1
expression 14.5
boyfriend 14.5
family 14.2
guy 14.2
joy 13.4
loving 13.4
model 13.2
relationship 13.1
brunette 13.1
cheerful 13
parent 12.5
married 12.5
friends 12.2
outdoors 12
one 12
sensual 11.8
passionate 11.8
hug 11.6
studio 11.4
world 11.3
human 11.3
mother 11.1
women 11.1
child 11
kissing 10.8
erotic 10.6
lady 10.6
look 10.5
marriage 10.5
youth 10.2
emotion 10.2
dad 9.8
embrace 9.8
hugging 9.8
daughter 9.5
head 9.2
father 9.1
blond 8.7
boy 8.7
engagement 8.7
lips 8.3
joyful 8.3
holding 8.3
fun 8.2
girls 8.2
sensuality 8.2
valentine 8.2
style 8.2
gorgeous 8.2
posing 8
intimacy 7.9
embracing 7.8
tenderness 7.8
dating 7.8
outside 7.7
serious 7.6
skin 7.6
adults 7.6
pair 7.6
togetherness 7.6
dark 7.5
teen 7.4
kid 7.1
day 7.1
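
The Imagga tags follow the same tag-plus-confidence pattern. A minimal sketch against Imagga's v2 tagging endpoint is below, assuming an API key/secret pair and a publicly reachable image URL (all placeholders).

import requests

# Placeholders: substitute real Imagga credentials and an image URL.
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/harvard_men.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry carries a confidence score and a language-keyed tag name.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")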

Google
created on 2022-01-22

(no tags returned)

Microsoft
created on 2022-01-22

person 99.1
text 97.6
human face 97.1
portrait 54.2
black and white 52.5
hair 45.9
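
The Microsoft tags are consistent with the Azure Computer Vision analyze endpoint, which scores tags on a 0-1 scale (rendered above as percentages). A sketch against the REST API follows; the endpoint, key, image URL, and the v3.2 API version are assumptions.

import requests

# Placeholders: substitute a real Azure endpoint, key, and image URL.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"
IMAGE_URL = "https://example.org/harvard_men.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Confidence comes back in [0, 1]; scale to match the percentages above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")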

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 100%
Confused 59.8%
Calm 14.4%
Sad 7.2%
Angry 7.1%
Surprised 4.3%
Disgusted 3.1%
Happy 2.3%
Fear 1.7%

AWS Rekognition

Age 7-17
Gender Female, 89.2%
Fear 69.3%
Sad 20.4%
Surprised 3.6%
Disgusted 1.8%
Calm 1.7%
Angry 1.5%
Happy 0.9%
Confused 0.9%

AWS Rekognition

Age 12-20
Gender Male, 74.8%
Sad 76.2%
Fear 13.5%
Calm 7.1%
Surprised 1.4%
Angry 0.8%
Disgusted 0.6%
Confused 0.2%
Happy 0.2%
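
The three AWS Rekognition face blocks above (age range, gender, and a full emotion distribution) match the shape of the DetectFaces response when all attributes are requested. A minimal boto3 sketch, with region and file path as placeholders:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("harvard_men.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

# One block per detected face, mirroring the layout above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")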

Microsoft Cognitive Services

Age 11
Gender Male

Microsoft Cognitive Services

Age 22
Gender Male

Microsoft Cognitive Services

Age 33
Gender Female
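
The Microsoft Cognitive Services estimates above (a single age and gender per face) are consistent with the Face API detect endpoint as it existed around the 2022 creation date; Microsoft has since restricted access to these attributes. A sketch with placeholder endpoint, key, and image URL:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"  # placeholder
IMAGE_URL = "https://example.org/harvard_men.jpg"

response = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# One entry per detected face; age is a point estimate, not a range.
for face in response.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].capitalize()}")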

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
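
The Google Vision rows report likelihood buckets (Very unlikely through Very likely) rather than numeric scores. A minimal sketch with the google-cloud-vision client library, assuming application default credentials and a placeholder file path:

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("harvard_men.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)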

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a person talking on a cell phone 38.6%
a man and a woman looking at the camera 38.5%
a person looking at the camera 38.4%
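
The captions above, each with a confidence score, match the Azure Computer Vision describe endpoint, which returns ranked candidate captions. A sketch reusing the placeholder endpoint and key from the tag example:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"  # placeholder
IMAGE_URL = "https://example.org/harvard_men.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},  # request several candidate captions
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Candidates arrive ranked; confidence is in [0, 1].
for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")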