Human Generated Data

Title

Untitled (portrait of four women, Houston, Texas)

Date

1940s, printed later

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1018

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 99.7
Person 99.7
Person 99.7
Person 99.6
Person 99.5
Face 95
People 91.2
Performer 83.4
Head 73.6
Photo Booth 70.3
Female 67
Hair 66.5
Family 66
Photography 60.3
Photo 60.3
Portrait 60.3
Advertisement 59.7
Collage 57.8
Poster 57.8

Imagga
created on 2021-12-14

kin 39.6
adult 36.2
portrait 31.1
pretty 30.8
daughter 30.6
sexy 30.5
attractive 30.1
people 29
fashion 26.4
model 25.7
person 24.9
women 24.5
love 24.5
lady 23.5
couple 23.5
happy 21.9
hair 21.4
skin 20.3
face 19.9
black 19.5
sensual 19.1
passion 18.8
hair spray 18.8
studio 18.2
man 18.2
happiness 18
groom 17.9
two 16.9
toiletry 16.9
male 16.6
smile 16.4
brunette 15.7
style 15.6
body 15.2
cute 15.1
human 14.3
posing 14.2
dark 14.2
cheerful 13.8
lifestyle 13.7
makeup 13.7
erotic 13.7
sensuality 13.6
gorgeous 13.6
romance 13.4
together 13.1
smiling 12.3
elegance 11.8
lovers 11.6
boyfriend 11.6
girlfriend 11.6
desire 11.5
blond 11.4
lips 11.1
long 11
girls 10.9
joy 10.9
lovely 10.7
mother 10.5
wife 10.4
youth 10.2
emotion 10.1
make 10
dress 9.9
holding 9.9
hand 9.9
romantic 9.8
fun 9.7
husband 9.7
sexual 9.6
loving 9.5
expression 9.4
looking 8.8
look 8.8
boy 8.7
eyes 8.6
elegant 8.6
togetherness 8.5
relationship 8.4
cosmetics 8.4
group 8.1
parent 7.9
intimacy 7.9
passionate 7.9
topless 7.8
kiss 7.8
men 7.7
child 7.7
tender 7.7
married 7.7
close 7.4
world 7.4
valentine 7.3
lingerie 7.2
handsome 7.1
modern 7
sibling 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

human face 98.7
person 97.9
posing 96.6
wall 95.7
monitor 95.7
clothing 95.5
smile 95
indoor 94
dress 91.2
woman 90.1
text 88.5
girl 82
baby 78.2
picture frame 72.9
wedding dress 70
bride 59.7

Face analysis

AWS Rekognition

Age 21-33
Gender Female, 99.6%
Calm 58.5%
Happy 33.7%
Confused 2.8%
Surprised 1.3%
Fear 1.2%
Sad 1.2%
Disgusted 0.8%
Angry 0.5%

AWS Rekognition

Age 19-31
Gender Female, 98.8%
Calm 80.6%
Happy 10.6%
Surprised 3.5%
Sad 1.2%
Disgusted 1.2%
Confused 1.1%
Fear 0.9%
Angry 0.9%

AWS Rekognition

Age 20-32
Gender Female, 99.4%
Calm 87.8%
Happy 3.2%
Surprised 2.7%
Confused 1.6%
Sad 1.5%
Angry 1.2%
Disgusted 1.1%
Fear 0.8%

AWS Rekognition

Age 21-33
Gender Female, 99.7%
Calm 67.2%
Happy 26.1%
Surprised 2.2%
Confused 1.3%
Sad 1.1%
Disgusted 0.9%
Fear 0.7%
Angry 0.5%

Microsoft Cognitive Services

Age 33
Gender Female

Microsoft Cognitive Services

Age 27
Gender Female

Microsoft Cognitive Services

Age 30
Gender Female

Microsoft Cognitive Services

Age 28
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people posing for a photo 93.5%
a group of people posing for the camera 93.4%
a group of women posing for a photo 92.5%

Text analysis

Amazon

PH
808E PH
808E
XH 3808

Google

H 3 0 8 3ARE PH
3
3ARE
0
8
PH
H