Human Generated Data

Title

Untitled (woman with bonnet and children)

Date

1930s

People

Artist: Dorothea Lange, American 1895 - 1965

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Sedgewick Memorial Collection, 2.2002.709

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Clothing 99
Apparel 99
Hat 98.7
Person 98.5
Human 98.5
Person 95.2
Newborn 94.3
Baby 94.3
Person 88.5
Sleeve 82.4
Face 82.2
Female 81.2
Finger 76.3
Woman 66.6
People 65.9
Photography 63.1
Portrait 63.1
Photo 63.1
Girl 58.6
Bonnet 57.1
Person 48.9
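The Amazon tags above pair each detected label with a confidence score (a percentage). A minimal sketch of filtering such a list by confidence, using a few entries copied from this record; the `confident_labels` helper and the 90-point threshold are illustrative assumptions, not part of the source data.

```python
# Illustrative sketch: filter label/confidence pairs by a threshold.
# The entries below are copied from the Amazon tag list in this record;
# the helper and threshold are assumptions for demonstration only.
labels = [
    ("Clothing", 99.0),
    ("Hat", 98.7),
    ("Person", 98.5),
    ("Newborn", 94.3),
    ("Woman", 66.6),
    ("Bonnet", 57.1),
]

def confident_labels(pairs, threshold=90.0):
    """Return the names of labels whose confidence meets the threshold."""
    return [name for name, score in pairs if score >= threshold]

print(confident_labels(labels))  # ['Clothing', 'Hat', 'Person', 'Newborn']
```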

Imagga
created on 2022-01-22

mother 100
parent 81.8
happy 32
kin 30.3
portrait 29.8
people 29.6
family 28.5
love 27.6
happiness 27.4
couple 26.1
child 25.6
home 25.5
adult 23.3
daughter 23.2
smiling 22.4
man 21.5
attractive 20.3
grandma 20.1
together 19.3
face 19.2
male 18.8
father 17.2
cute 17.2
smile 17.1
husband 16.2
fashion 15.8
women 15
sitting 14.6
lifestyle 14.5
casual 14.4
loving 14.3
kid 14.2
boy 13.9
cheerful 13.8
dress 13.6
couch 13.5
bride 13.4
married 13.4
model 13.2
pretty 12.6
elderly 12.5
indoors 12.3
senior 12.2
relationship 12.2
sexy 12.1
dad 11.9
aged 11.8
person 11.4
wife 11.4
looking 11.2
old 11.2
hair 11.1
expression 11.1
wedding 11
two 11
indoor 11
joy 10.9
caring 10.8
blond 10.8
lady 10.6
sofa 10.5
human 10.5
females 10.4
adults 10.4
youth 10.2
baby 10.2
girls 10
holding 9.9
groom 9.7
clothing 9.6
day 9.4
camera 9.2
elegance 9.2
20s 9.2
one 9
sibling 8.4
mature 8.4
studio 8.4
room 8.2
posing 8
body 8
bonding 7.8
eyes 7.8
hug 7.8
affection 7.7
jeans 7.6
skin 7.6
laughing 7.6
togetherness 7.6
house 7.5
son 7.5
close 7.4
adorable 7.4
joyful 7.4
children 7.3
sensuality 7.3
gorgeous 7.3
black 7.2
childhood 7.2
romance 7.1
handsome 7.1
lovely 7.1
look 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

human face 98.4
person 98.2
baby 97.9
text 94.1
toddler 91.6
clothing 90.8
child 82.7
smile 73.7
black and white 66
woman 61.8
picture frame 6.4

Face analysis

AWS Rekognition

Age 45-51
Gender Female, 100%
Calm 89.4%
Sad 10.5%
Confused 0%
Angry 0%
Fear 0%
Surprised 0%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 4-12
Gender Male, 99.8%
Sad 95.8%
Fear 3.5%
Confused 0.2%
Angry 0.2%
Disgusted 0.1%
Calm 0.1%
Surprised 0%
Happy 0%

AWS Rekognition

Age 0-6
Gender Female, 84.2%
Sad 49.6%
Happy 45.7%
Calm 1.3%
Fear 1%
Angry 0.9%
Disgusted 0.6%
Surprised 0.6%
Confused 0.5%

AWS Rekognition

Age 29-39
Gender Male, 99.4%
Happy 65.1%
Calm 28.6%
Angry 2.4%
Sad 2.3%
Confused 0.5%
Disgusted 0.4%
Surprised 0.4%
Fear 0.2%
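Each AWS Rekognition face block above reports a full emotion distribution rather than a single label. Reducing such a distribution to its dominant emotion is a simple maximum over the scores; the sketch below uses the values copied from the first face in this record (age 45-51) and is an illustrative assumption about how one might post-process this data, not part of the source.

```python
# Illustrative sketch: reduce a Rekognition-style emotion distribution
# to its dominant emotion. Scores are copied from the first face block
# (age 45-51) in this record.
emotions = {
    "Calm": 89.4,
    "Sad": 10.5,
    "Confused": 0.0,
    "Angry": 0.0,
    "Fear": 0.0,
    "Surprised": 0.0,
    "Happy": 0.0,
    "Disgusted": 0.0,
}

# max over the dict keys, ordered by their scores
dominant = max(emotions, key=emotions.get)
print(dominant)  # Calm
```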

Microsoft Cognitive Services

Age 29
Gender Female

Microsoft Cognitive Services

Age 2
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Hat 98.7%
Person 98.5%

Captions

Microsoft

an old photo of a woman 90.1%
old photo of a woman 88.9%
a woman holding a baby 62%