Human Generated Data

Title

Untitled (woman with two dogs)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16614

Machine Generated Data

Tags (model confidence scores)

Amazon
created on 2022-02-12

Human 97.1
Apparel 96
Clothing 96
Person 95.5
Dress 94
Female 92.9
Home Decor 91.3
Woman 80.2
Sleeve 68
Girl 66.6
Plant 65.7
People 63.8
Animal 62.8
Mammal 62.8
Face 61.7
Pet 61.2
Leisure Activities 60.3
Photography 60.2
Photo 60.2
Canine 60
Outdoors 59.8
Person 57.5
Drawing 57.5
Art 57.5
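The Amazon tags above are shaped like output from AWS Rekognition's DetectLabels API (label name plus a 0-100 confidence score). A minimal sketch of how such a list could be flattened from a DetectLabels-style response; the `sample` dict below is a hypothetical response fragment, not data from this record, and a real call would additionally need boto3 and AWS credentials:

```python
def flatten_labels(response, min_confidence=55.0):
    """Flatten a DetectLabels-style response into (name, confidence)
    pairs, sorted by descending confidence, mirroring the tag list."""
    pairs = [(lbl["Name"], round(lbl["Confidence"], 1))
             for lbl in response.get("Labels", [])
             if lbl["Confidence"] >= min_confidence]
    return sorted(pairs, key=lambda p: p[1], reverse=True)

# A real call would look roughly like (requires boto3 + credentials):
#   client = boto3.client("rekognition")
#   response = client.detect_labels(Image={"Bytes": image_bytes})

# Hypothetical sample shaped like a DetectLabels response.
sample = {"Labels": [
    {"Name": "Human", "Confidence": 97.08},
    {"Name": "Apparel", "Confidence": 96.04},
    {"Name": "Canine", "Confidence": 60.01},
]}

for name, conf in flatten_labels(sample):
    print(f"{name} {conf}")
```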

Imagga
created on 2022-02-12

groom 60.1
bride 29
dress 28
love 26.8
wedding 26.7
people 24.6
portrait 22.7
person 22
fountain 22
couple 20
happiness 19.6
married 18.2
adult 17.5
bouquet 17.3
structure 16.5
statue 16.1
clothing 15.2
gown 14.7
fashion 14.3
romantic 14.3
face 14.2
model 14
lifestyle 13.7
human 13.5
man 13.4
mother 12.8
two 12.7
hair 12.7
bridal 12.6
attractive 12.6
happy 12.5
wed 11.8
veil 11.8
celebration 11.2
net 10.8
life 10.8
lady 10.6
marriage 10.4
wife 10.4
summer 10.3
art 10
sculpture 10
smile 10
romance 9.8
husband 9.8
family 9.8
outdoors 9.7
sexy 9.6
flowers 9.6
world 9.3
male 9.3
purity 9.2
outdoor 9.2
old 9.1
skirt 9
garment 8.7
sitting 8.6
clothes 8.4
head 8.4
pure 8.3
care 8.2
girls 8.2
sensuality 8.2
pose 8.2
religion 8.1
home 8
women 7.9
cute 7.9
engaged 7.9
innocent 7.8
men 7.7
engagement 7.7
pretty 7.7
wall 7.7
flower 7.7
youth 7.7
loving 7.6
future 7.4
closeup 7.4
parent 7.4
church 7.4
decoration 7.4
costume 7.4
detail 7.2
interior 7.1
day 7.1
child 7

Google
created on 2022-02-12

Microsoft
created on 2022-02-12

person 97.3
sketch 92.1
drawing 91.2
clothing 90
text 88
outdoor 87.1
human face 84
woman 69.9
dress 65.8

Face analysis

AWS Rekognition

Age 45-53
Gender Female, 98.4%
Fear 28.6%
Happy 28.2%
Angry 19.6%
Surprised 11.8%
Calm 4%
Confused 2.9%
Disgusted 2.6%
Sad 2.4%
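The age/gender/emotion summary above matches the structure of a Rekognition DetectFaces `FaceDetail` (an `AgeRange`, a `Gender` value with confidence, and a list of `Emotions` scores). A small sketch of how one such face detail could be formatted into the lines shown; `sample_face` is a hypothetical fragment, not the full response for this image:

```python
def summarize_face(face):
    """Format one DetectFaces FaceDetail into an age/gender/emotion
    summary, with emotions sorted by descending confidence."""
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%",
    ]
    emotions = sorted(face["Emotions"],
                      key=lambda e: e["Confidence"], reverse=True)
    for emo in emotions:
        lines.append(f"{emo['Type'].capitalize()} {emo['Confidence']:.1f}%")
    return lines

# Hypothetical FaceDetail fragment.
sample_face = {
    "AgeRange": {"Low": 45, "High": 53},
    "Gender": {"Value": "Female", "Confidence": 98.4},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 28.2},
        {"Type": "FEAR", "Confidence": 28.6},
    ],
}

print("\n".join(summarize_face(sample_face)))
```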

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 95.5%

Captions

Microsoft

a man sitting in front of a window 60.7%
a man standing in front of a window 60.6%
a man sitting in front of a building 60.5%