Human Generated Data

Title

Untitled (girl standing, holding two dolls)

Date

1935

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1867

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 97.8
Apparel 97.8
Human 97.5
Dress 93.5
Person 93.4
Fashion 83.9
Robe 83.9
Female 81.1
Face 75.4
Gown 70
Woman 68
Evening Dress 66.7
Photo 63.4
Portrait 63.4
Photography 63.4
Standing 60.6
Sailor Suit 59.8
Sleeve 59.7
Long Sleeve 58.6
Home Decor 56.3

Imagga
created on 2021-12-14

bride 69.5
wedding 54.3
dress 54.2
marriage 37
married 36.4
love 36.3
bouquet 35.7
groom 35.2
portrait 30.4
veil 29.4
people 29
couple 28.8
happiness 28.2
gown 27.3
adult 24.9
happy 24.4
flowers 23.5
person 22.6
fashion 22.6
celebration 22.3
bridal 21.4
face 20.6
man 18.8
cheerful 18.7
posing 18.7
elegance 18.5
romance 17.9
smile 17.8
male 17.1
attractive 16.8
human 16.5
model 16.3
wife 16.1
brass 15.6
ceremony 15.5
engagement 15.4
flower 15.4
clothing 15.2
women 15
wed 14.7
lady 14.6
smiling 14.5
two 14.4
outfit 14
cute 13.6
hair 13.5
romantic 13.4
day 13.3
pretty 13.3
clothes 13.1
outdoor 13
matrimony 12.8
wind instrument 12.4
lifestyle 12.3
looking 12
outdoors 11.9
summer 11.6
family 11.6
park 10.7
church 10.2
newlywed 10
pose 10
studio 9.9
one 9.7
together 9.6
joy 9.2
makeup 9.2
sensuality 9.1
engaged 8.9
commitment 8.9
black 8.8
musical instrument 8.8
necklace 8.7
husband 8.7
standing 8.7
life 8.6
eyes 8.6
men 8.6
outside 8.6
blond 8.5
art 8.4
event 8.3
holding 8.3
innocent 8.1
sexy 8
nuptials 7.9
holiday 7.9
bright 7.9
decision 7.8
party 7.7
wall 7.7
ready 7.7
youth 7.7
relationship 7.5
fun 7.5
traditional 7.5
contestant 7.5
future 7.5
style 7.4
new 7.3
suit 7.2
interior 7.1

Microsoft
created on 2021-12-14

text 99.5
clothing 96.8
posing 93.8
window 91.6
old 87.9
person 86.5
white 83.7
standing 83.5
black 75.3
dress 71.4
human face 67.1
vintage clothing 56.4
vintage 48.6
picture frame 23.6

Face analysis

AWS Rekognition

Age 21-33
Gender Male, 68.2%
Calm 71.4%
Surprised 17.3%
Confused 6.9%
Happy 2.4%
Sad 1.3%
Angry 0.3%
Disgusted 0.3%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 93.4%

Captions

Microsoft

a vintage photo of a person standing posing for the camera 94.3%
a vintage photo of a person standing in front of a window 90.1%
a vintage photo of a girl 90%