Human Generated Data

Title

Untitled (woman in dress with floral design)

Date

c. 1940

People

Artist: Curtis Studio, American, active 1891-1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1274


Machine Generated Data

Tags

Amazon
created on 2022-01-22

Human 96
Person 96
Text 94
Clothing 89.4
Apparel 89.4
Face 84
Sleeve 78.4
Female 71
Photography 67.6
Photo 67.6
Portrait 66.9
Finger 63.8
Handwriting 60.6
Accessory 57.7
Accessories 57.7
Jewelry 57.7
Necklace 57.7
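Each machine-generated tag above pairs a label with a confidence score on a 0-100 scale. As a minimal sketch, assuming each line is simply a label followed by a trailing numeric score (the layout shown above), the lists can be parsed and filtered by a confidence threshold like this:

```python
# Parse "label confidence" lines like those in the tag lists above.
# Assumes each non-empty line ends with a numeric score on a 0-100 scale.

def parse_tags(lines):
    """Return (label, confidence) pairs from 'label score' lines."""
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

def filter_tags(tags, min_confidence=70.0):
    """Keep only tags at or above the confidence threshold."""
    return [(label, score) for label, score in tags if score >= min_confidence]

# Sample lines taken from the Amazon tag list above.
sample = [
    "Human 96",
    "Person 96",
    "Text 94",
    "Clothing 89.4",
    "Female 71",
    "Photography 67.6",
]

tags = parse_tags(sample)
confident = filter_tags(tags, min_confidence=70.0)
# confident -> [('Human', 96.0), ('Person', 96.0), ('Text', 94.0),
#               ('Clothing', 89.4), ('Female', 71.0)]
```

The threshold of 70 is arbitrary; each tagging service calibrates its scores differently, so thresholds would normally be tuned per provider.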

Imagga
created on 2022-01-22

envelope 50.8
portrait 35.6
model 35
adult 34.3
attractive 34.3
pretty 33.6
container 32.7
hair 29.3
fashion 28.7
person 27.8
lady 23.5
dress 23.5
happy 23.2
face 22.7
brunette 22.7
people 22.3
lock 22.1
kimono 18.6
bride 18.2
smiling 18.1
cute 17.9
sexy 17.7
clothing 17.5
human 17.3
make 16.3
posing 16
women 15.8
wedding 15.6
makeup 15.6
skin 15.5
studio 15.2
robe 15.1
smile 15
black 14.6
sensual 14.6
pose 14.5
expression 13.7
style 13.4
garment 13.3
lifestyle 13
eyes 12.9
long 12.9
gorgeous 12.7
happiness 12.5
lips 12
looking 12
youth 11.9
love 11.8
sensuality 11.8
one 11.2
flower 10.8
bridal 10.7
hairstyle 10.5
flowers 10.4
cosmetics 10.3
tattoo 10.1
businesswoman 10
gown 9.8
business 9.7
elegance 9.2
alone 9.1
modern 9.1
home 8.8
look 8.8
bouquet 8.7
casual 8.5
feminine 8.4
holding 8.3
teenager 8.2
stylish 8.1
covering 8.1
cheerful 8.1
natural 8
romantic 8
body 8
book 7.8
married 7.7
glamor 7.7
marriage 7.6
head 7.6
laptop 7.5
care 7.4
lovely 7.1
interior 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

flower 93.7
human face 91.8
text 87.6
envelope 84.4
stationary 84
woman 82.1
person 69.8
smile 68.6
portrait 66.2
clothing 57.9
sketch 53.7
drawing 53.6
document 32.6
businesscard 16.1
picture frame 8.4

Face analysis

AWS Rekognition

Age 35-43
Gender Female, 100%
Calm 91.8%
Happy 3.6%
Confused 1.2%
Disgusted 1.1%
Sad 0.8%
Angry 0.7%
Surprised 0.4%
Fear 0.3%
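The emotion percentages above sum to roughly 100, and the record effectively reports the highest-scoring one. A small sketch, using the scores listed above, of selecting the dominant emotion from such a score dictionary:

```python
# Pick the dominant emotion from a dict of emotion -> confidence,
# mirroring the AWS Rekognition scores listed above.

emotions = {
    "Calm": 91.8, "Happy": 3.6, "Confused": 1.2, "Disgusted": 1.1,
    "Sad": 0.8, "Angry": 0.7, "Surprised": 0.4, "Fear": 0.3,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

top, score = dominant_emotion(emotions)
# top -> 'Calm', score -> 91.8
```

Here "Calm" at 91.8% clearly dominates; with flatter distributions the top emotion would be far less reliable.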

Microsoft Cognitive Services

Age 42
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96%

Captions

Microsoft

a person taking a selfie 57%
a person taking a selfie in a room 56.9%
a close up of a person 56.8%

Text analysis

Amazon

head
sign

Google

hend
hend