Human Generated Data

Title

Untitled (Fanny and Ada Aronson)

Date

1910s

People

Artist: Diran Studio, American, active 1890s-1900s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2915

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 99.8
Apparel 99.8
Human 99.4
Person 99.4
Person 99.4
Robe 87.9
Fashion 87.9
Gown 85.7
Art 85.6
Evening Dress 84.5
Painting 76.4
Wedding 73.1
Female 70.1
Wedding Gown 59
People 57.2

Imagga
created on 2021-12-14

groom 100
bride 53
dress 47.9
wedding 44.1
couple 36.6
love 36.3
bouquet 33.1
married 29.7
people 29.6
marriage 27.5
happiness 25.9
kin 24.2
happy 23.2
portrait 22
fashion 21.9
flowers 21.7
person 21.1
mother 19.4
man 18.8
adult 18.8
gown 18.7
bridal 18.5
two 17.8
veil 17.6
celebration 17.5
romance 17
wed 16.7
ceremony 16.5
male 16.3
wife 16.1
elegance 15.1
face 14.9
smile 14.3
romantic 14.3
outdoors 14.2
flower 13.8
day 13.3
posing 13.3
attractive 13.3
parent 13.3
outdoor 13
lady 13
summer 12.9
commitment 12.8
husband 12.4
park 12.3
together 12.3
engagement 11.6
family 11.6
human 11.2
old 11.1
church 11.1
matrimony 10.8
holding 10.7
cheerful 10.6
loving 10.5
standing 10.4
women 10.3
life 10.1
cute 10
joy 10
clothing 9.7
youth 9.4
future 9.3
pretty 9.1
looking 8.8
smiling 8.7
model 8.6
wall 8.5
relationship 8.4
clothes 8.4
black 8.4
hand 8.4
traditional 8.3
style 8.2
blond 8.1
lifestyle 7.9
engaged 7.9
art 7.8
outside 7.7
tender 7.7
child 7.6
rose 7.5
fun 7.5
event 7.4
joyful 7.4
new 7.3
pose 7.2
suit 7.2
hair 7.1
interior 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 98.4
clothing 98.3
dress 96.5
wall 95.7
person 95.2
old 95.1
woman 92.5
wedding dress 90.4
window 85.9
black 81.4
white 79.5
bride 78.8
vintage clothing 72
human face 66.3
smile 54.9
posing 41.7
picture frame 16.8

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 12-22
Gender Female, 93.8%
Calm 91.5%
Angry 3.5%
Fear 2%
Sad 1.7%
Surprised 0.7%
Confused 0.4%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 12-22
Gender Female, 89.6%
Calm 95.6%
Angry 1.5%
Happy 0.9%
Sad 0.7%
Surprised 0.5%
Disgusted 0.4%
Fear 0.2%
Confused 0.2%

Microsoft Cognitive Services

Age 26
Gender Female

Microsoft Cognitive Services

Age 28
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Painting 76.4%

Captions

Microsoft

a vintage photo of a person standing in front of a window 86.7%
a person standing in front of a window 86.6%
a vintage photo of a person 86.5%

Text analysis

Amazon

98
BOSTON
DIRAN
98 Court St.
Court St.

Google

98
DIRAN 98 COURT ST. BOSTON
COURT
ST.
BOSTON
DIRAN