Human Generated Data

Title

Untitled (people walking out of ballroom)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16712

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 100
Apparel 100
Human 98.7
Person 98.7
Person 98.7
Robe 97.2
Fashion 97.2
Gown 95.5
Wedding 94
Bride 89.5
Wedding Gown 89.5
Person 81.3
Sleeve 67.9
Evening Dress 66.6
Female 60.3
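Tag lists like the Amazon block above are the kind of output produced by the Amazon Rekognition `detect_labels` API, which returns label names with confidence scores. A minimal sketch of turning such a response into the "Name confidence" lines shown (the `tag_image` wrapper, its image path, and the 60-point cutoff are illustrative assumptions, not part of this record):

```python
def format_labels(response, min_confidence=60.0):
    """Turn an Amazon Rekognition detect_labels response into the
    'Name confidence' lines shown in the tag list, highest first."""
    labels = [(lab["Name"], lab["Confidence"])
              for lab in response["Labels"]
              if lab["Confidence"] >= min_confidence]
    labels.sort(key=lambda pair: pair[1], reverse=True)
    return [f"{name} {round(conf, 1)}" for name, conf in labels]

def tag_image(image_path):
    # Hypothetical wrapper: sends a local image file to Rekognition.
    # Requires boto3 plus configured AWS credentials and region.
    import boto3
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        resp = client.detect_labels(Image={"Bytes": f.read()})
    return format_labels(resp)
```

The confidence cutoff mirrors how such listings typically omit low-scoring labels; the exact threshold used for this record is not stated.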

Imagga
created on 2022-02-26

groom 68.2
bride 50.4
wedding 42.3
dress 41.6
love 33.9
bouquet 29.4
portrait 28.5
married 27.8
people 27.3
couple 25.3
happiness 25.1
person 24.9
adult 23.8
fashion 22.6
bridal 21.4
marriage 20.9
gown 20.5
flowers 18.3
happy 17.6
life 17.4
veil 16.7
clothing 16.1
two 16.1
sexy 16.1
celebration 16
face 15.6
lady 15.4
pretty 15.4
attractive 15.4
family 15.1
elegance 15.1
male 13.8
wed 13.8
looking 13.6
engagement 13.5
human 13.5
hair 13.5
model 13.2
mother 13.2
smiling 13
man 12.8
traditional 12.5
smile 12.1
women 11.9
romantic 11.6
world 11.5
flower 11.5
wife 11.4
brunette 11.3
romance 10.7
ceremony 10.7
interior 10.6
cheerful 10.6
standing 10.4
eyes 10.3
clothes 10.3
blond 10
posing 9.8
home 9.6
luxury 9.4
day 9.4
future 9.3
gorgeous 9.1
summer 9
one 9
husband 9
engaged 8.9
decision 8.8
necklace 8.8
black 8.5
old 8.4
pure 8.3
contestant 8
elation 7.9
wedding dress 7.9
cute 7.9
look 7.9
matrimony 7.9
child 7.9
innocent 7.9
tenderness 7.8
men 7.7
hope 7.7
parent 7.7
ready 7.7
youth 7.7
prepared 7.6
charming 7.6
pair 7.6
outdoors 7.5
holding 7.4
church 7.4
long 7.3
makeup 7.3
indoor 7.3
pose 7.2
holiday 7.2

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

wedding dress 99.5
bride 98.5
dress 97.2
text 94.5
person 92.8
man 90.4
wedding 87.5
clothing 81.9
woman 65.9

Face analysis

AWS Rekognition

Age 47-53
Gender Female, 98.1%
Angry 35.3%
Happy 32.7%
Calm 18.3%
Sad 4.1%
Disgusted 3.2%
Surprised 2.5%
Fear 2.3%
Confused 1.6%

AWS Rekognition

Age 34-42
Gender Male, 63.5%
Sad 86.1%
Happy 5.7%
Calm 2.5%
Confused 2.2%
Angry 1.1%
Disgusted 1%
Fear 0.9%
Surprised 0.4%

AWS Rekognition

Age 36-44
Gender Male, 99.6%
Fear 47.4%
Calm 40.3%
Sad 7.1%
Happy 2.2%
Confused 1.1%
Disgusted 0.9%
Angry 0.7%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
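The AWS Rekognition blocks above (age range, gender with confidence, emotions ranked by score) match the shape of a `detect_faces` `FaceDetails` entry. A sketch, assuming that documented response shape, of rendering one entry in the style used here:

```python
def format_face(face):
    """Render one Rekognition detect_faces FaceDetails entry in the
    style above: age range, gender, then emotions by confidence."""
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, {round(face['Gender']['Confidence'], 1)}%",
    ]
    # Rekognition reports emotion types in upper case (e.g. "ANGRY");
    # sort by confidence to match the ordering shown above.
    emotions = sorted(face["Emotions"],
                      key=lambda e: e["Confidence"], reverse=True)
    for emo in emotions:
        lines.append(f"{emo['Type'].capitalize()} {round(emo['Confidence'], 1)}%")
    return lines
```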

Feature analysis

Amazon

Person 98.7%

Captions

Microsoft

a man and a woman standing in front of a window 76.1%
a man standing in front of a window 76%
a man standing next to a window 75.9%

Text analysis

Google

MJI7-- YT37A°2 - -XAGO
MJI7--
YT37A°2
-XAGO
-