Human Generated Data

Title

Untitled (dressing the bride, Brookline, Massachusetts)

Date

1940, printed later

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.945

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Apparel 98.4
Clothing 98.4
Person 98.3
Human 98.3
Room 98
Indoors 98
Person 97
Furniture 95.2
Person 90.9
Female 85.6
Person 83.6
Dressing Room 78.4
Cabinet 76.7
Woman 76.4
Fashion 72.8
Robe 67
Gown 65.4
Wedding 65.4
Wedding Gown 65.4

Imagga
created on 2021-12-14

boutique 61.3
interior 35.4
home 29.5
room 27.4
dress 27.1
people 26.8
groom 26.3
couple 23.5
house 23.4
kitchen 22.5
bride 20.5
modern 20.3
adult 19.7
wedding 19.3
indoor 19.2
luxury 18.9
person 18.5
indoors 18.5
man 17.5
furniture 15.9
love 15.8
happy 15.7
shop 15.2
celebration 14.4
male 14.2
happiness 14.1
smiling 13.7
apartment 13.4
style 13.4
table 13.3
marriage 13.3
decor 13.3
fashion 12.8
women 12.7
new 12.1
window 12
married 11.5
cabinet 11.5
bouquet 11.3
two 11
elegance 10.9
glass 10.9
barbershop 10.8
husband 10.5
decoration 10.1
inside 10.1
holiday 10
smile 10
family 9.8
chair 9.6
design 9.6
wife 9.5
stove 9.5
architecture 9.4
domestic 9.4
wood 9.2
outfit 9
cheerful 8.9
gown 8.9
ceremony 8.7
flowers 8.7
lifestyle 8.7
pretty 8.4
floor 8.4
mercantile establishment 8.3
bedroom 8.3
fun 8.2
stylish 8.1
lady 8.1
looking 8
day 7.8
counter 7.8
standing 7.8
portrait 7.8
flower 7.7
estate 7.6
togetherness 7.6
traditional 7.5
tradition 7.4
light 7.4
food 7.3
suit 7.2
together 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

wall 99.5
indoor 98.1
wedding dress 89.3
text 88.5
dress 88.4
person 85.1
sink 81.9
bride 76.9
woman 68.8
clothing 67.1
mirror 65.6

Face analysis

AWS Rekognition

Age 4-12
Gender Male, 71.8%
Calm 93.1%
Sad 5.5%
Angry 0.6%
Confused 0.4%
Surprised 0.2%
Happy 0.2%
Fear 0.1%
Disgusted 0%

AWS Rekognition

Age 19-31
Gender Female, 93.1%
Fear 84%
Sad 4.2%
Angry 3.4%
Disgusted 2.4%
Surprised 1.8%
Confused 1.4%
Happy 1.4%
Calm 1.3%

AWS Rekognition

Age 23-35
Gender Female, 97.3%
Calm 79.2%
Sad 16.9%
Fear 2.1%
Surprised 0.4%
Confused 0.4%
Angry 0.4%
Disgusted 0.3%
Happy 0.2%

AWS Rekognition

Age 22-34
Gender Female, 97.3%
Calm 68.4%
Sad 13.5%
Surprised 11.1%
Fear 2%
Happy 1.8%
Confused 1.3%
Disgusted 1.1%
Angry 0.8%

AWS Rekognition

Age 23-35
Gender Male, 84.1%
Calm 28.4%
Angry 18.1%
Surprised 17.7%
Confused 15.9%
Happy 8.3%
Sad 7.3%
Fear 2.2%
Disgusted 2.1%
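Each face block above lists per-emotion confidences that sum to roughly 100%. A sketch of reducing such a block to its dominant emotion, using the second face in this record (Fear 84%); the dict layout mirrors this record's display, not the raw Rekognition response format:

```python
# Emotion -> confidence mapping for one detected face, values taken
# from the "Fear 84%" face block in this record (illustrative layout).
face = {
    "Fear": 84.0, "Sad": 4.2, "Angry": 3.4, "Disgusted": 2.4,
    "Surprised": 1.8, "Confused": 1.4, "Happy": 1.4, "Calm": 1.3,
}

def dominant_emotion(emotions):
    # Return the (emotion, confidence) pair with the highest confidence.
    return max(emotions.items(), key=lambda kv: kv[1])

print(dominant_emotion(face))  # ('Fear', 84.0)
```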

Microsoft Cognitive Services

Age 29
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.3%
Wedding Gown 65.4%

Captions

Microsoft

a person standing in front of a mirror posing for the camera 95.4%
a person standing in front of a mirror 95.3%
a man and a woman standing in front of a mirror 86.1%

Text analysis

Amazon

55%
55% P69
K
P69
55

Google

55
5%
55 5%