Human Generated Data

Title

Untitled (wedding reception)

Date

1958, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.115

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Chair 99.8
Furniture 99.8
Person 99.3
Human 99.3
Clothing 99.1
Apparel 99.1
Chair 99
Person 99
Person 99
Person 97.5
Accessories 89.1
Accessory 89.1
Jewelry 87.9
Robe 87.5
Fashion 87.5
Gown 87.2
Female 83
Evening Dress 81.7
Wedding 69.2
Woman 67.7
Wedding Gown 62.5
Tiara 59

Imagga
created on 2022-01-08

groom 51.7
bride 33
couple 29.6
wedding 28.5
person 27.8
people 27.3
dress 26.2
love 23.7
adult 22.7
portrait 22.6
happy 22.6
man 21.5
male 20.1
bouquet 18.9
married 18.2
two 17.8
happiness 17.2
family 16
home 16
attractive 15.4
marriage 14.2
lady 13.8
smiling 13.7
wife 13.3
interior 13.3
husband 12.9
fashion 12.8
black 12.8
women 12.6
romantic 12.5
smile 12.1
mother 12
room 12
looking 12
pretty 11.9
gown 11.8
bridal 11.7
world 11.5
clothing 11.5
together 11.4
flowers 11.3
men 11.2
wed 10.8
veil 10.8
romance 10.7
ceremony 10.7
human 10.5
office 10.4
sexy 10.4
celebration 10.4
sitting 10.3
future 10.2
lifestyle 10.1
elegance 10.1
professional 9.8
child 9.8
old 9.8
cheerful 9.8
window 9.4
face 9.2
indoor 9.1
life 8.9
indoors 8.8
brunette 8.7
day 8.6
senior 8.4
house 8.4
gorgeous 8.2
new 8.1
bow tie 8.1
business 7.9
nurse 7.6
pair 7.6
mature 7.4
holding 7.4
style 7.4
light 7.4
alone 7.3
girls 7.3
aged 7.2
lovely 7.1
garment 7.1
hairdresser 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

person 99.7
wedding dress 98.9
bride 98.2
dress 97.2
text 97
woman 94.6
wedding 90.9
standing 77.4
clothing 73.4
picture frame 9.2

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 99.9%
Happy 45.3%
Calm 20.4%
Disgusted 9.8%
Sad 8%
Angry 6.1%
Surprised 4.6%
Fear 3.1%
Confused 2.7%

AWS Rekognition

Age 38-46
Gender Female, 100%
Sad 95.1%
Happy 2.7%
Disgusted 0.6%
Calm 0.4%
Fear 0.4%
Angry 0.3%
Confused 0.3%
Surprised 0.2%

AWS Rekognition

Age 16-22
Gender Female, 83.7%
Angry 29.6%
Sad 29.4%
Fear 23%
Calm 9.3%
Confused 3.8%
Disgusted 2.1%
Surprised 1.4%
Happy 1.3%

Microsoft Cognitive Services

Age 37
Gender Female

Microsoft Cognitive Services

Age 45
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Chair 99.8%
Person 99.3%

Captions

Microsoft

a woman standing in front of a mirror posing for the camera 86.5%
a woman standing in front of a mirror 86.4%
a woman standing in front of a window 85.9%