Human Generated Data

Title

Untitled (two women sitting for portrait on bench in studio)

Date

c. 1940

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1573

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Apparel 99.4
Clothing 99.4
Human 98.5
Person 98.5
Person 98.3
Dress 97.2
Female 95.5
Face 90.4
Woman 84.8
People 72.5
Fashion 72.1
Girl 71.2
Gown 68.1
Portrait 67.9
Photo 67.9
Photography 67.9
Robe 61.5
Suit 61.4
Coat 61.4
Overcoat 61.4
Wedding 58
Door 57.5
Costume 56.7
Bridegroom 56.5

Imagga
created on 2021-12-14

negative 41.1
film 31.8
photographic paper 23.8
portrait 22
person 20.9
people 20.1
bride 19.9
adult 19.6
clothing 18.1
happiness 17.2
dress 17.2
fashion 16.6
photographic equipment 15.9
face 14.9
wedding 14.7
art 14.7
attractive 14.7
love 14.2
groom 13.8
pretty 13.3
happy 12.5
cute 12.2
fountain 12.1
model 11.7
posing 11.5
male 11.3
couple 11.3
human 11.2
blond 11.2
looking 11.2
black 10.8
man 10.7
lady 10.5
women 10.3
lifestyle 10.1
smile 10
one 9.7
shower cap 9.7
mother 9.6
hair 9.5
jacket 9.4
casual 9.3
head 9.2
elegance 9.2
gown 9
structure 8.8
costume 8.8
home 8.8
look 8.8
flowers 8.7
smiling 8.7
bouquet 8.5
cap 8.5
house 8.4
joy 8.4
alone 8.2
sculpture 8.2
romantic 8
family 8
statue 7.9
nurse 7.8
tenderness 7.8
color 7.8
bridal 7.8
eyes 7.7
window 7.7
hairstyle 7.6
skin 7.6
vintage 7.4
retro 7.4
sensuality 7.3
decoration 7.2
stylish 7.2
sexy 7.2

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.9
clothing 97
person 94.6
window 93.7
dress 88
woman 87.9
human face 84
smile 80.7
newspaper 77.3
old 76.5
wedding dress 74.6
black and white 71
bride 57.5
posing 42.2

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 69.2%
Sad 46.4%
Calm 32.1%
Happy 11.7%
Fear 4%
Angry 2.9%
Confused 1.2%
Surprised 1.1%
Disgusted 0.7%

AWS Rekognition

Age 34-50
Gender Male, 89%
Calm 69.6%
Happy 22.8%
Sad 3.6%
Angry 1.8%
Surprised 1.1%
Confused 0.9%
Disgusted 0.2%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.5%

Captions

Microsoft

an old photo of a man 92.6%
old photo of a man 91%
an old photo of a man in a newspaper 76.6%

Text analysis

Amazon

FILM
AGEANITRATE FILM
AGEANITRATE

Google

AGEA NITRATE
AGEA
NITRATE