Human Generated Data

Title

Untitled (woman in coveralls and hat)

Date

c. 1940

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1935

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 99
Apparel 99
Human 98
Person 98
Footwear 79.7
Shoe 79.7
Dress 77.4
Art 68.4
Female 66.1
Face 65.9
Coat 61.2
Fashion 61
Sleeve 58.1
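
Each machine-generated tag above pairs a label with a confidence score, and downstream use typically keeps only the high-confidence labels. A minimal sketch of such a filter, with the (label, confidence) pairs transcribed from the Amazon list above; the 90.0 cutoff is an arbitrary assumption, not a value used by the museum:

```python
# Confidence-threshold filter over machine-generated tags.
# Pairs transcribed from the Amazon label list above; the default
# threshold of 90.0 is an assumed cutoff for illustration only.
AMAZON_TAGS = [
    ("Clothing", 99), ("Apparel", 99), ("Human", 98), ("Person", 98),
    ("Footwear", 79.7), ("Shoe", 79.7), ("Dress", 77.4), ("Art", 68.4),
    ("Female", 66.1), ("Face", 65.9), ("Coat", 61.2), ("Fashion", 61),
    ("Sleeve", 58.1),
]

def confident_tags(tags, threshold=90.0):
    """Return labels whose confidence meets the threshold, highest first."""
    return [label for label, score in sorted(tags, key=lambda t: -t[1])
            if score >= threshold]

print(confident_tags(AMAZON_TAGS))  # ['Clothing', 'Apparel', 'Human', 'Person']
```

Lowering the threshold admits the weaker labels (Footwear, Shoe, Dress, ...) in descending order of score.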

Imagga
created on 2021-12-14

dress 49.7
negative 37.4
fashion 36.9
portrait 30.4
film 30.3
bride 25.1
model 24.9
face 24.9
person 24.8
attractive 24.5
photographic paper 23.4
posing 23.1
people 22.9
adult 22
elegance 21.8
pretty 21.7
lady 21.1
hair 20.6
clothing 19.8
wedding 19.3
sexy 19.3
sensuality 18.2
statue 16.4
costume 16
studio 16
love 15.8
photographic equipment 15.6
make 15.4
cute 15.1
bag 15
boutique 14.7
veil 14.7
fountain 14.6
plastic bag 14.4
happy 14.4
celebration 14.3
satin 14.3
art 14.2
style 14.1
makeup 13.7
elegant 13.7
hairstyle 13.3
gorgeous 12.7
stylish 12.7
bridal 12.6
structure 12.4
marriage 12.3
gown 11.7
vogue 11.6
lovely 11.6
blond 11.3
container 11.2
sensual 10.9
married 10.5
human 10.5
brunette 10.5
body 10.4
clothes 10.3
smiling 10.1
sculpture 10.1
head 10.1
figure 10
smile 10
romantic 9.8
women 9.5
fashionable 9.5
happiness 9.4
culture 9.4
expression 9.4
joy 9.2
traditional 9.1
cheerful 8.9
detail 8.8
look 8.8
ceremony 8.7
engagement 8.7
decoration 8.7
silk 8.5
feminine 8.4
color 8.3
religion 8.1
flowers 7.8
eyes 7.7
luxury 7.7
holding 7.4
black 7.4
fantasy 7.2
holiday 7.2
bright 7.1
romance 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 98.9
sketch 98.4
drawing 98.1
painting 95.7
child art 85.7
art 78.1
old 71.5
black and white 65.3
posing 42.2

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 69.5%
Happy 44.9%
Calm 43.9%
Sad 4.4%
Surprised 1.9%
Angry 1.6%
Fear 1.6%
Confused 1.1%
Disgusted 0.6%
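
The Rekognition emotion scores above are a distribution rather than a single verdict, and the near-tie between Happy (44.9%) and Calm (43.9%) shows why reading off only the top score can mislead. A minimal sketch of a tie-aware pick, with the values transcribed from the list above; the 5.0-point margin is an assumed parameter for illustration:

```python
# Dominant-emotion pick over the AWS Rekognition scores above.
# Values transcribed from the face-analysis list; the `margin`
# parameter is an assumption, not part of the Rekognition API.
EMOTIONS = {
    "Happy": 44.9, "Calm": 43.9, "Sad": 4.4, "Surprised": 1.9,
    "Angry": 1.6, "Fear": 1.6, "Confused": 1.1, "Disgusted": 0.6,
}

def dominant_emotion(scores, margin=5.0):
    """Return the top emotion, or None when the runner-up is within `margin`."""
    top, second = sorted(scores.items(), key=lambda kv: -kv[1])[:2]
    return top[0] if top[1] - second[1] >= margin else None

print(dominant_emotion(EMOTIONS))  # None: Happy leads Calm by only 1.0 point
```

For this image the function declines to name a dominant emotion, which matches the ambiguity visible in the raw scores.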

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98%
Shoe 79.7%

Captions

Microsoft

a vintage photo of a person 81.6%
a vintage photo of a person 78.9%
a vintage photo of a person wearing a dress 72.9%