Human Generated Data

Title

Untitled (four women standing near mirror at bottom of staircase)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5315

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Clothing 99.5
Apparel 99.5
Human 99.1
Person 99.1
Person 98.7
Person 97.9
Person 97.5
Female 90.8
Dress 88.5
Woman 79.2
Gown 78.4
Fashion 78.4
Evening Dress 78.4
Robe 78.4
People 76.8
Bridesmaid 74.8
Wedding 74.8
Person 62.1
Drawing 59
Art 59
Girl 58.6
Wedding Gown 57.2
Sketch 56.9
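
The label/score pairs above are the kind of output returned by Amazon Rekognition's label detection. A minimal sketch of how such tags could be generated follows; the boto3 client, filename, and thresholds are assumptions for illustration, not values taken from the original record.

```python
# Minimal sketch, assuming configured AWS credentials; the filename is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_5315.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

# detect_labels returns label names with confidence scores,
# comparable to the list above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```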

Imagga
created on 2022-01-22

boutique 34.1
people 24.5
fashion 20.3
negative 19.5
dress 19
art 18.7
film 17.5
nurse 16.2
clothing 15.9
statue 15.9
person 15.8
male 14.9
attractive 14
adult 13.7
man 13.4
sculpture 13
human 12.7
bride 12.4
lady 12.2
photographic paper 12
style 11.9
old 11.8
model 11.7
portrait 11.6
face 11.4
design 11.2
women 10.3
business 9.7
body 9.6
couple 9.6
holding 9.1
black 9
family 8.9
happiness 8.6
shopping 8.6
smile 8.5
monument 8.4
elegance 8.4
city 8.3
wedding 8.3
tourism 8.2
group 8.1
history 8
detail 8
photographic equipment 8
posing 8
groom 7.9
hair 7.9
flowers 7.8
men 7.7
pretty 7.7
bouquet 7.5
figure 7.5
symbol 7.4
outfit 7.4
pose 7.2
color 7.2
sexy 7.2
marble 7.2
looking 7.2
celebration 7.2
religion 7.2
love 7.1
antique 7.1
paper 7.1
travel 7
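
The Imagga tags above follow the shape of Imagga's v2 tagging API. A hedged sketch using its REST endpoint is below; the API key, secret, and image URL are placeholders, and the response structure is assumed from the public v2 documentation.

```python
# Minimal sketch, assuming an Imagga API key/secret and a publicly
# reachable image URL; all three values are placeholders.
import requests

API_KEY = "<imagga_api_key>"
API_SECRET = "<imagga_api_secret>"
IMAGE_URL = "https://example.org/steinmetz_5315.jpg"  # hypothetical

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth with key/secret
)
response.raise_for_status()

# Each entry carries an English tag name and a confidence score,
# comparable to the list above.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```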

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

dress 94.5
text 93.6
clothing 85.2
window 84.2
woman 80.1
sketch 78
person 67.8
white 60.1
posing 54
clothes 30
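
These Microsoft tags resemble the output of the Azure Computer Vision tagging operation. A sketch with the Python SDK is below; the endpoint, key, and filename are placeholders, not values from the record.

```python
# Minimal sketch, assuming the azure-cognitiveservices-vision-computervision
# package; endpoint, key, and filename are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource-name>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("steinmetz_5315.jpg", "rb") as f:  # hypothetical local copy
    result = client.tag_image_in_stream(f)

# Tag names with confidence scores, comparable to the list above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```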

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 99.3%
Calm 77.1%
Disgusted 7.3%
Happy 4.4%
Confused 3.9%
Sad 2.7%
Angry 2.1%
Surprised 2%
Fear 0.5%
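
The age range, gender estimate, and emotion percentages above match the fields returned by Amazon Rekognition's face detection. A self-contained sketch follows; the filename is hypothetical.

```python
# Minimal sketch, assuming configured AWS credentials; the filename is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_5315.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive as type/confidence pairs, as in the percentages above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```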

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
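
The three blocks above report per-face likelihoods on Google Vision's scale (VERY_UNLIKELY to VERY_LIKELY), one block per detected face. A sketch with the google-cloud-vision client is below; credentials and filename are assumptions.

```python
# Minimal sketch, assuming google-cloud-vision (v2+) and application
# default credentials; the filename is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_5315.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# One annotation per detected face; likelihood names correspond to the
# "Very unlikely" ... "Possible" values listed above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```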

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a group of people posing for a photo 89.8%
a group of people posing for the camera 89.7%
a group of people posing for a picture 89.6%
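
The three candidate captions with confidence values resemble the output of Azure Computer Vision's describe operation. A sketch is below; endpoint, key, and filename are placeholders.

```python
# Minimal sketch; endpoint, key, and filename are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource-name>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("steinmetz_5315.jpg", "rb") as f:  # hypothetical local copy
    description = client.describe_image_in_stream(f, max_candidates=3)

# Candidate captions with confidence, comparable to the three lines above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```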

Text analysis

Amazon

6507

Google

6509
6509
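
The detected strings above are typical OCR output: Amazon Rekognition returns line- and word-level detections, while Google Vision's text detection typically returns the full text block first and then each word separately, which is why the same number can appear twice. A combined sketch follows; the filename and credentials are assumptions.

```python
# Minimal sketch, assuming AWS and Google credentials; the filename is hypothetical.
import boto3
from google.cloud import vision

with open("steinmetz_5315.jpg", "rb") as f:
    image_bytes = f.read()

# Amazon Rekognition text detection: print line-level detections only.
rekognition = boto3.client("rekognition")
for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    if detection["Type"] == "LINE":
        print("Amazon:", detection["DetectedText"])

# Google Cloud Vision OCR: the first annotation is the full detected text,
# followed by one entry per word.
vision_client = vision.ImageAnnotatorClient()
response = vision_client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print("Google:", annotation.description)
```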