Human Generated Data

Title

Untitled (men in sleeping compartment, Harvard Hasty Pudding Club on train to Philadelphia)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4633

Copyright

© Estate of Joseph Janney Steinmetz


Machine Generated Data

Tags

Amazon
created on 2021-12-14

Apparel 100
Clothing 100
Person 99
Human 99
Person 98.9
Robe 93.7
Fashion 93.7
Gown 93.1
Wedding 90
Female 85.5
Bride 80.1
Wedding Gown 80.1
Face 74.5
Woman 69.4
Bridegroom 68.6
Indoors 64.3
Blossom 61.2
Flower 61.2
Plant 61.2
Portrait 60.6
Photography 60.6
Photo 60.6
Curtain 60.4
Person 57.8
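
The Amazon tags above are label/confidence pairs of the kind returned by an image-labeling service such as Amazon Rekognition's DetectLabels. As a minimal sketch (the dictionary shape is an assumption loosely modeled on that API's response, and the sample values are copied from the list above), such a list can be filtered down to high-confidence labels like so:

```python
# Sample label/confidence pairs, taken from the Amazon tag list above.
# The dict shape is an assumption modeled on Rekognition-style output.
labels = [
    {"Name": "Apparel", "Confidence": 100.0},
    {"Name": "Clothing", "Confidence": 100.0},
    {"Name": "Person", "Confidence": 99.0},
    {"Name": "Robe", "Confidence": 93.7},
    {"Name": "Curtain", "Confidence": 60.4},
]

def filter_labels(labels, threshold=90.0):
    """Keep only the names of labels at or above the confidence threshold."""
    return [l["Name"] for l in labels if l["Confidence"] >= threshold]

print(filter_labels(labels))  # → ['Apparel', 'Clothing', 'Person', 'Robe']
```

Note that the tags in this record are exactly such raw confidence scores; a consumer of the data typically applies a threshold like this before display.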

Imagga
created on 2021-12-14

shower curtain 100
curtain 100
furnishing 97.2
blind 96.1
protective covering 65.6
covering 34.2
bride 27.8
adult 23.9
wedding 23
clothing 21.7
dress 19.9
home 18.3
happiness 18
clothes 17.8
person 17.3
people 17.3
indoors 16.7
veil 15.7
fashion 15.1
portrait 14.9
attractive 14.7
happy 14.4
elegance 14.3
love 14.2
hand 13.7
interior 13.3
room 13.1
face 12.8
gown 12.7
pretty 12.6
bridal 11.7
smiling 11.6
business 11.5
smile 11.4
marriage 11.4
house 10.9
lifestyle 10.8
married 10.5
human 10.5
luxury 10.3
window 10.1
celebration 9.6
women 9.5
wife 9.5
day 9.4
indoor 9.1
modern 9.1
holding 9.1
cheerful 8.9
standing 8.7
life 8.6
men 8.6
male 8.5
professional 8.4
joy 8.4
style 8.2
sexy 8
medical 7.9
cute 7.9
model 7.8
silk 7.7
bouquet 7.5
clean 7.5
child 7.5
one 7.5
emotion 7.4
light 7.4
design 7.3
decoration 7.2
shower 7.2
bright 7.1
decor 7.1
work 7.1
businessman 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

wedding dress 99.3
text 98.9
bride 98.1
wedding 79.6
clothing 79.1
black and white 77.9
human face 74.1
curtain 71.3
person 69.3
dress 63.9
smile 56.5
posing 37.8

Face analysis

AWS Rekognition

Age 36-54
Gender Male, 61.8%
Sad 83.9%
Happy 5.8%
Calm 5.7%
Fear 1.6%
Surprised 1.5%
Confused 1%
Disgusted 0.3%
Angry 0.2%

AWS Rekognition

Age 42-60
Gender Female, 81.6%
Happy 84.3%
Calm 13.2%
Sad 1.8%
Surprised 0.2%
Angry 0.2%
Confused 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 21-33
Gender Female, 65.2%
Sad 68.9%
Happy 15.8%
Calm 13.4%
Angry 0.8%
Confused 0.5%
Fear 0.3%
Surprised 0.2%
Disgusted 0.1%
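
Each AWS Rekognition face result above reports a full distribution of emotion confidences; the headline emotion is simply the highest-scoring entry. A minimal sketch (the scores below are copied from the first face result; the function name is illustrative, not part of any API):

```python
# Emotion scores copied from the first AWS Rekognition face result above.
emotions = {
    "Sad": 83.9, "Happy": 5.8, "Calm": 5.7, "Fear": 1.6,
    "Surprised": 1.5, "Confused": 1.0, "Disgusted": 0.3, "Angry": 0.2,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))  # → Sad
```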

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%

Captions

Microsoft

a person standing in front of a mirror posing for the camera 49%
a person standing in front of a mirror posing for the camera 33%
a person standing in front of a window 32.9%

Text analysis

Amazon

3248
3248 YT37AS A3013930
.8+
YT37AS
٢٤/5١+
-8"
٢٤/5١+ 39HH - AJIHROT hiast.,HH
A3013930
39HH - AJIHROT
hiast.,HH

Google

YT3342
32A8 YT3342
32A8