Human Generated Data

Title

Untitled (women from circus in train car)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8671

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.1
Human 98.1
Person 97.9
Furniture 97.3
Person 94.3
Bed 79.6
Chair 73.9
Apparel 66.6
Clothing 66.6
Face 62.7
Photography 61.3
Photo 61.3
Portrait 61.3
Indoors 60.7
Room 60.7
Person 60.2
Finger 59.9
Woman 59.5
Blonde 59.5
Girl 59.5
Child 59.5
Teen 59.5
Kid 59.5
Female 59.5
Bedroom 56.5
Acrobatic 55.6
Underwear 55.2

Imagga
created on 2022-01-09

groom 31.1
man 22.8
person 21.3
people 21.2
portrait 18.8
bride 18.3
adult 18.2
shower curtain 18
love 17.4
male 17
face 16.3
wedding 15.6
curtain 15.4
sexy 14.4
body 14.4
couple 13.1
lifestyle 13
human 12.7
dress 12.6
black 12.6
pretty 12.6
happy 11.3
furnishing 11.2
women 11.1
blind 10.8
married 10.5
hair 10.3
strength 10.3
two 10.2
smile 10
bridal 9.7
training 9.2
attractive 9.1
sensuality 9.1
veil 8.8
indoors 8.8
exercising 8.7
eyes 8.6
life 8.6
health 8.3
fashion 8.3
device 8.3
looking 8
smiling 8
protective covering 7.9
happiness 7.8
confidence 7.7
hospital 7.5
cheerful 7.3
lady 7.3
home 7.2
gown 7.2
clothing 7.1
hairdresser 7.1

Google
created on 2022-01-09

Photograph 94.3
Arm 94.3
White 92.2
Muscle 91.7
Black 89.8
Human 89.2
Black-and-white 86.2
Gesture 85.3
Style 84.2
T-shirt 78.3
Monochrome 76.6
Monochrome photography 76.3
Snapshot 74.3
Art 71.9
Elbow 71.5
Hat 70.8
Font 70.3
Shorts 69.4
Room 66.9
Stock photography 66.7

Microsoft
created on 2022-01-09

text 99.6
black and white 84.7
human face 74.8
clothing 69.1
person 68.7

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 99.7%
Happy 54.2%
Surprised 43.7%
Confused 0.9%
Calm 0.7%
Disgusted 0.2%
Angry 0.1%
Sad 0.1%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Male, 96.4%
Happy 88.6%
Surprised 10.4%
Calm 0.5%
Angry 0.1%
Sad 0.1%
Confused 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 29-39
Gender Female, 95.6%
Happy 92.6%
Surprised 6.1%
Fear 0.5%
Calm 0.2%
Angry 0.2%
Disgusted 0.2%
Confused 0.1%
Sad 0.1%

AWS Rekognition

Age 35-43
Gender Female, 70.3%
Happy 89.9%
Sad 7.6%
Calm 1.2%
Surprised 0.5%
Disgusted 0.3%
Angry 0.3%
Fear 0.2%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.1%

Captions

Microsoft

text 99.7%

Text analysis

Amazon

SIESTA
KEY,
22900
STEINMETZ,
SARASOTA,
J.J.
22900 J.J. STEINMETZ, SIESTA KEY, SARASOTA, FLA.
FLA.

Google

22900
J.
SIESTA
KEY,
22900 J. J. STEINMETZ, SIESTA KEY, SARASOTA,FLA.
STEINMETZ,
SARASOTA,FLA.