Human Generated Data

Title

Untitled (two actresses in robes seated on stage at Hedgerow Theater, PA)

Date

c. 1938

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12003

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.5
Human 99.5
Person 98.7
Clothing 98.2
Apparel 98.2
Sitting 95.8
Furniture 85.1
Chair 74.8
Text 71.8
Portrait 64.1
Face 64.1
Photography 64.1
Photo 64.1
Female 62.8
Plant 62.8
Leisure Activities 57.3
Shorts 57.2
Painting 56.1
Art 56.1
Footwear 55.9
Shoe 55.8
Collage 55.5
Advertisement 55.5
Poster 55.5

Clarifai
created on 2023-10-25

people 99.9
woman 98.3
adult 98.1
sit 97.2
elderly 96.4
two 96.4
man 95.4
furniture 95
group 95
wear 94.3
three 93.9
gown (clothing) 91.7
child 90.6
monochrome 90.4
portrait 87.9
one 87.7
administration 87
four 86.6
art 86
chair 82.6

Imagga
created on 2022-01-15

musical instrument 38.1
wind instrument 29.6
sax 26.4
accordion 20.5
man 18.8
fashion 17.3
people 16.7
keyboard instrument 16.5
person 15.2
male 14.9
adult 14.2
stringed instrument 14.1
brass 13.8
bass 13.8
dress 13.5
old 13.2
art 13
face 12.8
device 12.6
portrait 12.3
black 12
bowed stringed instrument 11.8
hair 11.1
city 10.8
clothing 10.5
attractive 10.5
sexy 10.4
youth 10.2
model 10.1
interior 9.7
pretty 9.1
park 9.1
lady 8.9
brunette 8.7
men 8.6
musician 8.6
clothes 8.4
monument 8.4
violin 8.3
robe 8.3
style 8.2
building 7.9
couple 7.8
statue 7.8
architecture 7.8
modern 7.7
elegance 7.6
human 7.5
one 7.5
vintage 7.4
holding 7.4
historic 7.3
guitar 7.3
suit 7.2
looking 7.2
holiday 7.2

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 98.5
person 84.7
clothing 82
drawing 80.1
sketch 68.4
statue 60.5

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 29-39
Gender Male, 99.7%
Calm 50.5%
Surprised 45.2%
Angry 1.2%
Happy 1%
Fear 0.7%
Sad 0.6%
Disgusted 0.6%
Confused 0.1%

AWS Rekognition

Age 23-33
Gender Female, 67.1%
Calm 52.3%
Surprised 26.5%
Fear 5.8%
Happy 4.5%
Angry 4%
Sad 3.4%
Disgusted 2.7%
Confused 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Painting 56.1%