Human Generated Data

Title

Untitled (woman playing piano)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4458

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 97.8
Human 97.8
Clothing 74.5
Apparel 74.5
Female 72.5
Photography 68.7
Photo 68.7
Face 67.3
Portrait 67.3
Girl 65.9
Suit 62.8
Overcoat 62.8
Coat 62.8
People 60.1
Furniture 59.7
Glass 57.2
Person 56.3
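
The labels above are the shape of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of a comparable call with boto3; the image filename, region, and thresholds are assumptions, not part of the record:

    import boto3

    # Assumed local scan of the photograph; not part of the museum record.
    IMAGE_PATH = "steinmetz_untitled_1941.jpg"

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=55,  # lowest score in the list above is 56.3
        )

    # Print "Label confidence" pairs in the same shape as the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")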

Clarifai
created on 2023-10-26

people 99.5
adult 98.8
monochrome 97.7
group 95.8
man 95.3
woman 94.9
wear 94.3
two 93.8
musician 93.8
music 93.7
outfit 90.8
furniture 90.7
one 89.7
administration 87.5
chair 87
illustration 84.7
indoors 83.7
singer 82.9
three 82.6
actress 82.4
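
The Clarifai concepts above correspond to a prediction from a general image-recognition model over Clarifai's v2 REST API. The sketch below uses requests; the endpoint path, model name, and key placeholder follow Clarifai's public documentation and should be treated as assumptions:

    import base64
    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder credential
    MODEL_ID = "general-image-recognition"   # assumed public general model
    URL = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"

    with open("steinmetz_untitled_1941.jpg", "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()

    payload = {"inputs": [{"data": {"image": {"base64": image_b64}}}]}
    resp = requests.post(URL, json=payload,
                         headers={"Authorization": f"Key {API_KEY}"})
    resp.raise_for_status()

    # Concept values are 0-1; scale to match the percentages listed above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")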

Imagga
created on 2022-01-23

negative 28.6
film 24.6
architecture 23.5
history 17
photographic paper 16.8
building 16.1
house 13.4
old 13.2
structure 12.2
city 11.6
photographic equipment 11.3
art 11.3
people 11.1
historic 11
symbol 10.8
design 10.7
glass 10.7
business 10.3
menorah 9.9
equipment 9.6
laboratory 9.6
urban 9.6
construction 9.4
historical 9.4
water 9.3
monument 9.3
sculpture 9.2
statue 9.1
religion 9
shop 8.9
medicine 8.8
lab 8.7
ancient 8.6
drawing 8.6
finance 8.4
silhouette 8.3
man 8.3
bakery 8
light 8
science 8
candelabrum 7.9
scientist 7.8
liquid 7.8
scientific 7.7
chemistry 7.7
men 7.7
chemical 7.7
life 7.6
research 7.6
hand 7.6
biology 7.6
cityscape 7.6
office 7.5
vintage 7.4
tourism 7.4
mercantile establishment 7.4
landmark 7.2
home 7.2
work 7.2
tower 7.2
male 7.1
marble 7.1
medical 7.1
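
The Imagga tags above are the kind of response returned by Imagga's /v2/tags endpoint, which authenticates with an API key and secret over HTTP basic auth. A sketch under those assumptions; the image URL and credentials are placeholders:

    import requests

    API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder credential
    API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder credential
    IMAGE_URL = "https://example.org/steinmetz_untitled_1941.jpg"  # placeholder

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()

    for tag in resp.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")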

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 96.8
black and white 89
person 86.5
clothing 66.2
drawing 62.8
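
The Microsoft tags above resemble the tag output of Azure's Computer Vision image analysis. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint and key are placeholders, not values from the record:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
    KEY = "YOUR_AZURE_KEY"                                             # placeholder

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    with open("steinmetz_untitled_1941.jpg", "rb") as f:
        analysis = client.analyze_image_in_stream(
            f, visual_features=[VisualFeatureTypes.tags]
        )

    # Tag confidences are 0-1; scale to match the list above.
    for tag in analysis.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")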

Color Analysis

Face analysis

AWS Rekognition

Age 40-48
Gender Female, 69.4%
Calm 88.5%
Sad 4.4%
Surprised 3.1%
Confused 1.2%
Happy 1.1%
Angry 0.7%
Disgusted 0.5%
Fear 0.5%
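
The age range, gender, and emotion scores above are the standard fields of Rekognition's DetectFaces response when all facial attributes are requested. A minimal sketch (image filename assumed):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_untitled_1941.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")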

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
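
Google Vision reports likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages; these come from the Cloud Vision face detection feature. A sketch with the google-cloud-vision client (image filename assumed, credentials taken from the environment):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_untitled_1941.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)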

Feature analysis

Amazon

Person 97.8%

Categories

Imagga

paintings art 94.4%
interior objects 5.5%

Text analysis

Amazon

19242.
17242.
BC
HAUGH
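
The strings above are OCR output of the kind Rekognition's DetectText operation returns for text visible in the photograph. A minimal sketch (image filename assumed):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_untitled_1941.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Report whole detected lines; Rekognition also returns individual WORD items.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])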

Google

19242•
19242•
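
The Google strings are the analogous output of Cloud Vision's text detection feature; a sketch under the same assumed image filename:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_untitled_1941.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # text_annotations[0] holds the full detected block; the rest are tokens.
    for annotation in response.text_annotations[1:]:
        print(annotation.description)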