Human Generated Data

Title

Untitled (woman sitting at piano)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19514

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clarifai
created on 2023-10-23

people 99.9
music 98.9
piano 98.8
monochrome 98.4
adult 98.2
one 98
street 97.7
man 97.7
musician 96.6
art 95.6
woman 95.1
instrument 92.7
jazz 92.6
two 92.1
portrait 91.8
wear 89.7
indoors 88.9
shadow 84.9
light 83.9
black and white 83.8

Imagga
created on 2022-03-05

upright 100
piano 100
stringed instrument 88.8
keyboard instrument 88.4
percussion instrument 83.8
musical instrument 73.2
grand piano 20.2
music 18.2
man 18.1
people 16.7
adult 16.2
person 14.5
chair 13.7
male 13.5
black 13.2
playing 12.8
old 12.5
interior 12.4
industrial 11.8
working 11.5
attractive 11.2
building 11.2
inside 11
worker 10.7
light 10.7
job 10.6
musical 10.5
metal 10.5
keyboard 10.3
business 10.3
work 10.3
industry 10.2
fashion 9.8
indoors 9.7
instrument 9.6
education 9.5
play 9.5
men 9.4
sound 9.4
city 9.1
hand 9.1
art 9.1
one 9
wind instrument 8.9
classical 8.6
sitting 8.6
child 8.3
safety 8.3
free-reed instrument 8.2
seat 8.2
style 8.2
religion 8.1
musician 7.9
urban 7.9
modern 7.7
skill 7.7
culture 7.7
repair 7.7
dark 7.5
human 7.5
vintage 7.4
holding 7.4
technology 7.4
room 7.3
laptop 7.3
protection 7.3
lifestyle 7.2
sexy 7.2
computer 7.2
suit 7.2
smile 7.1
portrait 7.1
steel 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 98.5
black and white 97.2
person 92.8
street 92.8
monochrome 89.5
clothing 65
piano 59.8

Color Analysis

Feature analysis

Amazon

Person 98.7%
Hat 97%

Captions

Text analysis

Amazon

3
EB
11.30
EB BASE
11.30 43
43
BASE
We
We in
in

Google

YJUT.39AAH2.H.d. .E# EB 2VEE
YJUT.39AAH2.H.d.
.E#
EB
2VEE