Human Generated Data

Title

Untitled (woman playing piano and singing)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4443

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Imagga
created on 2022-01-23

newspaper 41.4
business 33.4
person 32.3
adult 31.3
professional 30.4
product 29.2
man 26.9
businessman 26.5
people 26.2
office 25.1
corporate 24.9
male 24.2
laptop 23.7
creation 22.6
computer 21.6
job 21.2
businesswoman 20.9
work 20.4
indoors 19.3
happy 16.9
building 16.7
scholar 16.2
working 15.9
executive 15.2
worker 15.2
successful 14.6
indoor 14.6
looking 14.4
smile 14.3
attractive 14
manager 14
sax 13.7
suit 13.5
businesspeople 13.3
clothes 13.1
room 13
intellectual 12.9
success 12.9
clothing 12.8
casual 12.7
technology 12.6
career 12.3
adults 12.3
smiling 12.3
face 12.1
men 12
home 12
modern 11.9
portrait 11.6
wind instrument 11.4
notebook 11.3
pretty 11.2
20s 11
confident 10.9
communication 10.9
teacher 10.8
one 10.5
phone 10.1
handsome 9.8
education 9.5
meeting 9.4
lifestyle 9.4
musical instrument 9.4
daily 9.3
finance 9.3
horizontal 9.2
occupation 9.2
alone 9.1
full length 8.7
women 8.7
corporation 8.7
day 8.6
boss 8.6
ethnic 8.6
desk 8.5
black 8.4
company 8.4
holding 8.3
window 8.2
hair 7.9
oboe 7.9
table 7.8
color 7.8
employee 7.7
concentration 7.7
sitting 7.7
daytime 7.7
monitor 7.7
profession 7.7
fashion 7.5
one person 7.5
city 7.5
teamwork 7.4
focus 7.4
inside 7.4
group 7.3
team 7.2
information 7.1

Microsoft
created on 2022-01-23

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 57%
Calm 69%
Sad 26.6%
Confused 1.2%
Surprised 0.8%
Fear 0.8%
Angry 0.6%
Disgusted 0.6%
Happy 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Piano 71.3%

Captions

Microsoft

a person standing in front of a laptop 43.4%
a person sitting at a desk 43.3%
a person standing in front of a computer 43.2%

Text analysis

Amazon

17255
17255.
MAOOX

Google

5.
17
2005
17 2005 5. 11255. 11255.
11255.