Human Generated Data

Title

Untitled (woman playing piano, Heinz Ocean Pier)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8379

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label followed by model confidence, 0–100)

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Clothing 86.3
Apparel 86.3
Accessories 78.5
Accessory 78.5
Female 77.7
Indoors 68.6
Person 68.1
Jewelry 67.1
Portrait 63.7
Face 63.7
Photo 63.7
Photography 63.7
Girl 62.5
Text 58.4
Flooring 56.6
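
The Amazon tags above are the kind of label/confidence pairs returned by the AWS Rekognition DetectLabels API. A minimal sketch of how such a list could be reproduced with boto3 follows; the file name and region are placeholders, not part of the museum record.

    import boto3

    # Rekognition client; the region is a placeholder for this sketch.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Read the photograph as raw bytes (hypothetical file name).
    with open("steinmetz_heinz_pier.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with confidence scores on a 0-100
    # scale, matching the "Person 99.6", "Clothing 86.3" style above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=55,
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')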

Clarifai
created on 2023-10-25

people 99.9
one 99.2
woman 98.7
adult 98.5
indoors 95.2
monochrome 95
music 94.4
room 94.1
man 93.7
piano 92.1
art 91.6
dressing room 91.4
wear 91.2
sit 90.6
portrait 90
furniture 88.5
veil 87.6
two 86.4
actress 85.1
mirror 83

Imagga
created on 2022-01-09

person 37.7
adult 30.7
man 30.2
people 27.9
professional 27.7
work 24.3
male 24.3
office 24
home 22.3
indoors 20.2
business 19.4
smiling 18.8
computer 18.1
lifestyle 18.1
job 17.7
working 17.7
smile 15.7
newspaper 15.7
worker 15.6
desk 15.1
happy 15
sitting 14.6
laptop 14.6
table 13.8
men 13.7
businesswoman 13.6
casual 13.6
one 13.4
room 13.1
corporate 12.9
indoor 12.8
negative 12.4
interior 12.4
businessman 12.4
portrait 12.3
looking 12
pretty 11.9
instrument 11.8
kitchen 11.6
teacher 11.5
house 10.9
cheerful 10.6
black 10.6
attractive 10.5
scholar 10.2
guitar 10.2
occupation 10.1
alone 10
holding 9.9
product 9.9
film 9.8
human 9.7
lady 9.7
clothing 9.7
educator 9.6
standing 9.6
happiness 9.4
executive 9.4
manager 9.3
face 9.2
phone 9.2
success 8.9
monitor 8.6
one person 8.5
modern 8.4
horizontal 8.4
color 8.3
life 8.3
intellectual 8.2
medical 7.9
women 7.9
couple 7.8
creation 7.6
two 7.6
talking 7.6
hand 7.6
photographic paper 7.6
communication 7.6
fashion 7.5
guy 7.4
confident 7.3
sexy 7.2
music 7.2
hair 7.1
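
The Imagga list above follows the same label/confidence pattern. A hedged sketch using Imagga's documented v2 tagging endpoint is shown below; the credentials and image URL are placeholders, and the response layout (result.tags with an English tag name and a confidence score) is assumed from Imagga's public documentation.

    import requests

    # Placeholders: Imagga credentials and a publicly reachable image URL.
    API_KEY = "YOUR_IMAGGA_API_KEY"
    API_SECRET = "YOUR_IMAGGA_API_SECRET"
    IMAGE_URL = "https://example.org/steinmetz_heinz_pier.jpg"

    # The v2 /tags endpoint returns tags with 0-100 confidence scores,
    # which is the shape of the Imagga list above.
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
        timeout=30,
    )
    response.raise_for_status()

    for item in response.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')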

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.7
person 99
wall 96.8
indoor 94.2
black and white 90.9
standing 76.8
clothing 75.9
posing 69.5
piano 62.8
book 60.2
monochrome 54.8
old 42
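
The Microsoft tags above correspond to the Azure Computer Vision image-tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders. The SDK reports confidence on a 0-1 scale, so it is multiplied by 100 here to match the percentages listed above.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholders: Azure Computer Vision resource endpoint and key.
    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com/"
    KEY = "YOUR_AZURE_KEY"
    IMAGE_URL = "https://example.org/steinmetz_heinz_pier.jpg"

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # tag_image returns tag names with confidence in the range 0-1.
    result = client.tag_image(IMAGE_URL)
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")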

Color Analysis

Face analysis

AWS Rekognition

Age 35-43
Gender Female, 80.4%
Surprised 88.2%
Happy 9%
Calm 1.7%
Fear 0.3%
Disgusted 0.3%
Confused 0.2%
Sad 0.2%
Angry 0.1%

AWS Rekognition

Age 43-51
Gender Male, 61.5%
Sad 46.9%
Disgusted 28.2%
Surprised 10.9%
Fear 4.8%
Angry 4.1%
Happy 2.7%
Calm 2%
Confused 0.6%

AWS Rekognition

Age 40-48
Gender Female, 91.2%
Sad 93%
Happy 2.4%
Confused 2%
Calm 1.2%
Fear 0.5%
Angry 0.3%
Disgusted 0.3%
Surprised 0.2%

AWS Rekognition

Age 36-44
Gender Female, 53.6%
Calm 56.4%
Surprised 23.3%
Angry 14.3%
Sad 2.2%
Happy 1.5%
Confused 1.1%
Disgusted 0.7%
Fear 0.6%

AWS Rekognition

Age 19-27
Gender Female, 82.4%
Calm 89.5%
Fear 3.6%
Sad 3.4%
Happy 1.7%
Confused 0.7%
Angry 0.4%
Disgusted 0.4%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
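
The per-face estimates above (age range, gender, ranked emotions) match the fields returned by the AWS Rekognition DetectFaces API when all facial attributes are requested. A minimal boto3 sketch follows; the file name and region are placeholders.

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Read the photograph as raw bytes (hypothetical file name).
    with open("steinmetz_heinz_pier.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] adds age range, gender, and emotion estimates,
    # the fields reported per face in the AWS Rekognition blocks above.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')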

Feature analysis

Amazon

Person 99.6%

Text analysis

Amazon

17254.
19254.
MAGON

Google

17254. YT37A8-AAMTPA 19254. 17254.
17254.
YT37A8-AAMTPA
19254.
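
The strings above are machine-read text from the photograph itself. For the Amazon results, the AWS Rekognition DetectText API returns detections of this kind; a minimal boto3 sketch follows, with the file name and region again as placeholders.

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Read the photograph as raw bytes (hypothetical file name).
    with open("steinmetz_heinz_pier.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectText returns LINE and WORD detections with confidence scores;
    # printing only LINE entries gives strings like "17254." and "MAGON".
    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')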