Human Generated Data

Title

Untitled (woman and child at piano)

Date

1941

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4438

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 98.8
Human 98.8
Person 97.4
Clothing 90.7
Apparel 90.7
Furniture 79
Face 72
People 70.1
Electronics 66.9
Screen 66.9
LCD Screen 66.9
Monitor 66.9
Display 66.9
Chair 65.6
Sitting 62.8
Female 58.7
Girl 58.3
Studio 57.5
Hair 56.3
Shorts 56.2

Clarifai
created on 2023-10-26

people 99.7
adult 97.2
two 97
woman 96.6
group 95.7
sit 95.7
child 95.5
man 95.1
monochrome 92.7
indoors 92.5
furniture 92.1
music 90.5
room 88.3
family 86.2
three 85.8
chair 84.7
one 84.1
musician 83.4
facial expression 82.5
offspring 82.2

Imagga
created on 2022-01-23

negative 41.9
film 33.1
people 28.4
adult 28.1
photographic paper 25.6
male 22.1
man 21.5
person 19.9
portrait 18.1
indoors 17.6
photographic equipment 17.1
sculpture 14.8
happy 14.4
home 14.4
room 14
toilet 13.8
lifestyle 13.7
face 13.5
smiling 13
men 12.9
bride 12.5
holding 12.4
smile 12.1
looking 12
dress 11.7
toilet tissue 11.6
old 11.1
women 11.1
wedding 11
casual 11
office 11
indoor 11
marble 10.9
house 10.9
professional 10.7
working 10.6
businessman 10.6
worker 10.5
monument 10.3
happiness 10.2
architecture 10.2
building 10.1
clothing 10
statue 9.9
job 9.7
interior 9.7
health 9.7
sitting 9.4
work 9.4
tissue 9.3
business 9.1
pretty 9.1
chair 9
history 8.9
seat 8.9
gown 8.8
celebration 8.8
love 8.7
art 8.6
mature 8.4
color 8.3
child 8.3
fashion 8.3
column 8.2
cheerful 8.1
family 8
day 7.8
modern 7.7
married 7.7
two 7.6
bouquet 7.5
human 7.5
traditional 7.5
one 7.5
camera 7.4
groom 7.4
historic 7.3
20s 7.3
alone 7.3
religion 7.2
bright 7.1
furniture 7.1
paper 7.1
medical 7.1
antique 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

Color Analysis

Face analysis

AWS Rekognition

Age 38-46
Gender Female, 83.6%
Calm 79.4%
Angry 10.4%
Happy 7.1%
Sad 1.4%
Confused 0.5%
Disgusted 0.5%
Fear 0.4%
Surprised 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

19250.
17250.
BE
MANTRAT
AAGGN YT37AC MANTRAT
AAGGN
YT37AC

Google

19250.
I
17250
19250. zしI 17250
z