Human Generated Data

Title

Untitled (man seated at piano in front of audience of men)

Date

1938

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8229

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.2
Human 99.2
Person 98.9
Person 98.7
Person 98.4
Person 98.1
Person 94.7
Person 92.4
Person 91.3
Person 87.7
Person 80.8
People 74.9
Clothing 66.0
Apparel 66.0
Person 65.8
Crowd 60.3
Suit 57.6
Coat 57.6
Overcoat 57.6
Person 48.7

Clarifai
created on 2023-10-25

people 99.8
group 97.7
woman 97.1
many 96.5
adult 96.4
education 95.4
man 95.3
group together 92.8
child 92.4
furniture 92.2
room 92.1
music 90.9
indoors 90.1
administration 90
war 87.2
boy 86.6
school 85.9
desk 85.4
recreation 85.3
musician 83.9

Imagga
created on 2022-01-08

room 28.1
classroom 26.9
people 26.7
man 25.7
blackboard 24.8
person 23.5
male 21.3
home 20.7
computer 19.7
business 17.6
indoors 17.6
office 17.3
lifestyle 16.6
newspaper 15.7
adult 15.2
education 14.7
old 14.6
sitting 14.6
laptop 14.4
smiling 13.7
indoor 13.7
women 13.4
happy 13.1
men 12.9
student 12.3
senior 12.2
work 11.8
desk 11.6
interior 11.5
table 11.5
couple 11.3
product 10.8
center 10.6
class 10.6
one 10.4
teacher 10.4
black 10.2
casual 10.2
horizontal 10
working 9.7
businessman 9.7
group 9.7
together 9.6
house 9.2
cheerful 8.9
creation 8.5
communication 8.4
mature 8.4
vintage 8.3
board 8.1
family 8
looking 8
school 7.9
art 7.9
barbershop 7.8
portrait 7.8
screen 7.7
monitor 7.6
talking 7.6
adults 7.6
relaxation 7.5
meeting 7.5
retro 7.4
daily 7.3
aged 7.2
handsome 7.1
to 7.1
shop 7.1
modern 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 97.6
piano 87.8
person 86.8
people 74.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 38-46
Gender Male, 98.3%
Sad 86%
Calm 11.7%
Happy 1.3%
Confused 0.3%
Angry 0.2%
Surprised 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 23-31
Gender Male, 78.8%
Calm 97.1%
Sad 1.2%
Confused 0.7%
Happy 0.3%
Angry 0.2%
Disgusted 0.2%
Fear 0.2%
Surprised 0.2%

AWS Rekognition

Age 37-45
Gender Female, 65.9%
Calm 93.6%
Sad 4.1%
Angry 0.7%
Confused 0.5%
Happy 0.4%
Disgusted 0.3%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 27-37
Gender Female, 59.3%
Calm 96%
Sad 1.6%
Surprised 1.5%
Confused 0.3%
Disgusted 0.2%
Happy 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 18-26
Gender Male, 86.4%
Calm 75.4%
Sad 19.1%
Happy 3.4%
Confused 0.9%
Angry 0.5%
Fear 0.3%
Disgusted 0.3%
Surprised 0.2%

AWS Rekognition

Age 13-21
Gender Male, 99.6%
Sad 77.2%
Angry 11.5%
Calm 9.9%
Confused 0.4%
Fear 0.4%
Disgusted 0.3%
Happy 0.2%
Surprised 0.2%

AWS Rekognition

Age 50-58
Gender Male, 97.9%
Calm 96.8%
Sad 1%
Happy 0.6%
Surprised 0.4%
Disgusted 0.4%
Angry 0.3%
Confused 0.3%
Fear 0.1%

AWS Rekognition

Age 34-42
Gender Female, 53.4%
Sad 99.7%
Calm 0.2%
Confused 0%
Fear 0%
Angry 0%
Disgusted 0%
Happy 0%
Surprised 0%

AWS Rekognition

Age 18-26
Gender Male, 97.4%
Sad 98.2%
Calm 0.7%
Confused 0.6%
Happy 0.2%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%
Surprised 0%

AWS Rekognition

Age 16-24
Gender Female, 89.6%
Sad 57%
Angry 25.8%
Calm 9.3%
Confused 2.4%
Fear 1.8%
Disgusted 1.5%
Happy 1.5%
Surprised 0.7%

AWS Rekognition

Age 16-24
Gender Female, 80.6%
Calm 96%
Sad 3%
Happy 0.4%
Confused 0.3%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%
Surprised 0.1%

Feature analysis

Amazon

Person 99.2%

Text analysis

Amazon

PLAUSE
7629
MJ17
7629.
MJ17 YYT37A2
YYT37A2

Google

LAUSE 7629 9629.
LAUSE
7629
9629.