Human Generated Data

Title

Untitled (woman and child seated at piano)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4430

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 98.6
Human 98.6
Person 91
Person 89.8
Person 88.8
Person 88.5
Classroom 88
School 88
Room 88
Indoors 88
Person 84.6
Crowd 81.3
Person 80.3
Person 78.6
Interior Design 78.3
Face 76.9
People 72.3
Clothing 68
Apparel 68
Audience 66.7
Person 66.2
Cafeteria 63.1
Restaurant 63.1
Person 59
Person 58.3
Female 56.2
Person 53.4
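
Tag lines like the Amazon list above correspond to the Labels array of an Amazon Rekognition detect_labels response, flattened to label name plus confidence. A minimal sketch of that flattening — the sample response below is illustrative only, not the actual API output for this photograph:

```python
# Flatten a Rekognition detect_labels-style response into "Name Confidence" lines.
# The sample response is illustrative; a real call would look like
# boto3.client("rekognition").detect_labels(Image={"Bytes": image_bytes}).

sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 98.6},
        {"Name": "Classroom", "Confidence": 88.0},
        {"Name": "Crowd", "Confidence": 81.3},
    ]
}

def flatten_labels(response):
    """Return tag lines sorted by descending confidence, as in the listing above."""
    labels = sorted(response["Labels"], key=lambda l: -l["Confidence"])
    # :g drops a trailing .0, matching entries such as "Classroom 88"
    return [f"{l['Name']} {l['Confidence']:g}" for l in labels]

for line in flatten_labels(sample_response):
    print(line)
```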

Clarifai
created on 2023-10-27

people 99.8
many 99.3
group 97.2
adult 95.7
man 95.5
group together 94.8
crowd 94.2
audience 91.5
woman 91.5
music 91.3
administration 90.2
musician 88.7
wear 88
leader 82.6
party 82.4
education 81
monochrome 79.9
ceremony 79.8
celebration 78.2
desktop 78.1

Imagga
created on 2022-01-23

daily 44.9
newspaper 43.4
product 34
creation 25.1
business 21.2
people 20.1
businessman 18.5
work 18.3
man 18.1
team 15.2
technology 14.1
person 13.7
male 13.5
wagon 13.4
room 13.2
group 12.9
professional 12.2
teamwork 12
adult 11.8
shop 11.8
table 11.4
design 11.2
office 11.2
computer 11.2
modern 11.2
working 10.6
wheeled vehicle 10.1
businesswoman 10
worker 9.9
job 9.7
building 9.6
men 9.4
doctor 9.4
glass 9.3
finance 9.3
medical 8.8
medicine 8.8
setting 8.7
ancient 8.6
luxury 8.6
dinner 8.4
manager 8.4
event 8.3
city 8.3
wedding 8.3
decoration 8
indoors 7.9
mercantile establishment 7.9
collage 7.7
health 7.6
meeting 7.5
company 7.4
style 7.4
occupation 7.3
successful 7.3
laptop 7.3
looking 7.2
stall 7.1
copy 7.1
architecture 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 98.6
text 98.5
clothing 72.9
flower 71.5
player 68.9
table 66.1
woman 65.3
group 58.3
posing 50.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 39-47
Gender Female, 52.9%
Calm 51.1%
Fear 45.2%
Surprised 1.6%
Happy 0.9%
Disgusted 0.4%
Angry 0.3%
Sad 0.3%
Confused 0.2%

AWS Rekognition

Age 26-36
Gender Female, 95.8%
Sad 57.8%
Angry 20.4%
Surprised 8.2%
Confused 5.2%
Happy 3.8%
Disgusted 1.9%
Fear 1.5%
Calm 1.2%

AWS Rekognition

Age 34-42
Gender Female, 69.3%
Calm 54.6%
Surprised 39.7%
Sad 1.6%
Happy 1.1%
Angry 0.9%
Confused 0.8%
Fear 0.7%
Disgusted 0.6%

AWS Rekognition

Age 42-50
Gender Male, 97.5%
Happy 53.4%
Sad 18.9%
Surprised 17.9%
Angry 3.3%
Confused 2.4%
Fear 1.8%
Disgusted 1.4%
Calm 1%

AWS Rekognition

Age 37-45
Gender Female, 98.9%
Happy 79.1%
Calm 9.7%
Fear 3.7%
Sad 2.9%
Surprised 1.5%
Angry 1.2%
Disgusted 1%
Confused 0.9%

AWS Rekognition

Age 28-38
Gender Female, 99.7%
Sad 68.4%
Calm 9.9%
Angry 8.9%
Surprised 6.2%
Confused 2.6%
Fear 2.6%
Happy 0.9%
Disgusted 0.6%

AWS Rekognition

Age 34-42
Gender Female, 96.1%
Happy 70.3%
Calm 10.9%
Sad 9.3%
Disgusted 3%
Confused 2.5%
Surprised 1.8%
Angry 1.3%
Fear 0.8%

AWS Rekognition

Age 30-40
Gender Female, 80.5%
Sad 26.4%
Calm 22.7%
Happy 17.1%
Confused 11.4%
Fear 9.3%
Surprised 5.4%
Disgusted 3.9%
Angry 3.8%

Feature analysis

Amazon

Person 98.6%

Categories

Imagga

paintings art 96.4%
beaches seaside 2.1%

Text analysis

Amazon

17247.
19247.
ATOM

Google

17247.
19247
17247. 田 田 19247