Human Generated Data

Title

Untitled (woman and young boy seated at piano)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4453

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 93.7
Human 93.7
Restaurant 93.5
Person 92.1
Person 88.2
Crowd 86.3
Cafeteria 85.2
Person 84.2
Person 83.1
Audience 81.3
Meal 78.5
Food 78.5
Person 77.8
Furniture 74.6
Person 71.3
People 71
Chair 69.3
Indoors 68.5
Room 68.1
Table 65.5
Person 65.4
Person 65.2
Clothing 64
Apparel 64
Person 63.3
Female 62.8
Face 60.1
Girl 59.8
Cafe 56.9
Classroom 56.2
School 56.2
Person 51.8
Person 42.2

Clarifai
created on 2023-10-26

people 99.7
many 98.9
man 96.9
adult 96
group 95.2
group together 92
monochrome 91.9
administration 90.8
wear 85.3
war 85
woman 84.6
crowd 82
leader 80.7
music 80.5
interaction 78.1
musician 77.3
desktop 76
illustration 75.6
audience 74.3
military 73.7

Imagga
created on 2022-01-23

newspaper 64
daily 55.8
product 52.8
creation 40.7
business 15.8
people 14.5
old 13.9
person 12.5
man 11.4
male 11.3
money 11
cash 11
wealth 10.8
history 10.7
work 10.6
art 10.3
vintage 9.9
adult 9.8
ancient 9.5
rich 9.3
dollar 9.3
banking 9.2
antique 9.1
bank 9
luxury 8.6
grunge 8.5
finance 8.4
design 8.4
web site 8.3
city 8.3
drawing 8.2
currency 8.1
financial 8
building 7.9
philately 7.9
postmark 7.9
paper 7.8
portrait 7.8
stamp 7.7
mail 7.7
card 7.6
exchange 7.6
technology 7.4
retro 7.4
letter 7.3
decoration 7.3
aged 7.2
celebration 7.2
worker 7.1
medical 7.1
travel 7
professional 7

Google
created on 2022-01-23

Black 89.6
Black-and-white 85.3
Style 83.8
Font 78.8
Monochrome photography 76.9
Monochrome 75.9
Motor vehicle 74.7
Chair 71.8
Art 69
Hat 66.6
Event 66.2
Room 65.3
Stock photography 64.3
Illustration 57.1
Photographic paper 55.7
History 55.6
Machine 55.2
Pattern 53.5
Desk 50.3

Microsoft
created on 2022-01-23

text 99.4
person 95.9
player 74.5
old 62.6
posing 57.5
white goods 50.4
female 26.4

Face analysis

Amazon

AWS Rekognition

Age 20-28
Gender Male, 77.1%
Calm 61.2%
Confused 20.9%
Surprised 5.5%
Angry 3.9%
Happy 3.8%
Sad 2.2%
Disgusted 1.8%
Fear 0.7%

AWS Rekognition

Age 48-56
Gender Female, 95.5%
Happy 48.5%
Fear 31.5%
Surprised 6.8%
Confused 5.3%
Sad 3.3%
Angry 2.1%
Disgusted 1.8%
Calm 0.8%

AWS Rekognition

Age 25-35
Gender Female, 95.5%
Calm 92.6%
Happy 5.6%
Surprised 0.8%
Disgusted 0.4%
Angry 0.2%
Confused 0.2%
Sad 0.2%
Fear 0.1%

AWS Rekognition

Age 28-38
Gender Male, 80.3%
Calm 78.1%
Happy 11.8%
Surprised 3.8%
Disgusted 2.2%
Angry 1.4%
Sad 1.1%
Fear 1%
Confused 0.6%

AWS Rekognition

Age 23-31
Gender Male, 68.1%
Calm 44.3%
Surprised 21.1%
Sad 13.6%
Fear 8.6%
Confused 8.5%
Disgusted 2.2%
Happy 1.1%
Angry 0.6%

AWS Rekognition

Age 19-27
Gender Female, 99.8%
Calm 95.1%
Happy 1.4%
Sad 1.1%
Surprised 0.7%
Confused 0.6%
Disgusted 0.4%
Fear 0.4%
Angry 0.3%

AWS Rekognition

Age 51-59
Gender Male, 98.3%
Happy 95.5%
Surprised 2%
Angry 1%
Confused 0.4%
Sad 0.3%
Calm 0.3%
Fear 0.3%
Disgusted 0.1%

AWS Rekognition

Age 26-36
Gender Male, 60.4%
Fear 94.4%
Sad 3.4%
Calm 0.6%
Surprised 0.5%
Confused 0.3%
Happy 0.3%
Angry 0.2%
Disgusted 0.2%

AWS Rekognition

Age 21-29
Gender Female, 91.7%
Calm 32.8%
Happy 31.6%
Sad 15.2%
Fear 11.2%
Disgusted 3.8%
Surprised 2.4%
Confused 1.6%
Angry 1.4%

Feature analysis

Amazon

Person 93.7%

Text analysis

Amazon

17246.
19246.

Google

9246. 17246.
9246.
17246.