Human Generated Data

Title

Untitled (woman playing piano in crowd)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4448

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (model confidence scores, 0-100)

Amazon
created on 2022-01-23

Person 99.4
Human 99.4
Person 99.2
Clothing 97.6
Apparel 97.6
Person 94
Crowd 91.9
Audience 91.2
Nature 87.9
Person 83.8
People 79.4
Outdoors 79.4
Face 73.7
Person 72.7
Cap 71.7
Person 65.9
Person 64.5
Baseball Cap 60.6
Photography 56.7
Photo 56.7
Hat 51
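
The Amazon tags above have the shape of Amazon Rekognition DetectLabels output. A minimal sketch of how such labels might be generated, assuming boto3 credentials are configured; the file name and thresholds are placeholders, not values from this record:

```python
import boto3

# Sketch: send the photograph to Amazon Rekognition and print label names
# with their confidence scores (0-100), mirroring the tag list above.
rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.4448.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=50,
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```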

Clarifai
created on 2023-10-27

people 99.6
many 99.2
crowd 98.6
man 96.8
group 96.7
group together 95.5
adult 94.6
audience 94.3
chair 89.4
leader 88.6
woman 86.7
administration 84.2
war 83.5
meeting 81.6
league 80.5
military 75.1
desktop 69.4
spectator 68.5
sitting 68.5
ceremony 67.5
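
Clarifai concepts like these typically come from its general image-recognition model. A rough sketch against the Clarifai v2 REST API; the model ID, access token, image URL, and exact model path are assumptions that depend on the account setup:

```python
import requests

# Sketch of a Clarifai v2 prediction; token, model ID, and image URL are placeholders.
PAT = "YOUR_CLARIFAI_PAT"
MODEL_ID = "general-image-recognition"  # assumed public general model
url = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]}
resp = requests.post(url, json=payload, headers={"Authorization": f"Key {PAT}"})
resp.raise_for_status()

# Clarifai scores concepts from 0 to 1; scale to 0-100 to match the listing above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```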

Imagga
created on 2022-01-23

city 47.4
architecture 37.5
travel 32.4
cityscape 29.3
sky 28.7
town 27.8
stage 27.3
urban 27.1
buildings 26.5
sea 23.6
skyline 22.8
tourism 22.3
platform 22.1
panorama 21.9
structure 21.3
building 21.2
tourist 17.5
tower 17.2
shoreline 17.1
landscape 17.1
downtown 16.3
panoramic 16.3
landmark 16.3
sunset 16.2
water 16
river 16
web site 15.9
aerial 15.5
coast 15.3
famous 14.9
church 14.8
night 14.2
ocean 13.9
scene 13.9
skyscraper 13.6
bay 13.4
old 13.2
spectator 13
sun 12.9
billboard 11.8
houses 11.6
port 11.6
vacation 11.5
boat 11.2
bridge 10.4
signboard 10.2
shore 10.2
exterior 10.1
street 10.1
light 10
scenery 9.9
religion 9.9
history 9.8
hall 9.7
district 9.7
center 9.6
stone 9.5
culture 9.4
horizon 9
scenic 8.8
dusk 8.6
construction 8.6
roof 8.5
clouds 8.5
summer 8.4
silhouette 8.3
seaside 7.9
roofs 7.9
skyscrapers 7.8
modern 7.7
cathedral 7.7
capital 7.6
beach 7.6
destination 7.5
lights 7.4
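
Imagga tags of this kind are usually produced by its /v2/tags endpoint, which scores each tag from 0 to 100. A sketch using HTTP basic auth; the key, secret, and image URL are placeholders:

```python
import requests

# Sketch of an Imagga auto-tagging call with placeholder credentials.
API_KEY, API_SECRET = "YOUR_KEY", "YOUR_SECRET"
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```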

Google
created on 2022-01-23

Photograph 94.3
White 92.2
Hat 90.6
Black 89.8
Black-and-white 83.1
Line 82.1
Font 81.7
Adaptation 79.2
Crowd 77.6
Monochrome 75.8
Monochrome photography 75.5
Sun hat 74.5
Snapshot 74.3
Art 73.6
Event 71.5
Illustration 69.6
Room 67.4
Stock photography 66.8
History 63.7
Cap 63.4
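
The Google tags correspond to Cloud Vision label detection. A minimal sketch, assuming application-default credentials and a placeholder image URI:

```python
from google.cloud import vision

# Sketch of Google Cloud Vision label detection.
client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/photo.jpg"  # placeholder URI

response = client.label_detection(image=image)
for label in response.label_annotations:
    # Vision scores are 0-1; scaled to 0-100 to match the listing above.
    print(f"{label.description} {label.score * 100:.1f}")
```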

Microsoft
created on 2022-01-23

text 99.3
person 96.2
black and white 80.4
people 66.9
man 65.4
white 63
skyscraper 61.8
old 50.8
posing 35.6
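
The Microsoft tags match the output of the Azure Computer Vision image-analysis service. A sketch against its REST "analyze" endpoint; the endpoint, key, API version, and image URL are placeholders:

```python
import requests

# Sketch of an Azure Computer Vision tag request with placeholder resource values.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_SUBSCRIPTION_KEY"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/photo.jpg"},
)
resp.raise_for_status()

# Azure confidences are 0-1; scaled to 0-100 to match the listing above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```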

Face analysis

Amazon

AWS Rekognition

Age 23-33
Gender Male, 98.8%
Calm 97.3%
Angry 0.8%
Sad 0.7%
Surprised 0.5%
Happy 0.3%
Fear 0.2%
Disgusted 0.2%
Confused 0.1%

AWS Rekognition

Age 22-30
Gender Male, 94.3%
Sad 36.7%
Disgusted 27.6%
Happy 17.3%
Calm 6.5%
Fear 5.3%
Confused 3.5%
Angry 1.6%
Surprised 1.5%

AWS Rekognition

Age 30-40
Gender Male, 99.1%
Calm 97%
Happy 1.3%
Disgusted 0.5%
Sad 0.4%
Angry 0.3%
Surprised 0.2%
Confused 0.2%
Fear 0.1%

AWS Rekognition

Age 27-37
Gender Female, 77.8%
Calm 93.6%
Sad 2.9%
Angry 2%
Happy 0.6%
Surprised 0.3%
Fear 0.2%
Disgusted 0.2%
Confused 0.2%

AWS Rekognition

Age 27-37
Gender Male, 66.8%
Happy 53.2%
Calm 15.2%
Disgusted 9.3%
Sad 9.3%
Confused 6%
Angry 3.7%
Surprised 1.6%
Fear 1.6%

AWS Rekognition

Age 23-33
Gender Male, 77.3%
Sad 93.1%
Fear 3%
Calm 2.7%
Confused 0.4%
Angry 0.3%
Happy 0.2%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 27-37
Gender Male, 90.3%
Calm 46.8%
Disgusted 21.8%
Sad 15.6%
Confused 5.9%
Fear 4.7%
Happy 2.2%
Angry 2.1%
Surprised 0.9%

AWS Rekognition

Age 20-28
Gender Male, 99.8%
Calm 83.7%
Happy 3.3%
Confused 3%
Sad 2.5%
Fear 2.2%
Disgusted 2.1%
Surprised 1.9%
Angry 1.4%
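
The per-face age ranges, gender estimates, and emotion percentages above follow the shape of Rekognition DetectFaces output with all attributes requested. A minimal sketch, with a hypothetical file name:

```python
import boto3

# Sketch of AWS Rekognition face analysis; Attributes=["ALL"] returns age range,
# gender, and emotion confidences like those listed above.
rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.4448.jpg", "rb") as f:  # hypothetical local copy
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```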

Feature analysis

Amazon

Person 99.4%
Hat 51%

Text analysis

Amazon

19240.
38
192-40.
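
The Amazon text results are consistent with Rekognition DetectText output (the strings appear to be edge markings on the negative). A short sketch, again with a hypothetical file name:

```python
import boto3

# Sketch of AWS Rekognition text detection.
rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.4448.jpg", "rb") as f:  # hypothetical local copy
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip per-word duplicates
        print(detection["DetectedText"])
```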

Google

19240 172
19240
172
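
The Google strings could similarly be reproduced with Cloud Vision text detection. A sketch with a placeholder image URI:

```python
from google.cloud import vision

# Sketch of Google Cloud Vision text (OCR) detection.
client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/photo.jpg"  # placeholder URI

response = client.text_detection(image=image)
if response.text_annotations:
    # The first annotation is the full detected text; the rest are individual tokens.
    print(response.text_annotations[0].description)
```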