Human Generated Data

Title

Untitled (woman playing piano in crowd)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4426

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Each tag below is followed by the service's confidence score (0-100).

Amazon
created on 2022-01-23

Audience 99.8
Crowd 99.8
Human 99.8
Person 90.5
Person 88.4
Person 87.5
Person 84.6
Person 83.1
Person 81.6
Nature 79.6
People 78
Person 76.6
Person 73.7
Person 71.8
Person 67.9
Person 63.3
Smoke 60.6
Person 55.6
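
The labels above, with percent confidences, are the kind of output AWS Rekognition's DetectLabels operation returns. As a minimal sketch (not the museum's actual pipeline), tags like these could be fetched with boto3; the bucket, key, and thresholds below are hypothetical placeholders:

# Hedged sketch: image labeling with AWS Rekognition via boto3.
# "example-bucket"/"example.jpg" and the thresholds are placeholders.
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}},
    MaxLabels=20,
    MinConfidence=50.0,
)

# Print "Label confidence" pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')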

Clarifai
created on 2023-10-26

people 99.8
many 99.6
group 98.3
crowd 98.1
adult 97.2
man 96.3
administration 89.1
leader 88.1
war 87.7
audience 87.5
group together 87.2
woman 85.8
desktop 85.8
chair 82.4
monochrome 81.9
education 74.3
music 73.1
wear 72.7
military 71.5
sitting 69.6
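
Clarifai concepts like these come from its v2 predict endpoint. A hedged sketch using the public REST API; the model ID, personal access token, and image URL are assumptions, not values from this record:

# Hedged sketch: Clarifai v2 predict over REST. MODEL_ID, the PAT, and
# the image URL are hypothetical placeholders. Payload shape follows
# Clarifai's v2 REST docs; some deployments also require a user_app_id field.
import requests

MODEL_ID = "general-image-recognition"  # assumed general-concepts model
resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": "Key YOUR_PAT"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
).json()

# Concept values are 0-1; scale to match the 0-100 scores listed above.
for concept in resp["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')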

Imagga
created on 2022-01-23

cemetery 59.7
city 47.4
architecture 40.6
cityscape 32.2
urban 28.8
sky 27.4
building 27.4
travel 26.8
town 26
structure 22.1
buildings 20.8
billboard 20.6
skyline 20
tourism 19.8
landscape 19.3
panorama 18.1
landmark 18.1
aerial 17.5
sea 17.3
signboard 16.7
church 16.7
tower 16.1
old 16
river 16
scene 15.6
daily 15.5
negative 14.7
stone 14.4
panoramic 14.4
skyscraper 13.7
coast 13.5
history 13.4
famous 13
film 12.7
houses 12.6
tourist 12.6
water 12
summer 11.6
port 11.6
downtown 11.5
cathedral 11.5
bridge 11.5
scenic 11.4
sun 11.3
scenery 10.8
boat 10.2
sunset 9.9
religion 9.9
vacation 9.8
mountain 9.8
roof 9.7
newspaper 9.5
cloud 9.5
culture 9.4
monument 9.3
clouds 9.3
ocean 9.1
light 8.7
ancient 8.7
day 8.6
modern 8.4
photographic paper 8.3
exterior 8.3
street 8.3
historic 8.3
park 8.2
outdoors 8.2
horizon 8.1
tree 7.7
village 7.7
capital 7.6
bay 7.6
destination 7.5
shore 7.4
snow 7.4
night 7.1
country 7
center 7
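
Imagga's scores come from its v2 tagging endpoint, which uses HTTP basic auth. A hedged sketch with requests; the key, secret, and image URL are placeholders:

# Hedged sketch: Imagga v2 tagging endpoint. API_KEY/API_SECRET and the
# image URL are hypothetical placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("API_KEY", "API_SECRET"),
).json()

# Each entry pairs an English tag with a 0-100 confidence, as listed above.
for entry in resp["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')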

Google
created on 2022-01-23 (no tags recorded)

Microsoft
created on 2022-01-23

text 99.3
black and white 89.6
person 86
black 84.2
skyscraper 80.2
sky 76.1
people 69.6
old 65.7
white 63.1
crowd 25.2
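
Microsoft's tags are the kind produced by the Azure Computer Vision image-tagging API. A hedged sketch with the Python SDK (azure-cognitiveservices-vision-computervision); the endpoint, key, and image URL are placeholders:

# Hedged sketch: Azure Computer Vision tagging. The endpoint URL, key,
# and image URL are hypothetical placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

# tag_image returns tags with 0-1 confidences; scale to match the list above.
result = client.tag_image("https://example.com/photo.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")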

Face analysis

AWS Rekognition returned one block per detected face: an estimated age range, a gender estimate with its confidence, and confidence scores (%) for eight emotions.

Amazon

AWS Rekognition

Age 21-29
Gender Male, 99.8%
Calm 99.9%
Disgusted 0%
Happy 0%
Sad 0%
Surprised 0%
Confused 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 20-28
Gender Female, 89.6%
Calm 49.3%
Sad 35%
Confused 6.8%
Fear 2.5%
Angry 2.2%
Disgusted 1.8%
Surprised 1.4%
Happy 1%

AWS Rekognition

Age 16-22
Gender Male, 83.9%
Calm 98.3%
Disgusted 0.6%
Sad 0.4%
Happy 0.3%
Angry 0.3%
Fear 0.1%
Surprised 0%
Confused 0%

AWS Rekognition

Age 23-31
Gender Female, 98.2%
Calm 58.5%
Sad 39.4%
Confused 0.6%
Angry 0.5%
Disgusted 0.4%
Fear 0.2%
Surprised 0.2%
Happy 0.2%

AWS Rekognition

Age 18-24
Gender Female, 56.8%
Calm 67.2%
Happy 9.1%
Confused 8.6%
Sad 6%
Fear 3.1%
Surprised 2.4%
Angry 1.9%
Disgusted 1.7%

AWS Rekognition

Age 7-17
Gender Female, 73.3%
Sad 46.8%
Calm 26.7%
Fear 8.4%
Happy 8.1%
Surprised 4.7%
Angry 2.3%
Confused 1.7%
Disgusted 1.4%

AWS Rekognition

Age 26-36
Gender Male, 95.5%
Calm 91.2%
Sad 3.8%
Fear 1.8%
Surprised 1%
Disgusted 0.7%
Confused 0.7%
Angry 0.5%
Happy 0.3%

AWS Rekognition

Age 28-38
Gender Male, 99.7%
Calm 92.7%
Surprised 4.7%
Sad 1.1%
Confused 0.5%
Disgusted 0.4%
Angry 0.3%
Happy 0.3%
Fear 0.1%

AWS Rekognition

Age 23-31
Gender Female, 64.2%
Calm 67.7%
Fear 28.2%
Sad 2%
Happy 0.6%
Disgusted 0.4%
Surprised 0.4%
Confused 0.4%
Angry 0.3%
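
A hedged sketch of how per-face blocks like those above are obtained from AWS Rekognition's DetectFaces with boto3; requesting Attributes=["ALL"] yields the age range, gender, and emotion confidences shown. The bucket and key are hypothetical placeholders:

# Hedged sketch: AWS Rekognition face analysis via boto3.
# "example-bucket"/"example.jpg" are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}},
    Attributes=["ALL"],  # include age, gender, and emotion estimates
)

# One FaceDetails entry per detected face, matching the blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')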

Feature analysis

Amazon

Person 90.5%

Text analysis

Amazon

17239.

Google

17239.
17239.
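
Both services extracted the same string; Google Cloud Vision's text detection typically returns a full-text annotation plus per-segment annotations, which is likely why "17239." is listed twice. A hedged sketch of the Amazon side with Rekognition's DetectText; the bucket and key are hypothetical placeholders:

# Hedged sketch: AWS Rekognition text detection via boto3.
# "example-bucket"/"example.jpg" are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}}
)

# LINE-level detections correspond to strings such as "17239." above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])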