Human Generated Data

Title

Untitled (cremation ceremony, Java)

Date

January 26 – February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5040

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Back 100
Body Part 100
Clothing 100
Person 98.9
Child 98.9
Female 98.9
Girl 98.9
Person 98.8
Child 98.8
Boy 98.8
Male 98.8
Person 98.5
Child 98.5
Boy 98.5
Male 98.5
Person 98.5
Male 98.5
Adult 98.5
Man 98.5
Person 98.1
Person 96.6
People 96.4
Person 96.4
Person 95.2
Person 93.9
Person 91.8
Person 90.2
Person 90.1
Person 88.2
Person 87.8
Person 87.4
Person 84.9
Male 84.9
Adult 84.9
Man 84.9
Person 84.4
Person 83.2
Person 80.5
Person 79.5
Person 79.2
Person 78.3
Head 73.8
Outdoors 66.5
Face 64.7
Shorts 64.5
Person 61.1
Person 60.7
Festival 57.1
Architecture 56
Building 56
Countryside 56
Hut 56
Nature 56
Rural 56
Smoke 55.3
Slum 55.2

Clarifai
created on 2018-05-10

people 99.9
child 99.5
group 98.8
monochrome 95.3
group together 95.1
boy 93.7
many 93.7
adult 92.6
man 91.9
son 91.3
family 87.5
wear 85.9
woman 85.4
several 84.5
street 80.8
offspring 80.2
baby 78.2
administration 76.5
war 76.2
facial expression 75.1

Imagga
created on 2023-10-06

statue 31.2
old 29.2
cemetery 27.9
sculpture 25.1
world 24.5
ancient 24.2
religion 22.4
history 22.3
architecture 21.9
stone 19.4
fountain 16.8
monument 15.9
tourism 15.7
gong 15.5
pillory 15.4
culture 15.4
temple 15.3
percussion instrument 14.6
city 14.1
travel 14.1
instrument of punishment 13.7
building 12.9
art 12.5
historic 11.9
vintage 11.6
instrument 11.5
historical 11.3
structure 11.3
religious 11.2
musical instrument 11
landmark 10.8
god 10.5
antique 10.4
water 10
marble 9.7
man 9.4
famous 9.3
traditional 9.1
people 8.9
child 8.9
park 8.7
holiday 8.6
church 8.3
device 8.1
urban 7.9
heritage 7.7
death 7.7
seller 7.7
sky 7.6
house 7.5
tree 7.4

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.1
outdoor 98.9
standing 82
group 73.7
posing 44.9
old 41.3
crowd 1.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 29-39
Gender Female, 73.1%
Calm 59.2%
Fear 15.2%
Happy 10%
Sad 7.2%
Surprised 6.6%
Angry 2.9%
Disgusted 1.6%
Confused 1%

AWS Rekognition

Age 6-16
Gender Male, 83.6%
Calm 98.4%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Angry 0.2%
Happy 0.2%
Confused 0.2%
Disgusted 0.2%

AWS Rekognition

Age 29-39
Gender Male, 61.7%
Calm 42.5%
Happy 26.9%
Sad 24.8%
Fear 7.5%
Surprised 7%
Disgusted 3.1%
Confused 1.6%
Angry 1.6%

AWS Rekognition

Age 7-17
Gender Male, 93.5%
Surprised 21.9%
Fear 18.2%
Happy 17.9%
Sad 16.7%
Calm 9.1%
Confused 9%
Angry 6.8%
Disgusted 6.2%

AWS Rekognition

Age 24-34
Gender Male, 80.8%
Calm 58%
Happy 22.4%
Surprised 7.4%
Sad 6.7%
Fear 6.7%
Angry 4.1%
Confused 2%
Disgusted 1%

AWS Rekognition

Age 25-35
Gender Female, 51.2%
Calm 82.2%
Surprised 8.1%
Happy 6.4%
Fear 6.4%
Disgusted 3%
Sad 2.4%
Confused 1.8%
Angry 1.3%

AWS Rekognition

Age 31-41
Gender Male, 87.9%
Calm 96.8%
Surprised 6.3%
Fear 5.9%
Sad 2.8%
Happy 0.6%
Disgusted 0.2%
Confused 0.2%
Angry 0.1%

Feature analysis

Amazon

Person 98.9%
Child 98.9%
Female 98.9%
Girl 98.9%
Boy 98.8%
Male 98.8%
Adult 98.5%
Man 98.5%
Shorts 64.5%

Categories