Human Generated Data

Title

Untitled (cremation ceremony, Java)

Date

January 26, 1960-February 2, 1960

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5034
Machine Generated Data

Tags

Amazon
created on 2023-10-06

Back 99.7
Body Part 99.7
Person 98.2
Adult 98.2
Male 98.2
Man 98.2
Person 97.9
Adult 97.9
Male 97.9
Man 97.9
Person 97.8
Adult 97.8
Male 97.8
Man 97.8
Person 97
Person 96.8
Adult 96.8
Male 96.8
Man 96.8
Person 96.5
Person 96.5
Adult 96.5
Male 96.5
Man 96.5
Person 96.3
Clothing 95.2
Person 95
Adult 95
Bride 95
Female 95
Wedding 95
Woman 95
Person 95
Person 94.5
Person 94.4
Adult 94.4
Male 94.4
Man 94.4
Animal 93
Bull 93
Mammal 93
Person 91.9
Person 91.5
Adult 91.5
Male 91.5
Man 91.5
Person 91.5
Adult 91.5
Male 91.5
Man 91.5
Person 91
Person 89.6
Face 89.3
Head 89.3
Person 84.9
Person 83.1
Sun Hat 82.9
Person 81.2
Outdoors 79.1
Architecture 77.1
Building 77.1
Shelter 77.1
Person 75.4
Hat 72.1
Person 69
Person 66.8
Person 58
Crowd 57
Cattle 55.9
Livestock 55.9
Ox 55.9
Nature 55.2

Clarifai
created on 2018-05-10

people 100
group 99
child 97.6
group together 96.8
many 96.6
adult 96
man 95.9
woman 95.1
crowd 91.6
wear 85.7
furniture 85.3
sit 84.3
monochrome 83.9
several 83.4
street 82.5
boy 81.6
war 79.7
recreation 78.9
military 78.7
vehicle 77.8

Imagga
created on 2023-10-06

percussion instrument 26.1
steel drum 20.4
musical instrument 20.1
seller 19.6
religion 17.9
stall 16.8
old 16
winter 13.6
snow 13.4
architecture 13.3
city 12.5
person 12.4
people 12.3
history 11.6
building 11.3
travel 11.3
cold 11.2
vintage 10.7
outdoor 10.7
statue 10.6
religious 10.3
sky 10.2
man 10.1
water 10
world 9.9
temple 9.7
black 9.6
old fashioned 9.5
culture 9.4
church 9.2
tree 9.2
traditional 9.1
adult 9.1
landscape 8.9
couple 8.7
art 8.5
holiday 7.9
color 7.8
park 7.8
portrait 7.8
grunge 7.7
sculpture 7.6
vacation 7.4
dress 7.2
river 7.1
trees 7.1
day 7.1
scenic 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 93.7
people 72.8
old 62.7
crowd 29.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 30-40
Gender Female, 95.7%
Happy 98.9%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Calm 0.4%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%

AWS Rekognition

Age 23-33
Gender Male, 99.8%
Calm 94.1%
Surprised 6.6%
Fear 5.9%
Confused 3.3%
Sad 2.3%
Angry 1%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 19-27
Gender Female, 67.3%
Fear 95.5%
Surprised 6.6%
Sad 5.3%
Calm 4.1%
Disgusted 0.6%
Angry 0.4%
Happy 0.4%
Confused 0.3%

AWS Rekognition

Age 19-27
Gender Male, 66.7%
Calm 74.7%
Surprised 8.3%
Happy 7.4%
Sad 6.4%
Fear 6.2%
Confused 2.4%
Angry 1.7%
Disgusted 1.4%

AWS Rekognition

Age 21-29
Gender Male, 100%
Calm 99%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.6%
Happy 0.1%
Angry 0%
Disgusted 0%

AWS Rekognition

Age 6-12
Gender Male, 50.2%
Happy 50.4%
Surprised 19.6%
Calm 10.1%
Sad 9.8%
Fear 8.4%
Disgusted 2.8%
Confused 2.8%
Angry 1.8%

Feature analysis

Amazon

Person 98.2%
Adult 98.2%
Male 98.2%
Man 98.2%
Bride 95%
Female 95%
Woman 95%
Hat 72.1%