Human Generated Data

Title

Untitled (cremation ceremony, Java)

Date

January 26, 1960-February 2, 1960

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5191

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Person 97.9
Person 97.8
Adult 97.8
Female 97.8
Woman 97.8
Person 97.5
Adult 97.5
Male 97.5
Man 97.5
Person 97.5
Adult 97.5
Male 97.5
Man 97.5
Architecture 97.1
Building 97.1
Outdoors 97.1
Shelter 97.1
Person 97
Adult 97
Female 97
Woman 97
Bride 97
Wedding 97
Person 96.7
Person 96
Adult 96
Male 96
Man 96
Person 95.9
Person 95.8
Person 95
Face 93.6
Head 93.6
Person 90.2
Person 90.2
Person 89.6
Person 89.4
Person 79.9
Adult 79.9
Female 79.9
Woman 79.9
Bride 79.9
Person 79.4
Person 73.8
Person 73
Person 71
Art 68.3
Painting 68.3
Person 64.8
Animal 57.5
Mammal 57.5
Countryside 56.9
Hut 56.9
Nature 56.9
Rural 56.9
Amusement Park 56.5
Carousel 56.5
Play 56.5
Butcher Shop 55.8
Shop 55.8
Person 55.8
Altar 55.1
Church 55.1
Prayer 55.1
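
The Amazon tags above pair each detected label with a confidence score, given in percent. As a point of reference, a minimal sketch of how comparable labels can be produced with AWS Rekognition's DetectLabels API via boto3 is shown below; the filename, region, and thresholds are illustrative assumptions, not values taken from this record.

    import boto3

    # Hypothetical client and image; the region and filename are placeholders.
    rekognition = boto3.client("rekognition", region_name="us-east-1")
    with open("shahn_untitled_java.jpg", "rb") as f:
        image_bytes = f.read()

    # Request labels above a confidence floor, mirroring the 55+ scores listed above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=55.0,
    )

    # Print "Name Confidence" pairs in the same shape as the tag list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")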

Clarifai
created on 2018-05-10

people 100
group 99.7
many 99.4
group together 98.8
adult 98.5
man 97.7
child 96.2
crowd 91.4
woman 90.9
furniture 90.1
administration 89.9
military 89.4
several 88.9
recreation 88.4
music 88.2
wear 85
print 83.9
leader 83
home 82.4
sit 82

Imagga
created on 2023-10-06

fountain 100
structure 88.5
statue 49
sculpture 44.2
architecture 39.1
history 30.4
ancient 27.7
religion 26.9
travel 26.8
tourism 26.4
monument 26.2
old 25.8
culture 23.1
temple 22.5
building 22.4
stone 21.4
landmark 18.1
famous 17.7
art 17.6
marble 17.4
city 16.6
palace 14.5
god 14.4
historical 14.1
religious 14.1
water 14
cemetery 13.8
park 13.6
worship 13.5
traditional 13.3
tourist 12.1
historic 11.9
antique 11.3
sky 10.8
heritage 10.6
column 10.1
figure 9.1
vintage 9.1
bronze 8.8
symbol 8.8
spiritual 8.6
tree 8.5
house 8.4
church 8.3
decoration 8
mythology 7.9
facade 7.7
spirituality 7.7
snow 7.7
vacation 7.4
memorial 7.3
color 7.2
holiday 7.2
seller 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 93.9
old 88.4
people 87.2
group 71.2
black 68.9
crowd 0.7

Color Analysis

Face analysis

AWS Rekognition

Age 16-24
Gender Male, 100%
Angry 68.9%
Confused 12.6%
Sad 7.6%
Calm 6.9%
Surprised 6.7%
Fear 6.2%
Disgusted 0.6%
Happy 0.2%

AWS Rekognition

Age 25-35
Gender Male, 88.5%
Sad 99%
Calm 31.5%
Surprised 7%
Fear 6.1%
Confused 2.5%
Happy 2.1%
Disgusted 1.7%
Angry 1.5%

AWS Rekognition

Age 23-33
Gender Male, 90.6%
Surprised 71.2%
Calm 18.3%
Fear 18.1%
Sad 13.5%
Disgusted 3.2%
Angry 1.7%
Happy 1.4%
Confused 0.9%

AWS Rekognition

Age 16-24
Gender Female, 58.8%
Happy 96.5%
Surprised 6.4%
Fear 5.9%
Sad 2.4%
Calm 1.5%
Confused 0.4%
Angry 0.3%
Disgusted 0.1%

AWS Rekognition

Age 9-17
Gender Female, 93.8%
Angry 77.2%
Happy 9.1%
Surprised 7.6%
Fear 6.5%
Sad 3.5%
Disgusted 2.8%
Calm 2.5%
Confused 1%

AWS Rekognition

Age 16-24
Gender Male, 59.1%
Calm 97.4%
Surprised 6.5%
Fear 6%
Sad 2.2%
Confused 1.3%
Disgusted 0.1%
Angry 0.1%
Happy 0.1%

AWS Rekognition

Age 21-29
Gender Male, 62.2%
Sad 99.2%
Calm 17.2%
Happy 17%
Surprised 6.9%
Fear 6%
Confused 1.4%
Disgusted 1%
Angry 0.9%

AWS Rekognition

Age 26-36
Gender Male, 95.5%
Calm 88.8%
Surprised 7.6%
Fear 6%
Sad 3.3%
Happy 2.3%
Angry 2.1%
Disgusted 0.5%
Confused 0.4%

AWS Rekognition

Age 24-34
Gender Male, 98.9%
Calm 89.2%
Happy 6.6%
Surprised 6.5%
Fear 6.3%
Sad 2.5%
Angry 0.7%
Disgusted 0.5%
Confused 0.3%

AWS Rekognition

Age 20-28
Gender Male, 98.8%
Calm 47%
Sad 19.2%
Happy 13%
Confused 12.7%
Surprised 7.6%
Fear 6.9%
Angry 3.1%
Disgusted 2.7%

Microsoft Cognitive Services

Age 19
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
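
The age, gender, and emotion estimates in the AWS Rekognition blocks above correspond to face-detection output. A minimal sketch of retrieving such attributes with boto3's detect_faces call follows; the filename and region are placeholders, as in the label sketch above.

    import boto3

    # Hypothetical client and image; not part of the museum record.
    rekognition = boto3.client("rekognition", region_name="us-east-1")
    with open("shahn_untitled_java.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] is required to get AgeRange, Gender, and Emotions.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    # Emit one block per face: age range, gender with confidence, then emotions
    # sorted by confidence, matching the layout of the entries above.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
        print()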

Feature analysis

Amazon

Person 97.9%
Adult 97.8%
Female 97.8%
Woman 97.8%
Male 97.5%
Man 97.5%
Bride 97%