Human Generated Data

Title

Untitled (cremation ceremony, Java)

Date

January 26, 1960-February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2396

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Adult 98.7
Male 98.7
Man 98.7
Person 98.7
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Person 98.3
Adult 97.6
Male 97.6
Man 97.6
Person 97.6
Adult 97.2
Male 97.2
Man 97.2
Person 97.2
Adult 96.2
Male 96.2
Man 96.2
Person 96.2
Person 96
Person 94.9
War 93.5
Adult 92.6
Male 92.6
Man 92.6
Person 92.6
Person 90.8
Person 89.7
Person 89.3
People 88.2
Face 80.6
Head 80.6
Person 71.9
Outdoors 62.7
Smoke 56.6
Fire 55.1
Slum 55.1

Clarifai
created on 2018-05-10

people 99.9
group 99.1
group together 98.3
military 98
many 97.9
adult 97.7
war 97.7
administration 96.2
soldier 95.4
man 95.1
uniform 92
skirmish 91.8
police 89.8
woman 88.7
wear 86.7
vehicle 86.1
child 86
crowd 85.3
weapon 82.4
gun 81.9

Imagga
created on 2023-10-06

man 24.2
kin 17.4
passenger 17.1
person 16.4
people 16.2
male 15.1
black 15
adult 13
silhouette 12.4
world 12.1
men 12
old 11.1
human 10.5
light 10.1
danger 10
city 10
spectator 9.9
child 9.8
grunge 9.4
building 9.2
portrait 9.1
love 8.7
house 8.4
dark 8.3
street 8.3
dirty 8.1
life 8
family 8
architecture 7.8
art 7.8
outdoor 7.6
sport 7.6
statue 7.6
leisure 7.5
smoke 7.4
sexy 7.2
looking 7.2
sunset 7.2
mask 7.2
groom 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 95.3
people 86.8
outdoor 86.2
group 83.6
standing 81.9
crowd 2.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 27-37
Gender Female, 50.4%
Calm 56.9%
Angry 29.3%
Surprised 10.9%
Fear 6.2%
Sad 2.7%
Confused 2.5%
Disgusted 1%
Happy 0.9%

AWS Rekognition

Age 16-24
Gender Male, 76.2%
Surprised 83.1%
Happy 40.7%
Fear 6%
Calm 4.9%
Sad 2.4%
Angry 1.7%
Disgusted 0.9%
Confused 0.7%

AWS Rekognition

Age 25-35
Gender Male, 99.4%
Sad 100%
Surprised 6.5%
Fear 5.9%
Confused 3.2%
Calm 2.3%
Angry 0.7%
Disgusted 0.4%
Happy 0.2%

AWS Rekognition

Age 39-47
Gender Male, 98.7%
Calm 88%
Surprised 6.7%
Fear 5.9%
Confused 4%
Disgusted 3.8%
Sad 2.8%
Happy 0.7%
Angry 0.6%

AWS Rekognition

Age 20-28
Gender Male, 99.1%
Calm 52.9%
Sad 36.3%
Confused 15.8%
Surprised 8.9%
Fear 6.8%
Angry 0.8%
Disgusted 0.6%
Happy 0.3%

AWS Rekognition

Age 23-33
Gender Male, 62.9%
Calm 97.4%
Surprised 6.3%
Fear 5.9%
Sad 2.9%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%
Happy 0%

AWS Rekognition

Age 23-31
Gender Female, 66.3%
Sad 70.7%
Calm 39.5%
Surprised 12.4%
Disgusted 6.4%
Fear 6.1%
Angry 5.1%
Confused 5%
Happy 1.6%

AWS Rekognition

Age 23-33
Gender Male, 99.5%
Sad 85.2%
Calm 50.8%
Fear 6.8%
Surprised 6.7%
Confused 4.2%
Angry 1.1%
Disgusted 1%
Happy 0.6%

AWS Rekognition

Age 16-24
Gender Female, 78.4%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 1.5%
Happy 0.1%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%

Feature analysis

Amazon

Adult 98.9%
Male 98.9%
Man 98.9%
Person 98.9%