Human Generated Data

Title

Untitled (cremation ceremony, Java)

Date

January 26, 1960 – February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5033


Machine Generated Data

Tags

Amazon
created on 2023-10-06

War 99.4
Adult 98.5
Male 98.5
Man 98.5
Person 98.5
Person 97.9
Adult 97.3
Male 97.3
Man 97.3
Person 97.3
Adult 97.3
Male 97.3
Man 97.3
Person 97.3
Adult 97.1
Male 97.1
Man 97.1
Person 97.1
Adult 97
Male 97
Man 97
Person 97
Male 96.9
Person 96.9
Boy 96.9
Child 96.9
Person 95.9
Child 95.9
Female 95.9
Girl 95.9
People 94.2
Person 92.9
Person 89.9
Person 77
Person 75.3
Person 71.8
Person 71.5
Head 69
Person 65.6
Person 65
Person 64
Person 63.1
Face 62.2
Person 60.7
Slum 57.2
Back 57
Body Part 57

Clarifai
created on 2018-05-10

people 99.9
group 99.4
many 98.7
group together 98.5
crowd 97
adult 96.8
man 96.4
war 95.8
administration 94
soldier 89.1
military 88.8
woman 88.5
child 86.2
skirmish 81
boy 78.5
leader 74.1
queue 73.8
street 73.5
vehicle 70.3
wear 68.1

Imagga
created on 2023-10-06

farmer 39.5
field 25.1
farm 25
landscape 23.8
person 22
outdoor 21.4
sky 20.4
rural 20.3
grass 18.2
plow 16.8
engineer 16.5
travel 16.2
summer 16.1
agriculture 15.8
mountain 15.1
man 14.8
military uniform 13.3
farming 13.3
uniform 13.1
outside 12.8
people 12.8
tourism 12.4
country 12.3
outdoors 11.9
tool 11.4
cattle 11.1
male 10.6
cow 10.6
group 10.5
natural 10
vacation 9.8
plant 9.5
men 9.4
animal 9.4
ranch 9.4
tree 9.4
active 9.1
old 9.1
park 9.1
scenery 9
meadow 9
scenic 8.8
forest 8.7
military 8.7
clouds 8.4
hill 8.4
environment 8.2
countryside 8.2
brown 8.1
herd 7.8
boy 7.8
army 7.8
war 7.8
scene 7.8
farmland 7.7
horse 7.6
walking 7.6
animals 7.4
clothing 7.4
building 7.3
hay 7.3
sun 7.2
cowboy 7.2
sea 7
architecture 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 96.1
group 86.9
standing 80.9
people 60.2
old 59.4
posing 41.4
clothes 17.8
crowd 0.6

Face analysis

Amazon

AWS Rekognition

Age 25-35
Gender Female, 60.3%
Calm 63.6%
Sad 48%
Surprised 7.3%
Fear 6.3%
Disgusted 2.3%
Angry 2.2%
Happy 2.1%
Confused 0.6%

AWS Rekognition

Age 12-20
Gender Male, 88.3%
Sad 99.1%
Happy 25.7%
Surprised 8.4%
Fear 6.2%
Calm 5.7%
Angry 2.6%
Confused 1.4%
Disgusted 0.8%

AWS Rekognition

Age 23-31
Gender Male, 85.1%
Calm 33.6%
Surprised 18.2%
Fear 17%
Happy 7.4%
Angry 7%
Disgusted 6.6%
Sad 6.3%
Confused 5.9%

AWS Rekognition

Age 23-33
Gender Female, 90.2%
Calm 45.8%
Sad 19.6%
Fear 17.8%
Happy 11.2%
Surprised 6.9%
Angry 3.8%
Disgusted 1.7%
Confused 1%

AWS Rekognition

Age 40-48
Gender Male, 99.6%
Calm 45.7%
Disgusted 28.1%
Sad 12%
Confused 7.1%
Surprised 7%
Fear 6.2%
Angry 2.6%
Happy 1.4%

AWS Rekognition

Age 23-33
Gender Female, 51.4%
Fear 55.7%
Happy 12.9%
Confused 11.7%
Sad 9.7%
Calm 8.6%
Surprised 7.5%
Angry 5.9%
Disgusted 2.9%

Feature analysis

Amazon

Adult 98.5%
Male 98.5%
Man 98.5%
Person 98.5%
Boy 96.9%
Child 96.9%
Female 95.9%
Girl 95.9%

Categories

Imagga

paintings art 98.1%