Human Generated Data

Title

Untitled (cremation ceremony, Java)

Date

January 26, 1960 – February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4544.1

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 98.8
Male 98.8
Man 98.8
Person 98.8
Adult 98.7
Male 98.7
Man 98.7
Person 98.7
Adult 98.4
Male 98.4
Man 98.4
Person 98.4
Adult 97.6
Male 97.6
Man 97.6
Person 97.6
Person 97.4
Adult 95.1
Male 95.1
Man 95.1
Person 95.1
Person 94.9
Person 91.8
Person 89.3
War 87.7
Face 82
Head 82
Person 79.3
People 78
Person 73.2
Outdoors 65.2
Person 64.4
Mining 57.3
Worker 57.2
Archaeology 57.2
Carpenter 55.8
Back 55.5
Body Part 55.5
Clothing 55.4
Shorts 55.4
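
The label/score pairs above are the raw output of an image-labeling service. As a rough illustration only, a list in this style could be produced with the AWS Rekognition detect_labels API via boto3; the image path, MaxLabels, and MinConfidence values in the sketch below are assumptions, not details of the museum's actual tagging pipeline.

import boto3

def label_image(image_path, min_confidence=55.0):
    """Return (label, confidence) pairs in the same style as the tag list above."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=min_confidence,  # drop labels below the confidence threshold
    )
    return [(label["Name"], round(label["Confidence"], 1))
            for label in response["Labels"]]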

Clarifai
created on 2018-05-10

people 99.9
group 98.3
adult 98
woman 96.6
child 96.2
man 96.1
many 92.9
group together 92.7
wear 91.7
indoors 91.2
war 90.5
room 87.9
several 86.9
sit 83.5
military 82.8
family 82.6
administration 80.4
recreation 77.8
furniture 73.9
outfit 73.9

Imagga
created on 2023-10-06

cadaver 29
tunnel 27.9
old 27.9
architecture 21.1
ancient 19
passageway 18.6
stone 18.4
religion 17.9
statue 17.1
sculpture 16.4
art 16.3
passage 15.8
building 14.3
vintage 14.1
culture 12.8
man 12.1
wall 12
travel 12
grunge 11.9
temple 11.8
history 11.6
newspaper 11.3
antique 11.3
religious 11.2
way 10.8
product 10.4
historical 10.4
dark 10
traditional 10
spiritual 9.6
god 9.6
scene 9.5
light 9.4
monument 9.3
historic 9.2
city 9.1
tourism 9.1
column 8.6
black 8.4
inside 8.3
person 8.1
dirty 8.1
creation 7.9
holiday 7.9
artistic 7.8
people 7.8
texture 7.6
famous 7.4
tourist 7.4
cave 7.4
aged 7.2
mine 7.2

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

old 60.9
group 57.1
clothes 21.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-36
Gender Male, 98.1%
Calm 76.3%
Surprised 7.6%
Disgusted 7.2%
Fear 6.2%
Confused 5.3%
Sad 3.7%
Happy 2.6%
Angry 1.3%

AWS Rekognition

Age 37-45
Gender Male, 95.5%
Happy 71.1%
Calm 11.3%
Surprised 7.3%
Sad 6.9%
Fear 6.2%
Disgusted 3.6%
Angry 1.5%
Confused 1.1%

AWS Rekognition

Age 22-30
Gender Female, 65.8%
Calm 99%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.3%
Disgusted 0.2%
Happy 0.2%
Confused 0.1%
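
Per-face estimates like the three blocks above (age range, gender, and emotion scores) are the kind of output the AWS Rekognition detect_faces call returns when all facial attributes are requested. A minimal sketch, with the image source as an assumption:

import boto3

def analyze_faces(image_path):
    """Print age range, gender, and top emotion for each detected face."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]    # {"Low": ..., "High": ...}
        gender = face["Gender"]   # {"Value": ..., "Confidence": ...}
        emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
        print(f"Age {age['Low']}-{age['High']}, "
              f"Gender {gender['Value']} ({gender['Confidence']:.1f}%), "
              f"top emotion {emotions[0]['Type']} ({emotions[0]['Confidence']:.1f}%)")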

Feature analysis

Amazon

Adult 98.8%
Male 98.8%
Man 98.8%
Person 98.8%

Categories

Text analysis

Amazon

University
College
and
Art
Museums)
(Harvard
Fellows
of
Harvard
President
© President and Fellows of Harvard College (Harvard University Art Museums)
P1970.4544.0001
©

Google

O President and Fellows of Harvard College (Harvard University Art Museums) P1970.4544.0001
O
President
and
Fellows
of
Harvard
College
(Harvard
University
Art
Museums)
P1970.4544.0001