Human Generated Data

Title

Untitled (cremation ceremony, Java)

Date

January 26, 1960-February 2, 1960

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4543.2

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2023-10-05

Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Person 97.4
Person 97.2
Adult 96.5
Male 96.5
Man 96.5
Person 96.5
Person 94.9
Person 94.8
Person 94.5
Person 94.3
Person 94.3
Person 93.2
Art 92.8
Painting 92.8
Person 91.6
Person 91.1
Animal 91
Cattle 91
Livestock 91
Mammal 91
Person 89.5
Outdoors 88.8
Person 88.5
Head 85
Person 83.6
Person 82.1
Person 80.2
Face 78.7
Person 76.7
Person 75.7
Nature 69.6
Person 66.3
Person 63.3
Bull 56.3
Cow 56.2
Dairy Cow 56.2
Countryside 55.9
Rural 55.2
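
The Amazon tags above are label/confidence pairs of the kind returned by the Rekognition DetectLabels API. A minimal sketch of how similar tags could be generated, assuming a local copy of the photograph (the file name and the 55% threshold are illustrative, not part of this record):

# Sketch: label/confidence tags for a local image via Amazon Rekognition
# DetectLabels. File name and MinConfidence are illustrative assumptions.
import boto3
rekognition = boto3.client("rekognition")
with open("P1970_4543_2.jpg", "rb") as f:  # hypothetical local copy of the print
    image_bytes = f.read()
response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")

Repeated labels such as Person at several confidences correspond to individual detected instances, which Rekognition reports under each label's Instances list.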

Clarifai
created on 2018-05-10

people 100
group 99.5
many 97
group together 96.6
adult 96.4
man 95.4
child 93.3
war 92.4
art 89.3
administration 88.9
several 88.3
wear 88.1
military 87.7
woman 85.1
engraving 84.9
soldier 83
illustration 81.9
home 79.2
print 78.4
monochrome 78

Imagga
created on 2023-10-05

old 23
prison 17.4
architecture 17.2
product 16
religion 14.3
building 14.1
city 14.1
vintage 14.1
correctional institution 14
newspaper 14
man 13.4
light 13.4
ancient 13
cell 12.8
grunge 11.9
creation 11.9
people 11.7
art 11.7
history 11.6
religious 11.2
black 10.8
work 10.6
travel 10.6
penal institution 10.5
wicker 10.4
antique 10.4
tourism 9.9
stone 9.7
dark 9.2
outdoors 9
person 8.6
seller 8.6
sky 8.3
street 8.3
historic 8.2
landmark 8.1
night 8
stall 7.8
historical 7.5
window 7.5
monument 7.5
structure 7.4
exterior 7.4
inside 7.4
water 7.3
metal 7.2
dirty 7.2
male 7.1
institution 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

old 52.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Female, 52.2%
Fear 98%
Surprised 6.3%
Sad 2.2%
Calm 0.2%
Happy 0.2%
Angry 0.1%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 18-26
Gender Male, 73.8%
Calm 81.1%
Fear 10.4%
Surprised 6.4%
Angry 3.8%
Sad 3.6%
Happy 0.8%
Disgusted 0.5%
Confused 0.4%

AWS Rekognition

Age 18-26
Gender Female, 73.8%
Sad 100%
Fear 6.6%
Surprised 6.5%
Angry 3.9%
Calm 2.1%
Disgusted 1.1%
Happy 1%
Confused 0.4%
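
The three face readings above (age range, gender, and emotion scores) follow the shape of Rekognition's DetectFaces output. A minimal sketch, assuming the same hypothetical local image file; Attributes=["ALL"] is needed to return age, gender, and emotions:

# Sketch: per-face age range, gender, and emotion scores via Amazon
# Rekognition DetectFaces. File name is an illustrative assumption.
import boto3
rekognition = boto3.client("rekognition")
with open("P1970_4543_2.jpg", "rb") as f:
    image_bytes = f.read()
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")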

Feature analysis

Amazon

Adult 99.1%
Male 99.1%
Man 99.1%
Person 99.1%

Text analysis

Amazon

© President and Fellows of Harvard College (Harvard University Art Museums)
P1970.4543.0002

Google

© President and Fellows of Harvard College (Harvard University Art Museums) P1970.4543.0002
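
Both text readings recover the Harvard copyright line and the accession number visible in the image. OCR services of this kind typically return one full-text result plus word-level detections; a minimal sketch with the Google Cloud Vision API, assuming the same hypothetical image file and configured credentials:

# Sketch: OCR via the Google Cloud Vision API. The first text_annotations
# entry is the full detected string; the rest are individual words.
# File name and credentials are illustrative assumptions.
from google.cloud import vision
client = vision.ImageAnnotatorClient()
with open("P1970_4543_2.jpg", "rb") as f:
    image = vision.Image(content=f.read())
response = client.text_detection(image=image)
annotations = response.text_annotations
if annotations:
    print(annotations[0].description)  # full text, e.g. the copyright line
    for word in annotations[1:]:
        print(word.description)        # word-level tokens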