Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960 - February 2, 1960

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4529.2

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Child 98.6
Female 98.6
Girl 98.6
Person 98.6
Person 98.6
Person 98.4
Adult 98.4
Male 98.4
Man 98.4
Person 90.6
Bicycle 90.4
Transportation 90.4
Vehicle 90.4
Person 85.8
Person 75.7
Face 72.9
Head 72.9
Cycling 69.9
Sport 69.9
Clothing 66.9
Shorts 66.9
Plant 56.3
Vegetation 56.3
Back 56
Body Part 56
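
A minimal sketch of how label/confidence pairs like the list above can be retrieved with the AWS Rekognition DetectLabels API; the local file name and the thresholds are illustrative assumptions, not part of this record.

# Sketch: image labels via AWS Rekognition DetectLabels (boto3).
import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph; any image bytes work here.
with open("P1970_4529_2.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=30,        # cap on returned labels (assumed value)
    MinConfidence=50.0,  # drop low-confidence tags (assumed value)
)

# Print "Label confidence" pairs in the same form as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")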

Clarifai
created on 2018-05-10

people 99.7
adult 98
man 97.6
war 92.8
group together 92.3
woman 91.6
two 90.6
one 88.6
monochrome 88.4
child 88.1
group 87.2
military 87
weapon 86.8
wear 85.7
gun 85.2
street 84.6
transportation system 84
three 81.2
skirmish 81.1
retro 79.5

Imagga
created on 2023-10-06

art 20.9
old 20.9
grunge 20.4
religion 18.8
sketch 18.2
drawing 18
architecture 17.3
sculpture 16
statue 15.4
building 14.6
ancient 13.8
stone 13.7
umbrella 13.5
history 13.4
vintage 13.2
dirty 12.6
shelter 12.3
religious 12.2
decoration 12
church 12
travel 12
design 11.9
tourism 11.5
antique 11.4
historical 11.3
canopy 11
black 10.9
representation 10.8
texture 10.4
style 10.4
culture 10.3
cemetery 10.1
structure 10
city 10
spirituality 9.6
historic 9.2
landmark 9
protective covering 9
detail 8.8
memorial 8.5
carving 8.4
monument 8.4
pattern 8.2
aged 8.1
wall 7.8
golden 7.7
fence 7.7
faith 7.6
old fashioned 7.6
decorative 7.5
tree 7.4
facade 7.4
ornate 7.3
holiday 7.2
window 7.1
face 7.1
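
The Imagga tags above can be reproduced through Imagga's REST tagging endpoint; a minimal sketch follows, with the credentials and image URL as placeholder assumptions.

# Sketch: image tags from the Imagga /v2/tags endpoint (requests).
import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/P1970_4529_2.jpg"  # hypothetical image location

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Print "tag confidence" pairs in the same form as the list above.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")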

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 98.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 18-24
Gender Male, 99.7%
Happy 64%
Calm 30.8%
Fear 6.4%
Surprised 6.4%
Sad 2.6%
Confused 0.9%
Angry 0.8%
Disgusted 0.5%
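
The age range, gender, and emotion scores above match the shape of AWS Rekognition DetectFaces output; a minimal sketch of such a call follows, with the image source as an illustrative assumption.

# Sketch: face attributes via AWS Rekognition DetectFaces (boto3).
import boto3

client = boto3.client("rekognition")

with open("P1970_4529_2.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back with uppercase type names and confidence percentages.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")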

Feature analysis

Amazon

Child 98.6%
Female 98.6%
Girl 98.6%
Person 98.6%
Adult 98.4%
Male 98.4%
Man 98.4%

Categories

Text analysis

Amazon

© President and Fellows of Harvard College (Harvard University Art Museums)
P1970.4529.0002
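
Assuming the Amazon text result came from the Rekognition DetectText API, a minimal sketch follows; the image source is an illustrative assumption.

# Sketch: detected text via AWS Rekognition DetectText (boto3).
import boto3

client = boto3.client("rekognition")

with open("P1970_4529_2.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# LINE detections give whole phrases; WORD detections give individual fragments.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])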

Google

C President and Fellows of Harvard College (Harvard University Art Museums) P1970.4529.0002
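
A comparable sketch for the Google result, assuming it came from the Cloud Vision API's text detection feature; the image source is again an assumption.

# Sketch: text detection via the Google Cloud Vision API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("P1970_4529_2.jpg", "rb") as f:  # hypothetical local copy
    content = f.read()

response = client.text_detection(image=vision.Image(content=content))

# The first annotation is the full detected block; the rest are single words.
if response.text_annotations:
    print(response.text_annotations[0].description)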