Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960 – February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4530.6

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Clothing 99.4
Hat 99.4
Person 99.1
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 97.9
Adult 97.9
Male 97.9
Man 97.9
People 97.2
Person 95.8
Adult 95.8
Male 95.8
Man 95.8
Face 94.3
Head 94.3
Cap 87.3
Person 86.5
Person 86.4
Adult 86.4
Bride 86.4
Female 86.4
Wedding 86.4
Woman 86.4
Architecture 77.7
Building 77.7
Hospital 77.7
Person 73.8
Clinic 73.4
Footwear 63.9
Shoe 63.9
Turban 62.1
Outdoors 60
Bazaar 56.6
Market 56.6
Shop 56.6
Baseball Cap 55.8
Photography 55.3
Factory 55.3
Manufacturing 55.3
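
The name/confidence pairs above match the output format of Amazon Rekognition's DetectLabels operation. As a minimal sketch, and only a sketch, comparable tags could be produced for a digitized print roughly as follows; it assumes boto3 is installed with AWS credentials configured, and the local file name is a hypothetical placeholder, not the museum's actual asset.

# Sketch: generate label tags with AWS Rekognition DetectLabels.
import boto3

rekognition = boto3.client("rekognition")
with open("horse_dance_java.jpg", "rb") as f:  # hypothetical local scan
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # cap the number of labels returned
    MinConfidence=55,    # drop low-confidence labels, as in the list above
)
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')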

Clarifai
created on 2018-05-10

people 99.9
adult 98.7
group 97.8
man 97.2
two 94.9
woman 93.5
vehicle 92.4
monochrome 92.1
war 91.2
group together 90
four 90
wear 89.5
street 88.8
three 88.6
administration 87.7
child 87.7
military 86.6
one 86.6
several 86.5
sit 84.3

Imagga
created on 2023-10-07

seller 67.4
person 18.8
man 17.5
people 17.3
mask 16.9
dress 14.4
uniform 14.4
newspaper 13.6
outdoors 12.7
adult 12.4
product 12.2
traditional 11.6
clothing 11.4
portrait 11
bride 10.5
old 10.4
work 10.3
stone 10.2
bag 9.6
worker 9.5
face 9.2
wedding 9.2
plastic bag 8.9
costume 8.7
love 8.7
creation 8.6
sculpture 8.6
male 8.6
men 8.6
tradition 8.3
happy 8.1
building 7.9
art 7.8
statue 7.7
culture 7.7
house 7.5
human 7.5
equipment 7.3
protection 7.3
religion 7.2
activity 7.2
to 7.1
job 7.1
happiness 7
travel 7
architecture 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.4
outdoor 92.4

Face analysis

Amazon

AWS Rekognition

Age 20-28
Gender Male, 92.7%
Sad 99.4%
Calm 33.6%
Surprised 6.7%
Fear 6%
Confused 1.9%
Angry 0.5%
Disgusted 0.4%
Happy 0.2%
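
The age range, gender, and emotion scores above correspond to the facial attributes returned by Rekognition's DetectFaces operation when all attributes are requested. A minimal sketch under the same assumptions as the label example (boto3 configured, placeholder file name):

# Sketch: face attribute analysis with AWS Rekognition DetectFaces.
import boto3

rekognition = boto3.client("rekognition")
with open("horse_dance_java.jpg", "rb") as f:  # hypothetical local scan
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')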

Feature analysis

Amazon

Person 99.1%
Adult 99.1%
Male 99.1%
Man 99.1%
Bride 86.4%
Female 86.4%
Woman 86.4%
Shoe 63.9%

Categories

Imagga

paintings art 99.4%

Text analysis

Amazon

© President and Fellows of Harvard College (Harvard University Art Museums)
P1970.4530.0006
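
The text detected here is the copyright stamp on the print. Rekognition's DetectText operation returns the recognized text as both line-level and word-level detections; the sketch below keeps only the lines, under the same assumptions as the earlier examples.

# Sketch: read text in the photograph with AWS Rekognition DetectText.
import boto3

rekognition = boto3.client("rekognition")
with open("horse_dance_java.jpg", "rb") as f:  # hypothetical local scan
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the per-word duplicates
        print(detection["DetectedText"])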

Google

O President and Fellows of Harvard College (Harvard University Art Museums) P1970.4530.0006