Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960 - February 2, 1960

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4532.4

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Person 98.3
Person 98.3
Adult 98.3
Female 98.3
Woman 98.3
Person 98.3
Person 98.2
Person 97.8
Adult 97.8
Male 97.8
Man 97.8
Clothing 94
Shorts 94
Person 91.3
Sword 86.6
Weapon 86.6
Person 86.1
Footwear 85.3
Shoe 85.3
Person 85.2
Shoe 83.3
People 76
Face 73
Head 73
Shoe 65.7
Helmet 63.9
Back 58
Body Part 58
Hat 56.5
Armor 55.9
Art 55.1
Drawing 55.1
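
The label/confidence pairs above are the kind of output produced by Amazon's image-labeling service; the record names only "Amazon", so Rekognition's label detection and the local file name below are assumptions for illustration. A minimal Python/boto3 sketch, not the pipeline actually used to build this record:

import boto3

# Assumption: the photograph is available locally; the file name is hypothetical.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("P1970_4532_4.jpg", "rb") as f:
    image_bytes = f.read()

# Request object/scene labels above a confidence floor, roughly matching the
# 55-98 range of the scores listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55.0,
)

for label in response["Labels"]:
    # Each label carries a name and a confidence score on a 0-100 scale.
    print(f"{label['Name']} {label['Confidence']:.1f}")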

Clarifai
created on 2018-05-10

people 99.8
group 98.8
group together 98.2
many 96.9
adult 96.2
wear 95.9
street 95.7
man 95.3
dancer 95.2
dancing 94.9
costume 93.8
woman 93
monochrome 92.9
child 91
crowd 90.6
music 90.1
several 86.6
veil 81.5
culture 81.3
art 81.3

Imagga
created on 2023-10-06

people 17.8
man 17.5
dress 17.2
clothing 16.7
person 15.3
costume 15.2
pedestrian 14.8
celebration 13.5
umbrella 12.8
art 12.6
sport 12.5
traditional 12.5
festival 12.4
portrait 12.3
adult 12.3
male 12
face 11.4
happy 11.3
fun 11.2
covering 11
seller 11
colorful 10.7
grunge 10.2
mask 10.1
lifestyle 10.1
outdoor 9.9
fashion 9.8
carnival 9.7
men 9.4
tradition 9.2
makeup 9.1
black 9
style 8.9
color 8.9
crutch 8.9
look 8.8
dance 8.7
happiness 8.6
holiday 8.6
culture 8.5
expression 8.5
entertainment 8.3
human 8.2
park 8.2
brassiere 8.2
active 8.1
romantic 8
garment 7.9
model 7.8
mysterious 7.8
horse 7.7
party 7.7
hat 7.6
parasol 7.5
dark 7.5
city 7.5
lady 7.3
paint 7.2
canopy 7.2
love 7.1
travel 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 87.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 28-38
Gender Female, 89.7%
Sad 61.1%
Calm 31.4%
Happy 18.3%
Fear 11.4%
Surprised 7.2%
Confused 3.5%
Angry 3.1%
Disgusted 1.3%

AWS Rekognition

Age 23-31
Gender Male, 96.3%
Calm 46.8%
Disgusted 35.9%
Sad 8%
Fear 7.5%
Surprised 6.8%
Confused 1.1%
Angry 1.1%
Happy 0.6%
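
The age ranges, gender estimates, and emotion scores above are attributed to AWS Rekognition. A minimal boto3 sketch of how such per-face attributes can be requested (the file name is hypothetical, and this is an illustration rather than the exact process behind this record):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("P1970_4532_4.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

# Attributes=['ALL'] returns age range, gender, and emotion estimates per detected face.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotion confidences are reported independently and need not sum to 100,
    # which is why the percentages listed above add up to more than 100.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")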

Feature analysis

Amazon

Person 98.3%
Adult 98.3%
Female 98.3%
Woman 98.3%
Male 97.8%
Man 97.8%
Shoe 85.3%
Helmet 63.9%

Categories

Imagga

paintings art 99.6%

Text analysis

Amazon

of
College
Art
and
(Harvard
Fellows
Museums)
Harvard
University
© President and Fellows of Harvard College (Harvard University Art Museums)
President
P1970.4532.0004
©
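
The Amazon text detections above mix individual words with the full copyright line, which matches how Rekognition's text detection reports both WORD and LINE results. A minimal boto3 sketch, assuming the same hypothetical local file as above:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("P1970_4532_4.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is tagged as a LINE or a WORD, so the full notice and its
# component words both appear in the output.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])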

Google

O President and Fellows of Harvard College (Harvard University Art Museums) P1970.4532.0004
O
President
and
Fellows
of
Harvard
College
(Harvard
University
Art
Museums)
P1970.4532.0004