Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26 – February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4530.1

Machine Generated Data

Tags

Amazon
created on 2023-10-07

People 99.7
Person 98.2
Adult 98.2
Bride 98.2
Female 98.2
Wedding 98.2
Woman 98.2
Person 97.3
Adult 97.3
Male 97.3
Man 97.3
Person 96.9
Adult 96.9
Male 96.9
Man 96.9
Person 96.6
Person 94.1
Person 91.1
Person 91.1
Photography 90.8
Clothing 87.3
Shorts 87.3
Person 86.4
Person 85.1
Person 82.7
Person 82
Male 82
Boy 82
Child 82
Face 79.7
Head 79.7
Sword 74.9
Weapon 74.9
Person 63.8
Portrait 63.3
Person 61.3
Outdoors 61
Group Performance 57.8
Dancing 57.3
Leisure Activities 57.3
Person 57.3
Stilts 55.8
Musical Instrument 55.8
Performer 55.6
Walking 55.4
Brass Section 55

Clarifai
created on 2018-05-10

people 100
many 99.5
group 99.2
group together 99
adult 97.5
crowd 96.5
woman 95.2
man 95.2
music 94.4
wear 93.1
administration 90.3
spectator 89.8
dancing 88.5
child 86.1
war 86
recreation 85.1
outfit 84.8
dancer 83.4
military 83.2
audience 82.9

Imagga
created on 2023-10-07

trombone 24.1
brass 22.9
wind instrument 17.9
old 16.7
black 14.5
musical instrument 13.9
grunge 11.1
crutch 10.9
people 10.6
landscape 10.4
snow 10.3
shovel 10.2
dark 10
dirty 9.9
texture 9.7
tool 9.4
wall 9.4
winter 9.4
city 9.1
outdoors 8.9
stone 8.8
man 8.7
water 8.7
art 8.6
staff 8.5
vintage 8.3
industrial 8.2
tree 8
travel 7.7
hand tool 7.6
pattern 7.5
wood 7.5
clothing 7.4
building 7.3
stick 7.3
detail 7.2
history 7.1
surface 7.1
architecture 7
sky 7
textured 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.3
outdoor 95.4
people 83.4
sport 67.6
crowd 54.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 45-51
Gender Female, 95.7%
Calm 42.2%
Happy 33.8%
Surprised 6.9%
Confused 6.7%
Sad 6.7%
Fear 6.4%
Angry 4.1%
Disgusted 2.3%

AWS Rekognition

Age 33-41
Gender Female, 86%
Surprised 91.7%
Sad 14%
Fear 8.7%
Calm 6.3%
Happy 6.1%
Confused 3.4%
Angry 3%
Disgusted 1.6%

AWS Rekognition

Age 22-30
Gender Female, 78.2%
Calm 74%
Confused 8.7%
Angry 8.5%
Surprised 7.9%
Fear 6%
Sad 3.6%
Disgusted 1.1%
Happy 0.5%

AWS Rekognition

Age 23-31
Gender Male, 96.3%
Confused 31.3%
Fear 27.3%
Calm 15.3%
Surprised 12.4%
Sad 6.8%
Angry 6.7%
Disgusted 1.8%
Happy 0.9%

AWS Rekognition

Age 16-22
Gender Male, 59.6%
Sad 82.6%
Calm 54.6%
Surprised 6.8%
Fear 6%
Happy 2.3%
Angry 1.7%
Disgusted 1.4%
Confused 1.1%

AWS Rekognition

Age 22-30
Gender Male, 83.9%
Calm 98.5%
Surprised 6.3%
Fear 5.9%
Sad 2.5%
Disgusted 0.1%
Angry 0.1%
Happy 0.1%
Confused 0%

AWS Rekognition

Age 21-29
Gender Female, 54%
Happy 39.9%
Sad 23.3%
Angry 15.1%
Calm 14%
Fear 9%
Surprised 8%
Disgusted 2.1%
Confused 0.6%

Feature analysis

Amazon

Person 98.2%
Adult 98.2%
Bride 98.2%
Female 98.2%
Woman 98.2%
Male 97.3%
Man 97.3%
Boy 82%
Child 82%

Categories

Imagga

nature landscape 94.6%
paintings art 4.1%

Text analysis

Amazon

College
Art
and
Fellows
(Harvard
Museums)
of
University
Harvard
President
© President and Fellows of Harvard College (Harvard University Art Museums)
P1970.4530.0001
©

Google

O President and Fellows of Harvard College (Harvard University Art Museums) P1970.4530.0001
O
President
and
Fellows
of
Harvard
College
(Harvard
University
Art
Museums)
P1970.4530.0001