Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960 – February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4528.1

Machine Generated Data

Tags

Amazon
created on 2023-10-06

People 98.4
Person 98.3
Male 95.8
Man 95.8
Person 95.8
Adult 95.8
Person 95
Person 93.6
Male 93.3
Man 93.3
Person 93.3
Adult 93.3
Person 91.5
Person 89.4
Person 88.5
Adult 88.3
Bride 88.3
Female 88.3
Wedding 88.3
Woman 88.3
Person 88.3
Person 83.7
Person 82.7
Face 75.9
Head 75.9
Adult 74
Bride 74
Female 74
Woman 74
Person 74
Dancing 70
Leisure Activities 70
Accessories 69.5
Bag 69.5
Handbag 69.5
Person 59.6
Clothing 59.3
Footwear 59.3
Shoe 59.3
Stilts 56.3
Festival 55.6
Parade 55

Clarifai
created on 2018-05-10

people 100
adult 97.9
man 97.3
group 96.7
many 96.5
group together 96.3
monochrome 94.9
wear 93.7
military 92.6
print 90.3
weapon 90.3
war 89
skirmish 87.6
illustration 87
dancing 86.3
several 86.3
music 85.1
woman 84.9
engraving 84.5
art 84.3

Imagga
created on 2023-10-06

sketch 83.6
drawing 73.1
representation 49.4
grunge 21.3
umbrella 18.3
man 16.1
snow 15.8
art 15.6
black 15
dirty 13.5
old 13.2
pattern 12.3
decoration 12
canopy 11.7
dark 11.7
design 11.2
winter 11.1
person 10.7
people 10
silhouette 9.9
vintage 9.9
texture 9.7
body 9.6
tree 9.3
weather 9.1
painting 9
landscape 8.9
shelter 8.7
antique 8.6
men 8.6
old fashioned 8.6
frame 8.3
human 8.2
outdoors 8.2
style 8.2
history 8
hair 7.9
male 7.8
ancient 7.8
wall 7.7
city 7.5
retro 7.4
paint 7.2
paper 7.1
architecture 7
textured 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

text 97.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 28-38
Gender Male, 96.6%
Happy 82.3%
Surprised 11.6%
Fear 6.1%
Calm 3.7%
Angry 2.9%
Sad 2.3%
Disgusted 1.5%
Confused 0.6%

AWS Rekognition

Age 29-39
Gender Male, 95.3%
Calm 38.1%
Happy 31.6%
Sad 29.9%
Surprised 6.8%
Fear 6.5%
Confused 4.7%
Disgusted 1.5%
Angry 0.8%

AWS Rekognition

Age 28-38
Gender Female, 83.1%
Calm 98.7%
Surprised 6.5%
Fear 5.9%
Sad 2.2%
Disgusted 0.4%
Happy 0.1%
Angry 0.1%
Confused 0%

Feature analysis

Amazon

Person 98.3%
Male 95.8%
Man 95.8%
Adult 95.8%
Bride 88.3%
Female 88.3%
Woman 88.3%
Handbag 69.5%
Shoe 59.3%

Categories

Imagga

paintings art 96.8%
nature landscape 2.5%

Text analysis

Amazon

College
Art
and
Fellows
(Harvard
of
Museums)
Harvard
University
© President and Fellows of Harvard College (Harvard University Art Museums)
President
P1970.4528.0001
©

Google

O President and Fellows of Harvard College (Harvard University Art Museums) P1970.4528.0001
O
President
and
Fellows
of
Harvard
College
(Harvard
University
Art
Museums)
P1970.4528.0001