Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960-February 2, 1960

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4529.4

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Fashion 99.9
Adult 98.9
Female 98.9
Person 98.9
Woman 98.9
Adult 98.8
Person 98.8
Male 98.8
Man 98.8
Clothing 98.8
Dress 98.8
Person 98.6
Formal Wear 98
Gown 98
Person 97.1
People 97
Hat 95
Person 91.3
Person 86
Face 79.9
Head 79.9
Person 77.3
Person 75.8
Bag 74.4
Adult 70.6
Female 70.6
Person 70.6
Woman 70.6
Bride 70.6
Wedding 70.6
Person 69.3
Person 69.1
Person 65.2
Coat 57.9
Stilts 56.8
Back 56.7
Body Part 56.7
Walking 56.3
Dancing 56.1
Leisure Activities 56.1
Outdoors 55.8
Costume 55.6
Plant 55.5
Tree 55.5
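
The Amazon labels above, each paired with a confidence score (in percent), are the kind of output returned by the AWS Rekognition DetectLabels API. Below is a minimal sketch of how such tags could be produced with boto3; the image file name and the 55% minimum-confidence cutoff are illustrative assumptions, not details documented in this record.

    import boto3

    # Hypothetical sketch: label detection with AWS Rekognition.
    # The file name and MinConfidence threshold are illustrative assumptions.
    rekognition = boto3.client("rekognition")

    with open("horse_dance_java.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,
        )

    # Print each label and its confidence, mirroring the list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')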

Clarifai
created on 2018-05-10

people 99.9
group 98.4
adult 97.9
group together 97.2
many 95.8
man 95.1
wear 94.1
monochrome 93.5
child 93.5
woman 92.2
war 83
several 82.1
street 80.9
dancing 80.7
veil 80.6
military 79.9
crowd 79.6
outfit 78.1
music 76.4
dancer 72.2

Imagga
created on 2023-10-06

crutch 52.4
staff 41.6
stick 31.6
man 24.2
fountain 22
people 20.1
person 18.3
black 14.4
male 14.2
structure 14
world 13.8
dirty 13.6
adult 13
sport 12.7
silhouette 12.4
danger 11.8
dark 11.7
city 11.6
protection 10.9
power 10.9
sunset 10.8
outdoors 10.7
outdoor 10.7
water 10.7
statue 10.6
travel 10.6
standing 10.4
building 10.3
weapon 9.8
human 9.7
military 9.7
couple 9.6
industrial 9.1
old 9.1
clothing 8.9
toxic 8.8
urban 8.7
women 8.7
architecture 8.6
men 8.6
portrait 8.4
sky 8.3
light 8
radioactive 7.9
radiation 7.8
destruction 7.8
accident 7.8
mask 7.8
protective 7.8
nuclear 7.8
chemical 7.7
summer 7.7
gas 7.7
tree 7.7
stone 7.7
two 7.6
beach 7.6
fashion 7.5
style 7.4
park 7.4
vacation 7.4
street 7.4
religion 7.2
activity 7.2
wet 7.2
history 7.2
sunlight 7.1
posing 7.1
love 7.1
businessman 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 90.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 48-54
Gender Male, 76.7%
Calm 98%
Surprised 6.4%
Fear 6.2%
Sad 2.2%
Confused 0.3%
Disgusted 0.2%
Angry 0.2%
Happy 0.1%

AWS Rekognition

Age 23-31
Gender Male, 95.2%
Surprised 57.6%
Calm 25.9%
Fear 19.7%
Angry 7.9%
Sad 3.4%
Disgusted 3.3%
Happy 2.1%
Confused 2.1%

AWS Rekognition

Age 23-33
Gender Male, 98.7%
Surprised 67%
Happy 40.8%
Calm 10.1%
Fear 6.1%
Angry 4.9%
Sad 2.5%
Confused 1.8%
Disgusted 1.2%
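
The age range, gender, and emotion estimates in the blocks above correspond to the FaceDetails structure returned by the Rekognition DetectFaces API when all attributes are requested. A minimal, self-contained sketch, again assuming a hypothetical local image file:

    import boto3

    # Hypothetical sketch: face analysis with AWS Rekognition DetectFaces.
    # Attributes=["ALL"] requests AgeRange, Gender, Emotions, and more.
    rekognition = boto3.client("rekognition")

    with open("horse_dance_java.jpg", "rb") as f:
        faces = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )

    for face in faces["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')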

Feature analysis

Amazon

Adult 98.9%
Female 98.9%
Person 98.9%
Woman 98.9%
Male 98.8%
Man 98.8%
Bride 70.6%

Text analysis

Amazon

College
and
Art
(Harvard
Fellows
Museums)
of
Harvard
University
President
© President and Fellows of Harvard College (Harvard University Art Museums)
P1970.4529.0004
©

Google

© President and Fellows of Harvard College (Harvard University Art Museums) P1970.4529.0004
©
President
and
Fellows
of
Harvard
College
(
University
Art
Museums
)
P1970.4529.0004
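
The Amazon text results above, which mix whole recognized lines (the copyright notice and accession number) with individual words, match the LINE and WORD detections returned by the Rekognition DetectText API; the Google results would come from a separate OCR service. A minimal sketch under the same illustrative file-name assumption:

    import boto3

    # Hypothetical sketch: text detection (OCR) with AWS Rekognition.
    # DetectText returns both LINE and WORD entries, which is why the
    # results above contain full lines as well as single words.
    rekognition = boto3.client("rekognition")

    with open("horse_dance_java.jpg", "rb") as f:
        text = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in text["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])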