Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960 - February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4528.4

Machine Generated Data

Tags

Amazon
created on 2023-10-05

People 98.3
Person 98
Adult 98
Male 98
Man 98
Person 97.4
Adult 97.4
Male 97.4
Man 97.4
Person 97
Adult 97
Bride 97
Female 97
Wedding 97
Woman 97
Person 96.6
Person 95.6
Person 94.4
Person 91.4
Person 90.4
Adult 90.4
Male 90.4
Man 90.4
Person 88.9
Person 84.3
Person 80.8
Face 80.6
Head 80.6
Person 76.3
Gun 70.4
Weapon 70.4
Swordfight 58.1
Sword 57.7
Samurai 55.4

Clarifai
created on 2018-05-10

people 99.9
adult 98.7
group 98.4
many 96.6
man 96.2
monochrome 95.6
group together 93.7
woman 93.4
child 92.9
war 92.5
military 91.2
wear 88.8
weapon 88.7
skirmish 87.5
illustration 87.5
print 83.9
music 83.9
several 83.2
crowd 82.6
recreation 82

Imagga
created on 2023-10-05

crutch 51
staff 40.7
stick 33
people 25.6
person 21.3
man 18.1
adult 16.2
fashion 15.1
winter 14.5
old 13.9
portrait 13.6
model 13.2
cold 12.9
looking 12.8
male 12.8
snow 12.4
ride 12.1
clothing 12
youth 11.9
parasol 11.7
human 11.2
mechanical device 11.2
attractive 10.5
hair 10.3
outside 10.3
carousel 10
outdoor 9.9
lady 9.7
one 9.7
outdoors 9.7
happy 9.4
standing 8.7
swing 8.7
holiday 8.6
sitting 8.6
wall 8.5
face 8.5
bowed stringed instrument 8.5
leisure 8.3
holding 8.2
sport 8.2
style 8.2
religion 8.1
cool 8
smiling 8
lifestyle 7.9
life 7.9
mechanism 7.9
couple 7.8
smile 7.8
culture 7.7
musical instrument 7.6
traditional 7.5
teenager 7.3
stringed instrument 7.2

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

text 89.9
person 89.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 37-45
Gender Female, 75.9%
Sad 64.9%
Calm 28.9%
Confused 25.7%
Surprised 7.3%
Fear 6.4%
Happy 4%
Angry 3.6%
Disgusted 3.5%

AWS Rekognition

Age 20-28
Gender Female, 83.3%
Calm 75.7%
Sad 17.8%
Surprised 6.6%
Fear 6.5%
Disgusted 1.7%
Happy 1.6%
Angry 1.4%
Confused 1.2%

Feature analysis

Amazon

Person 98%
Adult 98%
Male 98%
Man 98%
Bride 97%
Female 97%
Woman 97%
Gun 70.4%

Categories

Imagga

paintings art 89.2%
pets animals 10.1%

Text analysis

Amazon

© President and Fellows of Harvard College (Harvard University Art Museums)
P1970.4528.0004

Google

© President and Fellows of Harvard College (Harvard University Art Museums) P1970.4528.0004