Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960-February 2, 1960

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4544.6

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Person 97.3
Person 97.3
Person 96.8
Adult 96.8
Female 96.8
Woman 96.8
Person 96.1
Person 94.6
Person 92.2
Person 91.7
Person 89
Person 85.8
People 85.6
Person 83.7
Person 83.3
Person 82.6
Outdoors 81.6
Person 79.9
Person 76.9
Person 76.9
Person 76.5
Person 76.3
Person 75.1
Tartan 70.1
Person 66.1
Nature 65.3
Head 61.6
Clothing 56.7
Skirt 56.7
Fashion 56.1
Night 55.9
Shorts 55.1

Clarifai
created on 2018-05-10

people 99.6
adult 96.6
war 96.4
wear 95.5
man 94.8
group 94.4
military 94.3
group together 93.4
woman 89.5
soldier 88
monochrome 87.4
vintage 85.3
weapon 83.4
retro 82.6
many 82.5
mammal 82.2
uniform 82.1
skirmish 80.8
old 80.1
transportation system 79.1

Imagga
created on 2023-10-05

tunnel 24.4
dark 23.4
old 19.5
light 19.4
cell 17.8
man 16.1
passageway 15.1
industrial 14.5
dirty 14.5
wall 14
stretcher 13.4
night 13.3
canvas tent 13.2
stone 12.9
person 11.8
passage 11.5
construction 11.1
architecture 10.9
litter 10.9
mine 10.7
adult 10.3
people 10
danger 10
cadaver 9.9
building 9.7
cave 9.4
grunge 9.4
water 9.3
silhouette 9.1
conveyance 9
mask 8.9
autumn 8.8
horror 8.7
way 8.7
industry 8.5
tree 8.5
smoke 8.4
city 8.3
sky 8.3
safety 8.3
protection 8.2
landscape 8.2
factory 7.8
destruction 7.8
spooky 7.8
scene 7.8
fear 7.7
mystery 7.7
power 7.6
house 7.5
work 7.4
inside 7.4
sun 7.2
art 7.2
black 7.2
structure 7.2

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 86.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 28-38
Gender Female, 95.1%
Happy 96.1%
Surprised 6.3%
Fear 6.2%
Sad 2.5%
Angry 0.9%
Calm 0.5%
Disgusted 0.2%
Confused 0.1%

Feature analysis

Amazon

Person 97.3%
Adult 96.8%
Female 96.8%
Woman 96.8%

Text analysis

Amazon

College
Art
and
(Harvard
Fellows
Museums)
of
University
Harvard
© President and Fellows of Harvard College (Harvard University Art Museums)
President
P1970.4544.0006
©

Google

© President and Fellows of Harvard College (Harvard University Art Museums) P1970.4544.0006
©
President
and
Fellows
of
Harvard
College
(
University
Art
Museums
)
P1970.4544.0006