Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960 – February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4528.5

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Clothing 100
Shorts 100
People 99.9
Person 98.9
Adult 98.9
Male 98.9
Man 98.9
Person 98.5
Person 98.4
Dancing 95.9
Leisure Activities 95.9
Person 94.7
Adult 94.7
Male 94.7
Man 94.7
Person 94.1
Person 92.7
Person 90.9
Person 87.1
Person 85.1
Bicycle 79.5
Transportation 79.5
Vehicle 79.5
Person 78
Adult 78
Male 78
Man 78
Person 77.9
Head 76.3
Face 75.5
Machine 72.5
Wheel 72.5
Outdoors 67.2
Musical Instrument 65.1
Person 60.7
Skirt 57.9
Crowd 56.8
Photography 56.5
Back 56.3
Body Part 56.3
Nature 56.1
Hippie 55.1
Dance Pose 55

Clarifai
created on 2018-05-10

people 99.9
group together 98
group 97.7
man 96.8
monochrome 96
adult 95.8
many 95.5
wear 93
woman 92.4
crowd 88.5
several 86
street 85.8
music 85.4
child 84.5
administration 82.4
military 80.5
war 80.2
dancing 79.3
leader 78.7
ceremony 75.1

Imagga
created on 2023-10-07

astronaut 26.7
people 21.7
man 21.5
person 19.4
sport 17.5
male 16.3
private 14.9
statue 14.3
black 13.8
adult 13.7
men 12
travel 12
city 11.6
outdoor 11.5
old 11.1
crutch 11
architecture 10.9
lifestyle 10.8
cricket equipment 10.6
sculpture 10.5
sky 10.2
clothing 9.8
human 9.7
art 9.3
uniform 9.2
danger 9.1
exercise 9.1
world 9
history 8.9
sports equipment 8.6
walking 8.5
park 8.2
freedom 8.2
outdoors 8.2
active 8.1
activity 8.1
building 7.9
life 7.8
staff 7.7
stone 7.7
motion 7.7
athlete 7.6
historical 7.5
stick 7.5
action 7.4
religion 7.2
wicket 7.1
portrait 7.1
women 7.1
grass 7.1
day 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 93.3
text 86.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Female, 83.2%
Calm 39%
Sad 22.9%
Fear 22.6%
Confused 7.3%
Surprised 7.2%
Happy 5.1%
Disgusted 3%
Angry 2.5%

AWS Rekognition

Age 29-39
Gender Female, 79%
Sad 40.8%
Confused 39.7%
Angry 10.6%
Calm 10.2%
Surprised 7.2%
Disgusted 6.8%
Happy 6.2%
Fear 6.1%

Feature analysis

Amazon

Person 98.9%
Adult 98.9%
Male 98.9%
Man 98.9%
Bicycle 79.5%
Wheel 72.5%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

© President and Fellows of Harvard College (Harvard University Art Museums)
P1970.4528.0005
DJENEAN

Google

© President and Fellows of Harvard College (Harvard University Art Museums) P1970.4528.0005