Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960-February 2, 1960

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4532.3

Machine Generated Data

Tags

Amazon
created on 2023-10-06

People 99.8
Person 98.9
Person 98.8
Adult 98.8
Male 98.8
Man 98.8
Person 98.5
Person 98.4
Adult 98.4
Male 98.4
Man 98.4
Person 98.2
Adult 98.2
Adult 98.2
Bride 98.2
Female 98.2
Female 98.2
Wedding 98.2
Woman 98.2
Person 98.1
Adult 98.1
Female 98.1
Woman 98.1
Dancing 98
Leisure Activities 98
Person 97.9
Person 97.6
Clothing 96.9
Shorts 96.9
Person 94.1
Adult 94.1
Adult 94.1
Bride 94.1
Female 94.1
Female 94.1
Woman 94.1
Person 82.5
Face 76
Head 76
Dance Pose 56
Back 55.8
Body Part 55.8
Monk 55.2
Skirt 55.1
Festival 55

Clarifai
created on 2018-05-10

people 99.9
group together 99
group 97.2
many 97
crowd 94.9
man 94.7
adult 94
wear 92.8
street 90.7
music 89.7
dancer 86.8
dancing 86.7
competition 86.1
parade 85
woman 84.7
motion 84.5
outfit 81.2
costume 81.1
athlete 81.1
child 80.9

Imagga
created on 2023-10-06

runner 29
athlete 27.5
horse 24.8
person 17.2
contestant 17.1
crutch 17
man 15.4
people 14.5
animal 14.1
vaulting horse 13.7
sport 12.6
stick 12.5
staff 12.4
outdoor 12.2
outdoors 12
trombone 12
brass 11.8
male 11.3
grunge 11.1
black 10.8
adult 10.5
old 10.4
art 10.4
wind instrument 9.5
field 9.2
dark 9.2
competition 9.1
gymnastic apparatus 8.6
sports equipment 8.6
summer 8.4
pedestrian 8.1
dirty 8.1
mammal 7.9
grass 7.9
world 7.7
travel 7.7
run 7.7
musical instrument 7.7
fountain 7.5
farm 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 90

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 34-42
Gender Female, 95.1%
Calm 50.2%
Surprised 33.8%
Confused 9.7%
Fear 8.5%
Sad 3.7%
Happy 3.2%
Angry 2%
Disgusted 1.5%

AWS Rekognition

Age 31-41
Gender Female, 94.6%
Calm 87.2%
Surprised 6.9%
Fear 6%
Happy 3.7%
Confused 2.6%
Sad 2.5%
Angry 2.1%
Disgusted 1.8%

AWS Rekognition

Age 31-41
Gender Male, 96.7%
Fear 88.3%
Sad 10%
Surprised 7.3%
Angry 3.9%
Happy 3.4%
Calm 3.1%
Confused 2.7%
Disgusted 2.1%

AWS Rekognition

Age 38-46
Gender Female, 60.2%
Fear 97.5%
Surprised 6.4%
Sad 2.4%
Calm 1.1%
Angry 0.9%
Happy 0.6%
Disgusted 0.5%
Confused 0.2%

AWS Rekognition

Age 22-30
Gender Female, 88%
Fear 80.4%
Sad 21.5%
Surprised 8.1%
Calm 5.3%
Happy 5%
Angry 2.4%
Disgusted 2%
Confused 1.9%

Feature analysis

Amazon

Person 98.9%
Adult 98.8%
Male 98.8%
Man 98.8%
Bride 98.2%
Female 98.2%
Woman 98.2%

Categories

Text analysis

Amazon

College
and
Art
(Harvard
Fellows
Museums)
of
Harvard
University
President
© President and Fellows of Harvard College (Harvard University Art Museums)
P1970.4532.0003
©
RADETS

Google

O President and Fellows of Harvard College (Harvard University Art Museums) P1970.4532.0003
O
President
and
Fellows
of
Harvard
College
(Harvard
University
Art
Museums)
P1970.4532.0003