Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960-February 2, 1960

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4520.4

Machine Generated Data

Tags

Amazon
created on 2023-10-06

People 99
Person 98.8
Adult 98.8
Male 98.8
Man 98.8
Person 98.2
Adult 98.2
Male 98.2
Man 98.2
Person 98.1
Adult 98.1
Male 98.1
Man 98.1
Formal Wear 97.8
Walking 97.8
Person 97.8
Clothing 97.7
Dress 97.7
Person 96.7
Person 96.3
Person 96.2
Fashion 96
Gown 96
Person 95.8
Person 95.6
Person 95.3
Male 95.3
Boy 95.3
Child 95.3
Person 94.8
Person 94.5
Person 94.4
Coat 93.7
Person 93
Person 91.8
Person 89
Person 87.9
Person 87.6
Person 84.5
Face 79
Head 79
Footwear 77.8
Shoe 77.8
Person 76.5
Person 76.1
Person 71.9
Person 69.7
Person 69.4
Person 69.4
Person 68.8
Crowd 64.7
Shoe 62.9
Shoe 56.9
Hat 56.9
Accessories 56.7
Bag 56.7
Handbag 56.7
Bus Stop 55.5
Outdoors 55.5
Suit 55.3
Market 55

Clarifai
created on 2018-05-10

people 100
group 99.4
many 99.1
group together 98.3
adult 97.7
man 97
military 94.8
war 92.6
soldier 92.3
child 91.8
woman 91.7
wear 91.4
several 90.2
crowd 88.5
administration 87.2
uniform 83.6
leader 82.8
weapon 81.6
street 80.5
boy 80

Imagga
created on 2023-10-06

musical instrument 27.1
percussion instrument 24.7
marimba 20.3
crutch 18.3
people 17.8
travel 15.5
staff 13.6
man 13.4
city 13.3
old 13.2
history 12.5
walking 12.3
world 12.2
person 11.6
stick 11.5
statue 11.4
wind instrument 11.1
danger 10.9
tourism 10.7
adult 10.1
dark 10
male 9.9
religion 9.9
building 9.8
sculpture 9.6
men 9.4
holiday 9.3
tourist 9.2
silhouette 9.1
dirty 9
weapon 8.8
fountain 8.6
historical 8.5
animal 8.4
trombone 8.4
protection 8.2
vacation 8.2
water 8
steel drum 7.9
urban 7.9
destruction 7.8
black 7.8
architecture 7.8
color 7.8
horse 7.7
stone 7.6
brass 7.5
structure 7.5
symbol 7.4
transport 7.3
landmark 7.2
transportation 7.2
night 7.1
sport 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 87.1
group 59.9
people 59.7
crowd 21.5

Face analysis

Amazon

AWS Rekognition

Age 51-59
Gender Male, 52.2%
Happy 66.7%
Calm 21.9%
Surprised 6.9%
Fear 6%
Angry 5.9%
Disgusted 2.4%
Sad 2.3%
Confused 1.3%

AWS Rekognition

Age 41-49
Gender Male, 97.4%
Calm 53.8%
Angry 26.5%
Happy 9.1%
Surprised 6.5%
Fear 6%
Sad 5.3%
Confused 1.7%
Disgusted 1.3%

AWS Rekognition

Age 26-36
Gender Male, 89.1%
Calm 100%
Surprised 6.3%
Fear 5.9%
Sad 2.1%
Disgusted 0%
Confused 0%
Happy 0%
Angry 0%

AWS Rekognition

Age 11-19
Gender Male, 82.8%
Happy 49.7%
Calm 31.8%
Angry 10.3%
Surprised 6.8%
Fear 6.1%
Sad 4.2%
Disgusted 1.2%
Confused 0.5%

AWS Rekognition

Age 22-30
Gender Female, 78%
Sad 88.8%
Calm 53%
Surprised 7%
Fear 6.1%
Disgusted 1.3%
Confused 1.1%
Happy 0.8%
Angry 0.6%

AWS Rekognition

Age 9-17
Gender Male, 97.9%
Calm 92%
Surprised 6.4%
Happy 6%
Fear 6%
Sad 2.3%
Angry 0.8%
Disgusted 0.2%
Confused 0.1%

AWS Rekognition

Age 29-39
Gender Male, 96.3%
Happy 32%
Calm 28.8%
Angry 16.7%
Surprised 10.5%
Fear 6.3%
Disgusted 5.4%
Sad 4.8%
Confused 3.7%

AWS Rekognition

Age 26-36
Gender Male, 95.9%
Calm 99.6%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Disgusted 0.1%
Confused 0.1%
Angry 0.1%
Happy 0.1%

AWS Rekognition

Age 31-41
Gender Female, 81.2%
Calm 99%
Surprised 6.3%
Fear 6%
Sad 2.2%
Happy 0.3%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%

AWS Rekognition

Age 28-38
Gender Male, 99.9%
Calm 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.1%
Disgusted 0%
Confused 0%
Happy 0%
Angry 0%

AWS Rekognition

Age 29-39
Gender Male, 88.5%
Calm 92%
Surprised 7.3%
Fear 5.9%
Sad 2.2%
Happy 2.1%
Confused 1.5%
Angry 1%
Disgusted 1%

AWS Rekognition

Age 33-41
Gender Male, 99.7%
Calm 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Disgusted 0%
Happy 0%
Angry 0%
Confused 0%

AWS Rekognition

Age 29-39
Gender Male, 99.1%
Calm 97.6%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 1.6%
Disgusted 0.3%
Happy 0.1%
Angry 0.1%

AWS Rekognition

Age 24-34
Gender Female, 87.9%
Calm 71.9%
Surprised 19%
Fear 6.2%
Happy 4.1%
Sad 4%
Disgusted 1.6%
Angry 1.6%
Confused 0.9%

AWS Rekognition

Age 21-29
Gender Male, 61.1%
Happy 44.7%
Calm 25.7%
Sad 11.9%
Confused 9.5%
Surprised 7.2%
Fear 6.1%
Disgusted 3.9%
Angry 1%

Feature analysis

Amazon

Person 98.8%
Adult 98.8%
Male 98.8%
Man 98.8%
Boy 95.3%
Child 95.3%
Coat 93.7%
Shoe 77.8%

Text analysis

Amazon

© President and Fellows of Harvard College (Harvard University Art Museums)
P1970.4520.0004

Google

O President and Fellows of Harvard College (Harvard University Art Museums) P1970.4520.0004