Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960-February 2, 1960

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4529.5

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 98.2
Male 98.2
Man 98.2
Person 98.2
Person 98
Person 97.5
Person 95.6
Person 95.6
Person 94.9
Person 93
Machine 92.7
Wheel 92.7
Musical Instrument 89.4
Person 83
Face 73
Head 73
Leisure Activities 71.9
Music 71.9
Musician 71.9
Performer 71.9
Person 71.1
Gong 57.5
Clothing 57.2
Hat 57.2
Sun Hat 55.8
Drum 55.4
Percussion 55.4
Group Performance 55.1
Music Band 55.1
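
The label/confidence pairs above match the output format of Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags could be produced with boto3, assuming AWS credentials are configured; the file name and thresholds below are illustrative:

```python
# Minimal sketch: label/confidence tags via Amazon Rekognition DetectLabels.
# The file name, MaxLabels, and MinConfidence values are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition")

with open("P1970.4529.5.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # return up to 50 labels
    MinConfidence=55.0,  # roughly the lowest confidence shown above
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```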

Clarifai
created on 2018-05-10

people 99.9
adult 99.2
group together 99.2
group 98.1
monochrome 97.6
man 96.7
many 96.3
vehicle 93.8
street 93
child 92.7
administration 91.1
war 90.1
several 89.6
woman 89.1
transportation system 87.3
wear 85.9
military 85.9
cavalry 82.9
weapon 81.8
crowd 80.5
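
These concept/confidence pairs are in the form returned by Clarifai's general prediction model. A hedged sketch against the Clarifai v2 REST API; the API key, model ID, and image URL are placeholders, and the exact general-model ID should be taken from Clarifai's documentation:

```python
# Rough sketch: general-model concept tagging via the Clarifai v2 REST API.
# API key, model ID, and image URL are placeholders/assumptions.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"                    # placeholder
MODEL_ID = "general-image-recognition"               # assumed general-model ID
IMAGE_URL = "https://example.org/P1970.4529.5.jpg"   # hypothetical image URL

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Concept values are 0-1; scale to match the percentages listed above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```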

Imagga
created on 2023-10-06

chime 61.9
percussion instrument 51
musical instrument 38.4
plow 31.2
vehicle 28.9
tool 25.9
tractor 20.6
machine 20
landscape 18.6
industry 17.1
old 16.7
transportation 16.1
tree 15.4
rural 15
outdoors 14.9
equipment 14.8
industrial 14.5
wheeled vehicle 14
transport 13.7
sky 13.4
grass 12.6
tricycle 12.5
fence 12.2
building 12
snow 11.9
work 11.8
architecture 11.7
field 11.7
outdoor 11.5
heavy 11.4
stone 10.8
machinery 10.7
farm 10.7
picket fence 10.3
outside 10.3
winter 10.2
light 10
city 10
truck 9.7
agriculture 9.6
wheel 9.4
construction 9.4
device 9.3
power 9.2
travel 9.1
park 9.1
structure 8.9
gravestone 8.8
wooden 8.8
country 8.8
water 8.7
scene 8.6
ancient 8.6
summer 8.4
memorial 8.3
shovel 8.3
landmark 8.1
yellow 7.9
black 7.8
season 7.8
tire 7.8
cold 7.7
dirt 7.6
cemetery 7.4
barrier 7.3
sun 7.2
dirty 7.2
working 7.1
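
The tags above follow the format of Imagga's tagging endpoint. A rough sketch against the documented v2 /tags endpoint, with the credentials and image URL as placeholders:

```python
# Rough sketch: tag extraction via the Imagga v2 /tags endpoint.
# API key/secret and image URL are placeholders; field names follow Imagga's
# documented v2 response format and should be verified against it.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"                      # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"                # placeholder
IMAGE_URL = "https://example.org/P1970.4529.5.jpg"   # hypothetical image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```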

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 98.1
old 49.1
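
These tags, and the caption candidates listed under Captions further below, are the kind of output produced by the Azure Computer Vision analyze operation. A hedged sketch against the v3.2 REST endpoint; the resource endpoint, key, and image URL are placeholders:

```python
# Rough sketch: tags and caption candidates via Azure Computer Vision's
# v3.2 "analyze" REST operation. Endpoint, key, and image URL are placeholders.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_VISION_KEY"                                   # placeholder
IMAGE_URL = "https://example.org/P1970.4529.5.jpg"              # hypothetical image URL

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags,Description"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()
analysis = resp.json()

# Confidences are 0-1; scale to match the percentages listed above.
for tag in analysis["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')

for caption in analysis["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}')
```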

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 38-46
Gender Male, 87.5%
Calm 100%
Surprised 6.3%
Fear 5.9%
Sad 2.1%
Confused 0%
Happy 0%
Disgusted 0%
Angry 0%

AWS Rekognition

Age 45-53
Gender Female, 92.7%
Calm 99%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Happy 0.1%
Disgusted 0.1%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 16-24
Gender Female, 85.5%
Angry 77.4%
Calm 11.4%
Surprised 6.5%
Disgusted 6.3%
Fear 6.1%
Sad 2.5%
Happy 1.6%
Confused 1%

AWS Rekognition

Age 33-41
Gender Female, 99.4%
Happy 80.5%
Disgusted 9.7%
Surprised 7%
Fear 6.1%
Calm 5%
Sad 2.4%
Angry 1%
Confused 0.9%

AWS Rekognition

Age 18-26
Gender Female, 84.1%
Fear 94.6%
Calm 9.6%
Surprised 6.6%
Sad 3%
Disgusted 1.2%
Happy 0.7%
Angry 0.7%
Confused 0.7%

AWS Rekognition

Age 43-51
Gender Male, 77.8%
Sad 100%
Surprised 6.3%
Fear 6%
Calm 1%
Disgusted 0.7%
Confused 0.2%
Happy 0.2%
Angry 0.2%
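
The per-face age ranges, gender estimates, and emotion scores above match the output of Amazon Rekognition's DetectFaces operation with all attributes requested. A minimal boto3 sketch, assuming AWS credentials are configured; the file name is illustrative:

```python
# Minimal sketch: per-face age, gender, and emotion estimates via
# Amazon Rekognition DetectFaces with Attributes=["ALL"].
import boto3

rekognition = boto3.client("rekognition")

with open("P1970.4529.5.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back in upper case (e.g. "CALM"); sort by confidence.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```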

Feature analysis

Amazon

Adult 98.2%
Male 98.2%
Man 98.2%
Person 98.2%
Wheel 92.7%

Categories

Captions

Microsoft
created on 2018-05-10

an old photo of a person 83.5%
old photo of a person 81%
an old photo of a boy 56%

Text analysis

Amazon

College
Art
and
(Harvard
Fellows
of
Museums)
University
Harvard
President
© President and Fellows of Harvard College (Harvard University Art Museums)
SEKAR
CULANC
SEKAR - CULANC
P1970.4529.0005
©
-
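
The mix of whole phrases and single words above is characteristic of Amazon Rekognition's DetectText operation, which reports both LINE and WORD detections. A minimal boto3 sketch, with the file name again illustrative:

```python
# Minimal sketch: line and word text detections via Amazon Rekognition DetectText.
import boto3

rekognition = boto3.client("rekognition")

with open("P1970.4529.5.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    print(f'{detection["Type"]}: {detection["DetectedText"]} '
          f'({detection["Confidence"]:.1f}%)')
```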

Google

O President and Fellows of Harvard College (Harvard University Art Museums) P1970.4529.0005
O
President
and
Fellows
of
Harvard
College
(Harvard
University
Art
Museums)
P1970.4529.0005
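
The Google results, a full transcript followed by the individual words, match the output of the Cloud Vision API's text detection. A minimal sketch with the Python client library, assuming Google Cloud credentials are configured; the file name is illustrative:

```python
# Minimal sketch: OCR via the Google Cloud Vision API. text_detection returns
# the full transcript as the first annotation, followed by individual words.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("P1970.4529.5.jpg", "rb") as f:  # hypothetical local copy of the photograph
    content = f.read()

response = client.text_detection(image=vision.Image(content=content))

for annotation in response.text_annotations:
    print(annotation.description)
```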