Human Generated Data

Title

Untitled (Barong Dance, Bali)

Date

February 2, 1960 - February 17, 1960

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4540.2

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2023-10-05

Adult 97.6
Male 97.6
Man 97.6
Person 97.6
Person 94
Adult 93.5
Person 93.5
Female 93.5
Woman 93.5
Person 83.9
Face 82.9
Head 82.9
Person 73.2
Person 66
Clothing 65
Footwear 65
Shoe 65
People 62.3
Person 61.1
Body Part 57.5
Finger 57.5
Hand 57.5
Hat 56.4
Animal 56.3
Bird 56.3
Pigeon 56.3
Crowd 56.1
Tartan 55.6
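
The label confidences above are the kind of output returned by the AWS Rekognition DetectLabels API. A minimal sketch with boto3, assuming configured AWS credentials; the image file name is hypothetical:

import boto3

rekognition = boto3.client("rekognition")

# Read the photograph as raw bytes (hypothetical file name).
with open("barong_dance_bali.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=30,
    MinConfidence=55,
)

# Print each label with its confidence, matching the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")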

Clarifai
created on 2018-05-10

people 99.8
adult 98.4
man 97.2
woman 96.8
wear 96
group 95.8
monochrome 95.7
war 92.7
military 91.2
one 89.3
group together 89.1
indoors 85.7
many 85.1
street 84.7
child 83.9
dark 82.3
two 80.4
old 79.6
recreation 79.1
skirmish 76.7

Imagga
created on 2023-10-05

cave 100
geological formation 100
dark 25.1
man 20.2
light 17.4
old 16.7
mine 15.9
stone 15.2
darkness 13.7
mystery 13.5
travel 13.4
water 13.4
rock 13
dirty 12.7
park 12.4
tree 12.3
night 11.6
person 11.4
ancient 11.2
excavation 11.1
grunge 11.1
danger 10.9
wet 10.7
tourism 10.7
mysterious 10.7
landscape 10.4
adult 10.4
geology 9.7
horror 9.7
wall 9.5
architecture 9.4
natural 9.4
male 9.2
outdoor 9.2
earth 9.1
silhouette 9.1
black 9
people 8.9
mask 8.8
fog 8.7
culture 8.5
historical 8.5
fire 8.4
monument 8.4
sky 8.3
inside 8.3
protection 8.2
style 8.2
fantasy 8.1
autumn 7.9
fear 7.7
art 7.7
ground 7.6
power 7.6
vintage 7.4
industrial 7.3
tourist 7.3
color 7.2
building 7.2
history 7.2
cool 7.1
scenic 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

nature 97.8
cave 24.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Male, 99.4%
Calm 99.4%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Confused 0.1%
Happy 0%
Disgusted 0%
Angry 0%

AWS Rekognition

Age 30-40
Gender Male, 99.5%
Calm 57.6%
Sad 39.5%
Fear 13.9%
Surprised 6.3%
Angry 2.6%
Disgusted 1.4%
Happy 0.2%
Confused 0.1%

AWS Rekognition

Age 35-43
Gender Male, 86.7%
Sad 98.5%
Calm 15.5%
Fear 11.9%
Surprised 7.1%
Angry 5.3%
Happy 5.1%
Confused 3.8%
Disgusted 1.4%

AWS Rekognition

Age 33-41
Gender Male, 86.5%
Calm 83.8%
Surprised 6.5%
Fear 6.1%
Sad 5.4%
Confused 4.5%
Happy 2.5%
Disgusted 1%
Angry 0.4%

AWS Rekognition

Age 24-34
Gender Male, 97.2%
Calm 65.8%
Sad 17.6%
Confused 11.6%
Surprised 7.4%
Fear 6.2%
Disgusted 1.7%
Happy 1.4%
Angry 0.8%

AWS Rekognition

Age 27-37
Gender Male, 97.8%
Sad 90%
Calm 18.9%
Fear 14.1%
Confused 9.1%
Disgusted 8.8%
Surprised 6.9%
Happy 3.7%
Angry 1.8%

AWS Rekognition

Age 22-30
Gender Male, 88.2%
Calm 89.8%
Fear 8.1%
Surprised 6.4%
Disgusted 3.7%
Sad 2.3%
Confused 0.4%
Happy 0.3%
Angry 0.2%

AWS Rekognition

Age 18-24
Gender Male, 87.8%
Calm 37.6%
Confused 27.5%
Fear 12.6%
Surprised 7.5%
Disgusted 6.6%
Sad 6.1%
Happy 3.8%
Angry 2.2%
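
Per-face age ranges, gender estimates, and emotion scores like those listed above come from Rekognition's DetectFaces call with all facial attributes requested. A hedged sketch, again with a hypothetical file name:

import boto3

rekognition = boto3.client("rekognition")

with open("barong_dance_bali.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include AgeRange, Gender, Emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotion confidences are reported independently and need not sum to 100,
    # which is why the percentages above can exceed 100 in total.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")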

Feature analysis

Amazon

Adult 97.6%
Male 97.6%
Man 97.6%
Person 97.6%
Female 93.5%
Woman 93.5%
Shoe 65%

Captions

Microsoft
created on 2018-05-10

a person standing in a room 62.4%
an old photo of a person 43.6%
an old photo of a person 39.5%

Text analysis

Amazon

College
Art
and
Fellows
(Harvard
Museums)
of
University
President
Harvard
© President and Fellows of Harvard College (Harvard University Art Museums)
P1970.4540.0002
©
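
Rekognition's DetectText API returns both LINE and WORD detections, which is why the list above mixes whole phrases (the credit line and accession number) with individual words. A minimal sketch, with the same hypothetical file name:

import boto3

rekognition = boto3.client("rekognition")

with open("barong_dance_bali.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is tagged LINE or WORD; LINE entries hold the full phrases.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])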

Google

© President and Fellows of Harvard College (Harvard University Art Museums) P1970.4540.0002
©
President
and
Fellows
of
Harvard
College
(
University
Art
Museums
)
P1970.4540.0002
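
The Google results above follow the usual Cloud Vision pattern: the first text annotation is the full detected string and the remaining annotations are the individual words. A minimal sketch with the google-cloud-vision client, assuming configured Google Cloud credentials; the image file name is hypothetical:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("barong_dance_bali.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the complete text block; the rest are single tokens,
# matching the structure of the list above.
for annotation in response.text_annotations:
    print(annotation.description)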