Human Generated Data

Title

Untitled (Horyuji Temple, Nara, Japan)

Date

March 18, 1960

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5442

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Face 100
Head 100
Photography 100
Portrait 100
Art 99.1
Person 97.4
Adult 97.4
Male 97.4
Man 97.4
Painting 97
Drawing 88.1
Sculpture 57
Statue 57
Archaeology 56.2
Architecture 56.1
Building 56.1
Monastery 56.1
Skin 55.2
Tattoo 55.2

Clarifai
created on 2018-05-10

people 98.9
portrait 98.5
one 97
art 96.6
man 96.3
illustration 95.2
old 93.8
adult 92.8
monochrome 90.4
antique 89.5
sculpture 89.2
retro 87.8
visuals 87.6
ancient 87.3
paper 86.1
vintage 85.5
face 84.5
painting 82.4
war 81.1
no person 81

Imagga
created on 2023-10-06

sculpture 76
mask 69.6
statue 56.7
bust 56.5
covering 55.6
disguise 49.5
art 39.6
attire 34.3
plastic art 32.2
face 28.4
culture 27.4
ancient 25.1
religion 24.2
clothing 21.2
temple 21
head 21
stone 20.8
figure 18.9
old 18.8
history 17
god 16.3
travel 16.2
architecture 15.6
portrait 15.5
decoration 15.1
religious 15
monument 14
carving 12.7
close 12.6
traditional 12.5
cadaver 12.3
tourism 11.6
antique 11.3
consumer goods 11.2
historic 11
oriental 10.4
eyes 10.3
famous 10.2
man 10.1
decorative 10
cemetery 10
currency 9.9
carved 9.8
human 9.8
one 9.7
spirituality 9.6
china 9.4
church 9.3
male 9.2
look 8.8
golden 8.6
money 8.5
historical 8.5
east 8.4
people 8.4
dollar 8.4
tradition 8.3
cash 8.2
crazy 8
design 7.8
marble 7.8
scary 7.8
closeup 7.4
tourist 7.3

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

text 99.1
book 90.2
old 62.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 63-73
Gender Female, 69%
Calm 94.1%
Surprised 6.9%
Fear 6.1%
Sad 2.6%
Happy 1.3%
Disgusted 0.5%
Angry 0.4%
Confused 0.3%

Feature analysis

Amazon

Person 97.4%
Adult 97.4%
Male 97.4%
Man 97.4%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2018-05-10

an old photo of a person 66.9%
old photo of a person 61.5%
an old photo of a book 33.3%