Human Generated Data

Title

Untitled (Horyuji Temple, Nara, Japan)

Date

March 18, 1960

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5445

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Face 100
Head 100
Photography 100
Portrait 100
Person 99.2
Adult 99.2
Male 99.2
Man 99.2
Skin 94.9
Tattoo 94.9
Furniture 90.6
Art 77.9
Painting 77.9
Wood 68.2
Chair 62.6
Lady 57.9
Body Part 56.6
Finger 56.6
Hand 56.6
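
The Amazon tags above pair a label with a confidence score (0-100), which matches the shape of output from the AWS Rekognition DetectLabels operation. The record does not say exactly how these values were produced; the following is a minimal sketch of how similar tags could be generated, assuming boto3 credentials are configured and the photograph is available as a local file (the filename and region are hypothetical, not taken from the record).

    import boto3

    # Rekognition client; the region is an assumption for illustration
    client = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the photograph
    with open("P1970_5445.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with confidence scores (0-100)
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")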

Clarifai
created on 2018-05-10

people 99.6
one 98
portrait 97.1
man 94.8
adult 93.8
art 91.2
wear 89.5
old 87.3
military 85.1
uniform 82.2
war 80.8
leader 80.4
veil 80
music 79.1
retro 79
costume 77.4
administration 77.1
actor 73.5
sit 72.2
outfit 71.9

Imagga
created on 2023-10-06

statue 94.8
sculpture 51.4
art 34.8
religion 34.1
ancient 32
crazy 30.5
culture 29.9
temple 27.6
history 26.9
stone 24.5
carving 24.5
old 24.4
architecture 24.2
religious 20.6
face 20.6
monument 20.6
travel 20.4
god 19.1
figure 17.8
bust 17.1
antique 15.6
decoration 14.9
historic 14.7
plastic art 13.9
spirituality 13.4
head 12.6
holy 12.5
tourism 12.4
portrait 12.3
mask 12.2
east 12.1
oriental 11.8
traditional 11.6
spiritual 11.5
historical 11.3
famous 11.2
church 11.1
tourist 10.9
carved 10.8
china 10.5
one 10.5
memorial 10.1
man 10.1
people 10
pray 9.7
marble 9.7
currency 9
design 8.9
statues 8.9
century 8.8
person 8.8
hundred 8.7
worship 8.7
cadaver 8.6
faith 8.6
golden 8.6
money 8.5
world 8.4
dollar 8.4
vintage 8.3
museum 8.3
cash 8.2
peace 8.2
structure 8.2
sculptures 7.9
southeast 7.9
male 7.8
catholic 7.8
ruin 7.8
prayer 7.7
meditation 7.7
bill 7.6
decorative 7.5
human 7.5
disguise 7.5
city 7.5
close 7.4
covering 7.3
protection 7.3
building 7.2
paper 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

text 97.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 67-77
Gender Female, 100%
Calm 52.2%
Happy 40.1%
Surprised 7.1%
Fear 6.2%
Sad 2.6%
Confused 1.7%
Disgusted 1.2%
Angry 1.1%
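
The age range, gender, and emotion percentages above follow the shape of AWS Rekognition DetectFaces output when all facial attributes are requested. As with the labels, the exact pipeline is not documented here; this is a minimal sketch under the same assumptions (boto3 configured, hypothetical local image file).

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the photograph
    with open("P1970_5445.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each face detail
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")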

Feature analysis

Amazon

Person 99.2%
Adult 99.2%
Male 99.2%
Man 99.2%

Categories

Imagga

paintings art 96.3%
people portraits 3.4%