Human Generated Data

Title

Untitled

Date

November 9, 1983

People

Artist: Jeffrey Lyle Gerlinger, American, born 1952

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Acquired through the Deknatel Purchase Fund, 1985.29

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Soil 98.9
Art 78.8
Painting 78.8
Human 78.7
Drawing 75.2
Sketch 65.7
Tree 59.0
Plant 59.0
Person 44.1
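
These labels follow the output format of the AWS Rekognition DetectLabels API. As a minimal sketch of how such tags can be reproduced with boto3 (the image filename, credentials, and confidence threshold are assumptions, not part of this record):

import boto3

# Hypothetical filename for the digitized drawing; assumes default AWS credentials.
client = boto3.client("rekognition")
with open("1985_29.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,        # cap the number of labels returned
        MinConfidence=40.0,  # assumed cutoff; the list above bottoms out near 44
    )
for label in response["Labels"]:
    # Rekognition reports confidence as a percentage, e.g. "Soil 98.9"
    print(label["Name"], round(label["Confidence"], 1))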

Clarifai
created on 2019-06-01

anatomy 99.7
science 99.0
medicine 97.9
illustration 97.6
art 97.6
medical 97.5
desktop 96.3
disease 96.2
abstract 95.9
texture 95.2
neurology 94.5
pattern 94.5
biology 94.1
head 92.8
human 92.2
visuals 91.1
design 91.0
people 90.1
artistic 90.1
genius 89.8

Imagga
created on 2019-06-01

sketch 100.0
drawing 76.8
representation 63.7
sculpture 41.8
ancient 39.8
stone 38.8
temple 36.5
art 33.5
religion 33.2
statue 28.9
architecture 26.7
culture 26.5
carving 26.2
old 24.4
history 23.3
relief 22.3
god 20.1
religious 18.7
monument 18.7
travel 17.6
carved 17.6
face 14.2
texture 13.9
pattern 13.7
decoration 13.5
design 13.3
sandstone 12.8
century 12.7
historic 11.9
tourism 11.6
spirituality 11.5
close 11.4
east 11.2
decorative 10.9
ruins 10.7
ruin 10.7
heritage 10.6
meditation 10.5
detail 10.5
historical 10.4
building 10.3
traditional 10.0
southeast 9.9
spiritual 9.6
antique 9.5
oriental 9.4
famous 9.3
carve 8.8
wall 8.6
backdrop 8.2
style 8.2
artistic 7.8
wisdom 7.8
holy 7.7
figure 7.7
marble 7.7
head 7.6
china 7.5
vintage 7.5
church 7.4
exterior 7.4
decor 7.1
surface 7.1

Google
created on 2019-06-01

Drawing 88.7
Sketch 78.5
Line 75.5
Jaw 57.4
Black-and-white 56.4
Artwork 54.3
Illustration 50.4
Art 50.2
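
Google's labels come from the Cloud Vision label-detection feature, which reports scores in [0, 1] rather than percentages. A minimal sketch using the google-cloud-vision client (the filename and credentials are assumptions):

from google.cloud import vision

# Assumes application-default credentials and a hypothetical local image file.
client = vision.ImageAnnotatorClient()
with open("1985_29.jpg", "rb") as f:
    image = vision.Image(content=f.read())
response = client.label_detection(image=image)
for label in response.label_annotations:
    # Scale scores to percentages to match the list above, e.g. "Drawing 88.7"
    print(label.description, round(label.score * 100, 1))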

Microsoft
created on 2019-06-01

sketch 99.5
drawing 99.1
abstract 95.0
art 90.9
ink 69.3
painting 63.0
tree 50.0
plant 41.8

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Female, 54.2%
Disgusted 3.1%
Calm 39.2%
Surprised 8.9%
Angry 18.4%
Happy 2.5%
Sad 20.4%
Confused 7.5%
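
The age range, gender, and emotion scores above match the output of Rekognition's DetectFaces API. A minimal sketch (the filename and credentials are assumptions):

import boto3

client = boto3.client("rekognition")
with open("1985_29.jpg", "rb") as f:  # hypothetical filename
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")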

Feature analysis

Amazon

Painting 78.8%
Person 44.1%

Categories

Imagga

pets animals 65.2%
people portraits 18.9%
paintings art 14.8%

Captions

Microsoft
created on 2019-06-01

a close up of a tree 56.5%
close up of a tree 49.1%
a close up of a tree trunk 42.2%
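
These caption candidates match the Describe Image operation of Azure's Computer Vision service, which returns ranked captions with confidences in [0, 1]. A minimal sketch using the azure-cognitiveservices-vision-computervision client (the endpoint, key, and filename are placeholders):

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<region>.api.cognitive.microsoft.com",       # placeholder endpoint
    CognitiveServicesCredentials("<subscription-key>"),   # placeholder key
)
with open("1985_29.jpg", "rb") as f:  # hypothetical filename
    result = client.describe_image_in_stream(f, max_candidates=3)
for caption in result.captions:
    # e.g. "a close up of a tree 56.5"
    print(caption.text, round(caption.confidence * 100, 1))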