Human Generated Data

Title

Untitled

Date

1995

People

Artist: Beverley Semmes, American, b. 1958

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Sarah-Ann and Werner H. Kramarsky, M25070

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2021-04-03

Art 82.5
Drawing 82.5
Apron 73.4
Furniture 65.7
Rug 61.8
Doodle 56.2

Clarifai
created on 2021-04-03

vector 98.9
illustration 98.9
design 98.3
art 97.5
chalk out 97.1
graphic 96.6
no person 96.3
retro 95.9
decoration 95.7
silhouette 95.2
pattern 95.1
scribble 94.4
element 93.4
vintage 90.5
print 89.9
outline 89.7
style 88.9
symbol 88.9
desktop 88
abstract 85.9

Imagga
created on 2021-04-03

drawing 94.6
sketch 93.4
representation 51.7
cartoon 31.2
art 25.9
berry 20.9
outline 19.9
design 18.6
black 18.6
line 18
edible fruit 16.7
silhouette 15.7
style 14.8
fashion 13.6
clip 13
clip art 13
symbol 12.1
man 11.5
icon 11.1
shape 10.3
women 10.3
fruit 10
fun 9.7
contour 9.7
pattern 8.9
comic 8.7
men 8.6
male 8.5
set 8.5
hand 8.4
hat 8.2
retro 8.2
graphic 8
love 7.9
holiday 7.9
animal 7.9
ornament 7.8
modern 7.7
arabesque 7.7
antique 7.6
element 7.4
clothing 7.4
object 7.3
artwork 7.3
cheerful 7.3
smiling 7.2
painting 7.2
fantasy 7.2
cute 7.2
person 7.1

Google
created on 2021-04-03

White 92.2
Black 89.5
Sleeve 86.5
Art 85.8
Dress 85.6
Painting 85.4
Gesture 85.3
Font 84.2
Style 84.1
Line 82.4
Black-and-white 82.2
Fashion design 79.2
Pattern 78.3
Illustration 75.5
One-piece garment 75.1
Monochrome 73.3
Line art 73.3
Human leg 73.3
Waist 73.1
Drawing 72.4

Microsoft
created on 2021-04-03

sketch 98.3
drawing 98.3
cartoon 96.7
art 91.4
illustration 85.7
ink 76.1
text 68.2
child art 66.3
fabric 13.6

Feature analysis

Amazon

Rug 61.8%

Captions

Microsoft

a close up of a logo 77.3%
a close up of a piece of paper 66.9%
close up of a logo 66.8%