Human Generated Data

Title

Cloud Study

Date

c. 1988

People

Artist: Elena Prentice, American, born 1946

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Elena Prentice in memory of Gweneth Knight, 2004.183.18

Machine Generated Data

Tags

Amazon
created on 2019-04-06

Canvas 99.3
Art 66
Pottery 63.1

Clarifai
created on 2018-03-23

desktop 96.5
texture 96.4
picture frame 94.9
vintage 94.2
dirty 94.2
pattern 93.9
medicine 93.2
abstract 91.4
paper 90.7
retro 90.4
blank 89.9
wear 89.3
graphic 89.3
no person 87.7
margin 87.7
art 87.5
design 86.4
sign 85.9
square 85.5
shape 85.4

Imagga
created on 2018-03-23

bag 54.3
purse 49.1
container 45.5
paper 30.1
grunge 28.1
blank 26.6
texture 26.4
vintage 24.8
old 23.7
antique 20.8
brown 20.6
frame 18.3
aged 18.1
ancient 17.3
retro 17.2
page 16.7
dirty 15.4
design 14.7
grungy 14.2
piece of cloth 14.2
textured 14
object 13.9
pad 13.8
close 13.7
border 13.6
space 13.2
text 13.1
art 13.1
letter 12.8
piece 12.1
detail 12.1
empty 12
rough 11.8
fabric 11.8
torn 11.6
business 11.5
parchment 11.5
closeup 11.5
black 11.4
sheet 11.3
wall 11.1
burnt 10.7
worn 10.5
money 10.2
mailbag 10.1
nobody 10.1
sign 9.8
key 9.3
number 9.3
yellow 9.3
grain 9.2
material 8.9
color 8.9
wallet 8.9
garment 8.9
computer 8.8
crumpled 8.7
scrapbook 8.7
leather 8.5
card 8.5
diaper 8.5
keyboard 8.4
wallpaper 8.4
document 8.4
honeycomb 8
rubber eraser 7.9
burned 7.9
keypad 7.8
used 7.7
framework 7.7
aging 7.7
type 7.7
canvas 7.6
pattern 7.5
one 7.5
backdrop 7.4
cover 7.4
natural 7.4
note 7.3
breakfast 7.1

Google
created on 2018-03-23

sky 59.9

Captions

Microsoft

a piece of cake 42.8%
a piece of cake on a table 27.6%
a piece of cake on a white surface 25%