Human Generated Data

Title

Untitled #5

Date

1972

People

Artist: Brice Marden, American 1938 - 2023

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Deknatel Purchase Fund, M15871

Copyright

© Brice Marden / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2019-10-30

Art 98.3
Modern Art 94
Drawing 92.9
Sketch 82.9
Painting 58.5
Canvas 57.7

Clarifai
created on 2019-10-30

art 97.1
no person 97
painting 96.9
people 96.3
wear 96.3
one 94.2
adult 92.9
furniture 91.2
room 90.6
bill 88.4
illustration 87.5
print 86.9
graffiti 85.3
artistic 84.9
dirty 84.3
chalk out 83.4
picture frame 83.3
woman 82.5
messy 81.9
abstract 81.7

Imagga
created on 2019-10-30

sketch 71.4
drawing 57.9
representation 43
billboard 30.3
vintage 28.9
book jacket 28.3
old 27.2
signboard 25.5
structure 23.3
grunge 23
jacket 22
texture 19.4
retro 18.8
paper 18.1
wrapping 16.7
ancient 16.4
black 16.2
frame 15.8
antique 15.6
aged 14.5
dirty 14.4
design 13
letter 12.8
cassette tape 12.7
covering 12.4
grungy 12.3
business 11.5
damaged 11.4
wall 11.1
art 11.1
message 11
stamp 10.7
magnetic tape 10.3
pattern 10.2
note 10.1
envelope 9.9
sign 9.8
text 9.6
graphic 9.5
blank 9.4
symbol 9.4
architecture 9.4
card 9.3
history 8.9
material 8.9
office 8.8
mail 8.6
dirt 8.6
space 8.5
money 8.5
page 8.3
city 8.3
paint 8.1
border 8.1
building 8.1
book 8
memory device 8
postmark 7.9
empty 7.7
parchment 7.7
post 7.6
decoration 7.5
device 7.4
investment 7.3
film 7.3
global 7.3
currency 7.2

Google
created on 2019-10-30

Microsoft
created on 2019-10-30

drawing 99.6
painting 99.2
sketch 99
gallery 98.7
room 97.9
scene 97.7
art 97
text 90.2
child art 82.1
black and white 50.9

Text analysis

Amazon

elar