Human Generated Data

Title

Canto XVII

Date

1964

People

Artist: Robert Rauschenberg, American, 1925-2008

Publisher: Harry N. Abrams, Inc.

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Theodore E. Stebbins Jr. in honor of Marjorie Cohn, 2010.449.17

Copyright

© Robert Rauschenberg / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2019-04-06

Human 97.9
Art 96.7
Drawing 96.7
Sketch 94.8
Person 82.8
Animal 81.1
Bird 81.1
Person 60.7
Painting 55

Clarifai
created on 2018-03-23

print 99.5
painting 98.9
illustration 98.7
art 98.7
people 94.9
wear 94.7
adult 94.2
lithograph 94.1
artistic 93.2
vintage 92.1
no person 92
one 91
paper 90.6
man 87.6
text 85
antique 84.9
water 84.2
woman 83.9
bird 83.3
chalk out 82.8

Imagga
created on 2018-03-23

beach 26
texture 25
old 23
grunge 22.1
ocean 21.8
sand 20.2
decoration 19.6
sea 18.8
tropical 18.7
dirty 18.1
frame 17.5
vintage 16.5
paper 16.5
water 16
graffito 16
envelope 15.1
holiday 15
antique 14.7
drawing 14.3
aged 13.6
snow 13.5
design 13.5
vacation 13.1
wallpaper 13
pattern 13
textured 12.3
tree 12.1
ornament 12.1
travel 12
winter 11.9
season 11.7
space 11.6
material 11.6
summer 11.6
retro 11.5
damaged 11.4
art 11.2
color 10.6
sun 10.5
paradise 10.4
outdoor 9.9
container 9.6
ancient 9.5
wave 9.5
natural 9.4
page 9.3
flower 9.2
border 9
sky 9
landscape 8.9
backgrounds 8.9
brown 8.8
surface 8.8
artistic 8.7
light 8.7
stain 8.6
day 8.6
cold 8.6
grungy 8.5
floral 8.5
sketch 8.4
backdrop 8.2
shape 8.2
symbol 8.1
close 8
ice 7.9
leaf 7.8
torn 7.7
card 7.6
worn 7.6
rusty 7.6
element 7.4
shore 7.4
style 7.4
rough 7.3
detail 7.2

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

text 98.4

Face analysis

AWS Rekognition

Age 23-38
Gender Male, 56.7%
Angry 2.3%
Happy 57.8%
Calm 22.2%
Surprised 4.6%
Sad 7.9%
Disgusted 3.2%
Confused 1.9%

AWS Rekognition

Age 60-90
Gender Female, 58.6%
Confused 2.3%
Calm 6.9%
Surprised 3.8%
Happy 26.8%
Sad 19.1%
Disgusted 38.1%
Angry 3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 82.8%
Bird 81.1%

Categories

Imagga

paintings art 99.9%
interior objects 0.1%

Captions

Microsoft
created on 2018-03-23

a close up of a book 25.2%
close up of a book 21.2%
a hand holding a book 21.1%