Human Generated Data

Title

Pews

Date

1970

People

Artist: Edward Ruscha, American, born 1937

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Margaret Fisher Fund, M26345

Copyright

© Ed Ruscha

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Rug 96.3
Text 96.2
Art 68.5
Paper 68.1
Advertisement 67.8
Poster 63.7
Alphabet 60.7
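
A note on provenance: the page lists these scores but does not say how they were produced. Output of this shape (a label name plus a 0-100 confidence) is what AWS Rekognition label detection returns, so the Python sketch below shows one plausible way such tags could be generated. The file name "pews.jpg", the MaxLabels and MinConfidence values, and the configured AWS credentials are assumptions, not details from this page.

# Hypothetical sketch: image labels via AWS Rekognition (boto3).
# "pews.jpg" is a placeholder local copy of the image; AWS credentials
# are assumed to be configured in the environment.
import boto3

rekognition = boto3.client("rekognition")

with open("pews.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=60,  # assumption; the lowest score listed above is 60.7
)

# Each label carries a name and a 0-100 confidence, the form shown above
# (e.g. "Rug 96.3", "Text 96.2").
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')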

Clarifai
created on 2018-03-16

cardboard 99.6
paper 99.2
post 98.3
page 97.4
delivery 97.1
blank 94.8
sheet (pane) 94.8
carton 94.6
note 94.1
texture 93.5
reminder 93.1
card 92.9
box 91.7
bundle 90.9
message 90.6
packet 90.3
display 90
retro 88.8
memo 88.8
corrugated 88.6
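
As with the Amazon tags, the page does not state how the Clarifai concepts were collected. Around the time these tags were created (2018), Clarifai's v2 REST API returned concepts from its general model with a 0-1 probability, which scaled by 100 would match the scores above. The sketch below is an assumption-laden illustration: the API key placeholder, the model id, and the image URL are not taken from this page.

# Hypothetical sketch: concept tagging with Clarifai's v2 REST API.
# The model id below is the historical "general" model; it, the API key
# placeholder, and the image URL are assumptions.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"
GENERAL_MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/pews.jpg"}}}]},
)
resp.raise_for_status()

# Concepts come back with a probability in [0, 1]; multiplied by 100 they
# take the form shown above (e.g. "cardboard 99.6").
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')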

Imagga
created on 2018-03-16

board 57.6
doormat 49
paper 41.6
mat 39.8
insulating material 31.7
grunge 31.6
texture 31.3
blank 30.9
container 30.6
carton 30
floor cover 29.4
old 27.2
brown 26.6
box 25.5
envelope 24.7
retro 24.6
empty 24.1
covering 24.1
building material 23.8
vintage 22.4
antique 21.7
frame 20.9
pattern 20.6
surface 20.3
cardboard 20.2
textured 19.3
message 19.2
dirty 19
page 18.6
design 18.6
parchment 18.3
rough 18.3
material 17.9
ancient 17.3
aged 17.2
copy 16.8
note 16.6
space 16.3
backdrop 14.9
detail 14.5
card 14.5
post 13.4
wallpaper 13
document 13
wall 12.9
close 12.6
wood 12.5
text 12.2
damaged 11.5
grungy 11.4
sheet 11.3
closeup 10.8
torn 10.7
beige 10.6
yellow 10.6
backgrounds 10.6
sign 10.6
weathered 10.5
office 10.5
letter 10.1
decoration 10
binding 9.5
nobody 9.3
business 9.1
art 9.1
element 9.1
border 9.1
object 8.8
crumpled 8.8
sand 8.7
notice 8.7
education 8.7
write 8.5
horizontal 8.4
grain 8.3
copy space 8.1
package 8.1
symbol 8.1
shipping 7.8
obsolete 7.7
mail 7.7
rusty 7.6
old fashioned 7.6
communication 7.6
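
The Imagga scores likewise look like the 0-100 confidences returned by Imagga's v2 tagging endpoint. A minimal sketch follows; the API key/secret pair and the hosted image URL are placeholders, not details from this page.

# Hypothetical sketch: tagging with the Imagga v2 REST API.
# Credentials and image URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/pews.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
resp.raise_for_status()

# Imagga reports confidence on a 0-100 scale, matching the scores above
# (e.g. "board 57.6", "doormat 49").
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')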

Google
created on 2018-03-16

text 89.9
font 83.1
paper 63.2
calligraphy 52.4
art 50.1
paper product 50
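
Google's values are consistent with label detection in the Cloud Vision API, which reports a 0-1 score per label. A minimal sketch using the google-cloud-vision client (version 2 or later); the local file name and the credentials setup are assumptions.

# Hypothetical sketch: label detection with the Google Cloud Vision API.
# Assumes credentials are provided via GOOGLE_APPLICATION_CREDENTIALS and
# that "pews.jpg" is a placeholder local copy of the image.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("pews.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are in [0, 1]; scaled by 100 they match the values above
# (e.g. "text 89.9", "font 83.1").
for annotation in response.label_annotations:
    print(f"{annotation.description} {annotation.score * 100:.1f}")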

Microsoft
created on 2018-03-16

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 57-77
Gender Female, 89.6%
Surprised 3.9%
Happy 6.5%
Angry 10.1%
Sad 22.1%
Calm 52.8%
Confused 3.6%
Disgusted 1.1%
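
The age range, gender estimate, and emotion scores above have the shape of AWS Rekognition's face analysis output. A minimal sketch with boto3; the file name and credential setup are assumptions, and the detector may or may not find a face in a given reproduction of the print.

# Hypothetical sketch: face analysis via AWS Rekognition (boto3).
# "pews.jpg" is a placeholder local copy of the image.
import boto3

rekognition = boto3.client("rekognition")

with open("pews.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age range, gender and emotions
    )

# Each detected face reports an estimated age range, a gender guess, and a
# set of emotion confidences, which is the layout of the data listed above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')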

Feature analysis

Amazon

Rug 96.3%

Categories

Imagga

text visuals 99.4%
food drinks 0.3%
paintings art 0.2%
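
Category labels such as "text visuals" and "paintings art" resemble the output of Imagga's v2 categorization endpoint. The sketch below is a heavily hedged illustration: the categorizer id "general_v3", the credentials, and the image URL are all assumptions.

# Hypothetical sketch: scene categorization with the Imagga v2 REST API.
# Categorizer id, credentials, and image URL are assumptions.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/categories/general_v3",
    params={"image_url": "https://example.org/pews.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
resp.raise_for_status()

# Categories come back with a 0-100 confidence, e.g. "text visuals 99.4".
for category in resp.json()["result"]["categories"]:
    print(f'{category["name"]["en"]} {category["confidence"]:.1f}%')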

Captions

Microsoft
created on 2018-03-16

a close up of a box 56.7%
close up of a box 48.6%
a screenshot of a video game box 7.9%
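
Captions of this form (a short description plus a confidence) match what the Azure Computer Vision service returns from its describe operation. A hedged sketch using the Python SDK; the endpoint, key, and image URL are placeholders, not details from this page.

# Hypothetical sketch: image captions with the Azure Computer Vision SDK.
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

analysis = client.describe_image("https://example.org/pews.jpg")

# Each caption candidate carries a confidence in [0, 1]; scaled by 100 it
# matches the figures above (e.g. "a close up of a box 56.7%").
for caption in analysis.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")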

Text analysis

Amazon

ctus
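
The single fragment above ("ctus") is the kind of partial reading that optical text detection can produce on stylized artwork. A minimal sketch of AWS Rekognition text detection with boto3; the file name and credential setup are assumptions.

# Hypothetical sketch: text detection via AWS Rekognition (boto3).
# "pews.jpg" is a placeholder local copy of the image.
import boto3

rekognition = boto3.client("rekognition")

with open("pews.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Stylized lettering often yields partial or garbled detections,
# e.g. a fragment like "ctus".
for detection in response["TextDetections"]:
    print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')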