Human Generated Data

Title

[Train]

Date

Unknown

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.504.19

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-11

Clarifai
created on 2019-11-11

photograph 100
negative 99.9
filmstrip 99.9
movie 99.8
slide 99.4
exposed 99.3
cinematography 99.2
margin 98.9
desktop 98.3
moment 98
emulsion 97.8
old 97.8
noisy 97.7
collage 97
picture frame 96.5
bobbin 96.4
analogue 96.3
art 96.2
dark 96.1
monochrome 95.2

Imagga
created on 2019-11-11

negative 100
film 100
photographic paper 84.2
photographic equipment 56.1
old 20.9
frame 20.8
strip 19.4
architecture 18.3
retro 17.2
border 17.2
black 16.8
city 16.6
vintage 16.5
movie 16.5
grunge 16.2
camera 15.7
slide 15.6
filmstrip 14.8
art 13.7
design 13.5
tower 13.4
silhouette 13.3
photographic 12.8
texture 12.5
building 12.2
digital 12.2
entertainment 12
cinema 11.7
graphic 11.7
tourism 11.6
damaged 11.5
urban 11.4
travel 11.3
35mm 10.8
dirty 10.8
roll 10.4
cityscape 10.4
antique 10.4
blank 10.3
space 10.1
equipment 9.8
photograph 9.6
sky 9.6
skyline 9.5
banner 9.2
business 9.1
element 9.1
technology 8.9
noise 8.8
effect 8.2
rough 8.2
computer 8
reel 7.9
text 7.9
ship 7.8
video 7.7
tape 7.7
liner 7.6
buildings 7.6
pattern 7.5
screen 7.5
aged 7.2
color 7.2
structure 7.2

Google
created on 2019-11-11

Microsoft
created on 2019-11-11

text 99.7
ship 88.8
black and white 60.3

Color Analysis

Feature analysis

Amazon

Tank 55%

Categories

Imagga

interior objects 98.1%

Captions

Text analysis

Amazon

13

Google

112