Human Generated Data

Title

[View from train]

Date

1936-1937

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.145.12

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-04

Handrail 100
Banister 100
Railing 90.8
Person 64.3
Human 64.3
Window 64
Staircase 57.3
Prison 55.9

Clarifai
created on 2021-04-04

monochrome 99.7
people 98.1
street 97
room 96.4
no person 96.3
architecture 94.3
indoors 94
window 93.6
light 93.5
black and white 92.8
analogue 92.5
grinder 91.8
urban 91.8
art 91.6
city 88.5
building 87.4
abandoned 87.2
empty 84.2
reflection 84.1
business 83.9

Imagga
created on 2021-04-04

prison 38.7
correctional institution 30.3
old 28.6
architecture 27.6
building 24.7
penal institution 22.7
city 22.4
device 20.8
wall 17.1
travel 16.9
house 16.7
metal 15.3
institution 15.1
window 14.9
cell 14.5
negative 13.9
film 13.7
factory 13.5
industrial 12.7
water 12.7
power 12.6
steel 12.5
ancient 12.1
street 12
industry 11.9
equipment 11.8
aged 11.8
door 11.5
gate 11.4
glass 11.4
construction 11.1
grunge 11.1
energy 10.9
tourism 10.7
structure 10.6
urban 10.5
technology 10.4
town 10.2
texture 9.7
plant 9.7
black 9.6
turnstile 9.6
light 9.4
stone 9.3
wood 9.2
art 9.2
vintage 9.1
history 8.9
interior 8.8
home 8.8
tube 8.6
photographic paper 8.3
elevator 8.2
historic 8.2
environment 8.2
dirty 8.1
refinery 7.9
antique 7.8
pollution 7.7
culture 7.7
brick 7.7
establishment 7.7
bridge 7.6
electricity 7.6
dark 7.5
destination 7.5
oil 7.4
sea 7
sky 7
modern 7

Google
created on 2021-04-04

Microsoft
created on 2021-04-04

Color Analysis

Feature analysis

Amazon

Person 64.3%

Categories

Imagga

interior objects 99.2%

Captions