Human Generated Data

Title

[View from train]

Date

1936-1937

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.145.17

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-04

Person 98.6
Human 98.6
Railway 92.3
Rail 92.3
Train Track 92.3
Transportation 92.3
Vehicle 74.2
Train 74.2
Path 72.3
Road 70.7
Clothing 66.6
Apparel 66.6
Tomb 66.1
Tarmac 55.8
Asphalt 55.8

Clarifai
created on 2021-04-04

train 99.8
locomotive 99.6
railway 99.5
monochrome 99.4
no person 98.3
street 97.6
winter 97.4
subway system 96.7
abandoned 95.4
urban 95
empty 93.9
transportation system 93.6
city 92.9
black and white 92.9
graffiti 92.2
architecture 92.1
window 91.9
light 91.3
old 91.2
snow 90.4

Imagga
created on 2021-04-04

submarine 99.8
submersible 78.5
warship 60.9
ship 52.3
military vehicle 39.4
vessel 32.2
car 31.6
passenger car 27.1
sky 24.3
architecture 22.7
travel 22.5
old 20.9
wheeled vehicle 20.4
building 18.4
urban 18.4
vehicle 18
industry 17.1
city 15.8
stone 15.3
tower 15.2
history 15.2
industrial 14.5
landscape 14.1
tourism 14
road 13.6
clouds 13.5
steel 12.4
wall 11.9
transportation 11.7
factory 11.6
structure 11.4
town 11.1
historic 11
train 11
transport 11
station 10.8
street 10.1
track 10.1
energy 10.1
water 10
landmark 9.9
tunnel 9.8
monument 9.3
boat 9.3
power 9.2
exterior 9.2
fortress 9.2
castle 8.9
rural 8.8
tank 8.8
grass 8.7
sea 8.6
storage 8.6
buildings 8.5
environment 8.2
container 8.2
scenery 8.1
day 7.9
chimney 7.8
ancient 7.8
modern 7.7
concrete 7.7
cityscape 7.6
electricity 7.6
house 7.5
passageway 7.5
outdoors 7.5
plant 7.5
famous 7.4
oil 7.4
way 7.4
tourist 7.3
metal 7.2
river 7.1
scenic 7

Microsoft
created on 2021-04-04

outdoor 90.9
black and white 90
vehicle 65
white 60.3
train 58.5

Feature analysis

Amazon

Person 98.6%
Train 74.2%

Captions

Microsoft

a train door 39.1%
an old photo of a train 37%

Text analysis

Amazon

1909
U. P.

Google

U. P. 1909
P.
1909
U.