Human Generated Data

Title

[Elevated train tracks with cars underneath]

Date

late 1930s–1940s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.517.11

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Landscape 99.1
Nature 99.1
Outdoors 99.1
Scenery 97
Person 92.2
Human 92.2
Building 87.8
Person 82.2
Aerial View 81.2
Person 77
Architecture 73.2
Road 72.8
Urban 68.3
Housing 65.2
Vehicle 64.2
Airplane 64.2
Transportation 64.2
Aircraft 64.2
Boat 62.5
City 61.1
Town 61.1

Clarifai
created on 2019-11-19

people 98.8
no person 98.5
vehicle 96.7
group 96.1
many 95.5
war 94.8
transportation system 93.2
monochrome 92.7
military 91.8
group together 91.7
calamity 88.9
adult 88.6
architecture 88.6
street 86.4
one 86.1
home 85
wear 84.4
administration 84.2
watercraft 84
building 82.4

Imagga
created on 2019-11-19

city 35.8
travel 26.8
architecture 26.7
old 25.1
building 23.9
town 20.4
structure 19.4
landmark 18.1
buildings 17
map 16.6
landscape 16.4
urban 15.7
mine 15.4
cityscape 15.1
intersection 15.1
tourism 14.9
aerial 14.6
grunge 14.5
antique 13.9
river 13.3
ship 13.1
exterior 12.9
street 12.9
vintage 12.6
ancient 12.1
famous 12.1
water 12
historic 11.9
wall 11.7
excavation 11.5
sky 11.5
scenic 11.4
representation 11.3
texture 11.1
church 11.1
aged 10.9
tower 10.7
village 10.6
stone 10.4
sea 10.4
culture 10.3
world 10
vacation 9.8
roof 9.5
scene 9.5
construction 9.4
house 9.2
tourist 9.1
scenery 9
history 8.9
pattern 8.9
puzzle 8.8
panorama 8.6
capital 8.5
grungy 8.5
historical 8.5
hill 8.4
paper 7.9
vessel 7.9
houses 7.8
above 7.7
monument 7.5
retro 7.4
design 7.3
center 7.2
coast 7.2
home 7.2
religion 7.2
jigsaw puzzle 7.1
summer 7.1

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

ship 93.2
text 90.3
black and white 83
white 61.2
old 47.2

Feature analysis

Amazon

Person 92.2%
Airplane 64.2%
Boat 62.5%

Captions

Microsoft

an old photo of a ship 38.2%
a black and white photo of a ship 32.3%
an old photo of a truck 32.2%

Text analysis

Amazon

GIGARS

Google

UNTED
ciGARS
UNTED ciGARS