Human Generated Data

Title

[Train]

Date

1950s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.559.18

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Nature 94.7
Outdoors 91.3
Building 88.6
Human 85.8
Person 85.8
Ground 82.6
Bunker 80.1
Weather 65.6
Meal 65.1
Food 65.1
Tower 64.3
Architecture 64.3
Housing 64.1
Wall 63
Spire 60.3
Steeple 60.3
People 60.3
Plant 59.1
Tree 59.1
Urban 57.3

Clarifai
created on 2019-11-19

people 98.9
war 95.6
military 95.3
calamity 91.5
no person 91.2
soldier 89
monochrome 87.9
adult 87.8
waste 87.8
street 87.7
vehicle 87.5
group together 85.4
building 84.3
road 83.8
abandoned 80.9
wear 80.6
group 80.3
transportation system 79.2
home 78.9
man 78.5

Imagga
created on 2019-11-19

fortress 64.9
sky 39
landscape 31.3
wall 30.8
brick 28.8
structure 26.4
building 26.3
travel 25.4
tourism 22.3
architecture 22.2
old 21.6
building material 21.1
stone 20.9
tower 19.8
history 19.7
hut 19.7
desert 18.7
sand 16.9
castle 16.6
rock 16.5
mountain 16.4
clouds 16.1
ancient 14.7
construction 14.6
shelter 14.2
hill 13.1
ruins 12.7
landmark 12.7
town 12.1
coast 10.8
vacation 10.7
house 10.4
fort 9.8
medieval 9.6
ascent 9.6
cloud 9.5
grave 9.5
beach 9.3
snow 9.2
historic 9.2
industrial 9.1
environment 9.1
scenery 9
outdoors 9
water 8.7
high 8.7
stones 8.5
winter 8.5
electricity 8.5
outdoor 8.4
famous 8.4
summer 8.4
trees 8
rural 7.9
sea 7.8
sunny 7.8
canyon 7.8
industry 7.7
rampart 7.6
energy 7.6
power 7.6
rocks 7.5
wind 7.5
tourist 7.5
city 7.5
place 7.5
park 7.4
mountains 7.4
metal 7.3
road 7.2
grass 7.1
scenic 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

outdoor 96
black and white 78.2
text 76.4
white 66.8
black 66

Feature analysis

Amazon

Person 85.8%

Captions

Microsoft

a black and white photo of a building 66.8%
a black and white photo of a person 54%
a black and white photo of a field 53.9%

Text analysis

Google

HAVEN
HAVEN