Human Generated Data

Title

[Train]

Date

Late 1930s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.566.33

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-20

Person 98.5
Human 98.5
Transportation 93.8
Train 93.8
Vehicle 93.8
Nature 83.8
Outdoors 81.9
Weather 77
People 75.2
Photography 55.5
Photo 55.5

Clarifai
created on 2019-11-20

monochrome 99
people 98.6
man 96.1
adult 95.1
one 93.4
street 90.5
light 90.2
woman 88.4
child 86.9
black and white 85.8
two 85.1
portrait 85
art 84.8
sky 83.7
outdoors 83.1
travel 82.9
girl 82.9
nature 82.7
landscape 82.3
city 82

Imagga
created on 2019-11-20

wreckage 36.6
part 29.1
sand 24.1
sky 23.2
fountain 22.2
old 21.6
structure 19.3
landscape 17.8
ship 17.7
water 17.3
stone 16.5
vessel 16.2
shipwreck 16
texture 14.6
sea 14.1
travel 13.4
cloud 12.9
vehicle 12.5
ancient 12.1
dirty 11.7
dark 11.7
sunset 11.7
ocean 11.6
tank 11.4
soil 11.4
cemetery 11.1
grunge 11.1
environment 10.7
river 10.7
vacation 10.6
rock 10.4
sunrise 10.3
danger 10
tree 10
earth 9.8
beach 9.4
light 9.4
desert 9.3
black 9
coast 9
crater 9
clouds 8.5
power 8.4
park 8.4
vintage 8.3
pattern 8.2
scenery 8.1
sun 8
architecture 7.9
textured 7.9
forest 7.8
wall 7.8
outdoor 7.6
serene 7.5
smoke 7.4
tourism 7.4
natural 7.4
industrial 7.3
natural depression 7.2
holiday 7.2
material 7.1
mountain 7.1

Google
created on 2019-11-20

Microsoft
created on 2019-11-20

text 98.2
person 89.7
outdoor 87.1
black and white 72.5

Color Analysis

Feature Analysis

Amazon

Person 98.5%
Train 93.8%

Captions