Human Generated Data

Title

[Julia Feininger on train observation platform]

Date

1936–1937

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.212.12

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Railing 92.9
Human 88.3
Person 80.1
Furniture 79.6
Handrail 79.2
Banister 79.2
Leisure Activities 72.8
Performer 71.7
Clothing 68.8
Apparel 68.8
Dance 55.1
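
Each Amazon tag above pairs a label with a confidence score given in percent. As a hedged illustration only, the following minimal sketch assumes these labels come from Amazon Rekognition's DetectLabels API called through boto3; the file name, region, and confidence threshold are placeholders, not details taken from this record:

    import boto3

    # Placeholder region and image path; Rekognition returns labels with confidences
    client = boto3.client("rekognition", region_name="us-east-1")
    with open("feininger_observation_platform.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)
    for label in response["Labels"]:
        # Prints, for example, "Railing 92.9", matching the label/confidence format above
        print(f"{label['Name']} {label['Confidence']:.1f}")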

Clarifai
created on 2019-11-19

people 99.1
adult 97.3
one 95.9
man 92
wear 89.7
woman 81.4
street 75.7
fence 75.4
portrait 71.8
leader 71.5
building 70.2
military 70
art 69.1
side view 68.6
outfit 67.3
old 66.9
profile 66.3
retro 66
two 65.4
graffiti 65.3

Imagga
created on 2019-11-19

statue 41.1
sculpture 24.3
negative 24.2
person 20.8
film 20
man 18.1
monument 17.7
sky 16.6
stone 15.4
performer 15.4
art 15.4
outdoor 15.3
photographic paper 14.7
history 13.4
picket fence 13.3
architecture 13.3
cemetery 12.9
lifestyle 12.3
people 12.3
adult 11.8
male 11.3
action 11.1
portrait 11
teenager 10.9
summer 10.9
marble 10.9
happy 10.6
dancer 10.5
fun 10.5
old 10.4
fence 10.4
style 10.4
culture 10.3
city 10
face 9.9
barrier 9.9
landmark 9.9
sport 9.9
religion 9.9
photographic equipment 9.8
one 9.7
outdoors 9.7
jumping 9.7
jump 9.6
entertainer 9.2
historic 9.2
freedom 9.1
active 9
sexy 8.8
body 8.8
black 8.6
historical 8.5
travel 8.4
attractive 8.4
leisure 8.3
fashion 8.3
human 8.2
dress 8.1
posing 8
building 8
hair 7.9
women 7.9
jeans 7.6
structure 7.6
memorial 7.6
park 7.5
joy 7.5
playing 7.3
activity 7.2

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

black and white 90.4
text 82.5
clothing 56.8
person 51.1

Feature analysis

Amazon

Person 80.1%
