Human Generated Data

Title

[Julia Feininger, seen from behind]

Date

1933

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.247.9

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 93.6
Person 93.6
Railway 83.5
Transportation 83.5
Rail 83.5
Train Track 83.5
Banister 78.8
Handrail 78.8
Tunnel 65.2
Dungeon 58.7
Silhouette 57.3
Pet 56.6
Cat 56.6
Mammal 56.6
Animal 56.6
Staircase 56.1

Clarifai
created on 2019-11-19

people 99.6
monochrome 98.2
adult 96.8
woman 96.4
portrait 95.8
shadow 95.6
man 95.4
dark 95.2
art 94.8
one 94.6
light 93.7
girl 93.3
nude 91.9
child 87.7
analogue 87.3
street 85.2
wear 84.6
music 84.2
silhouette 83.5
black and white 82.5

Imagga
created on 2019-11-19

step 34.4
device 29.5
support 29.2
silhouette 29
dark 23.4
black 21
sunset 19.8
water 19.3
light 17.4
man 17
people 16.7
person 15.9
sea 15.6
adult 15.5
beach 15.2
ocean 14.9
sun 14.8
body 14.4
sky 14
male 13.5
night 11.5
sport 11.5
evening 11.2
model 10.9
studio 10.6
human 10.5
wall 10.4
action 10.2
exercise 10
fitness 9.9
vacation 9.8
posing 9.8
old 9.8
one 9.7
portrait 9.7
couple 9.6
boy 9.6
architecture 9.5
hair 9.5
cadaver 9.4
shadow 9
wet 8.9
landscape 8.9
style 8.9
dance 8.9
sexy 8.8
walking 8.5
erotic 8.5
stone 8.4
summer 8.4
sand 8.3
city 8.3
alone 8.2
barrier 8.2
lifestyle 7.9
urban 7.9
motion 7.7
dancing 7.7
attractive 7.7
dusk 7.6
skin 7.6
happy 7.5
leisure 7.5
window 7.5
tourism 7.4
structure 7.3
building 7.2
coast 7.2
history 7.2
cool 7.1
love 7.1
cell 7.1
modern 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

text 95.9
cave 70.3
white 65.2
black and white 56
dark 55.3

Feature analysis

Amazon

Person 93.6%
Cat 56.6%
