Human Generated Data

Title

[Julia Feininger]

Date

1942-1943

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.575.24

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-20

Nature 88
Human 86.8
Person 86.8
Clothing 86.2
Apparel 86.2
Face 80.6
Outdoors 75.2
Finger 66.6
Smoke 59.2
Food 58.5
Meal 58.5
Coat 57.1
Overcoat 57.1
Photography 55.8
Photo 55.8

Clarifai
created on 2019-11-20

people 99.8
adult 98.4
man 98.1
monochrome 97.9
one 97.8
portrait 96.2
landscape 94.1
beach 90.3
wear 88.8
water 87.2
light 86.3
two 86.1
vehicle 86
side view 84.4
war 83.1
leader 81.1
sea 79.7
storm 79.6
administration 78
seashore 77.9

Imagga
created on 2019-11-20

sand 24.7
sky 22.4
clouds 20.3
black 19.4
light 17.4
dark 16.7
landscape 16.4
texture 16
grunge 14.5
earth 14.4
cloud 13.8
night 13.3
soil 12.7
dirty 12.7
space 12.4
sunset 11.7
old 11.1
beach 11.1
sea 10.9
sun 10.7
water 10.7
mountain 10.6
pattern 10.3
color 10
travel 9.9
ocean 9.6
art 9.2
wallpaper 9.2
laptop 9
scenery 9
horizon 9
rock 8.7
dramatic 8.7
close 8.6
grungy 8.5
film 8.5
outdoor 8.4
smoke 8.4
vintage 8.3
outdoors 8.2
backgrounds 8.1
surface 7.9
textured 7.9
wall 7.7
planet 7.7
dusk 7.6
portable computer 7.6
windshield 7.6
power 7.6
sunrise 7.5
silhouette 7.4
closeup 7.4
coast 7.2
device 7.1
reflection 7.1
summer 7.1
desert 7.1
season 7

Google
created on 2019-11-20

Microsoft
created on 2019-11-20

person 95.1
text 91
monochrome 79.1
fog 76.3
black and white 75

Color Analysis

Feature analysis

Amazon

Person
Person 86.8%

Captions

Microsoft
created by unknown on 2019-11-20

an old photo of a man 36.5%
old photo of a man 31.6%
a man standing in front of a field 30.7%

Clarifai
created by general-english-image-caption-blip on 2025-05-03

a photograph of a man in a shirt and tie dye dye dye dye dye dye dye dye dye -100%

Google Gemini

Created by gemini-2.0-flash on 2025-05-11

Here is a description of the image:

The image appears to be an inverted black-and-white photograph. It shows what seems to be an elderly woman on the left side of the frame. She is wearing a patterned dress or shirt. It is difficult to discern details in the inverted image, but she appears to be holding something in front of her with both hands. The background seems to feature cloudy or textured patterns, possibly trees or foliage, but again, the inversion makes specific identification challenging. The overall tone is grainy and high in contrast, with prominent dark and light areas creating a stark visual.

Created by gemini-2.0-flash-lite on 2025-05-11

Here's a description of the image:

The image appears to be a vintage or manipulated photograph, likely a negative or a reversed color image. It depicts a person in profile, likely an older woman, standing outdoors. The figure is somewhat indistinct, but we can make out features. She is wearing a patterned blouse or shirt with a collar. She is also holding an object, possibly a bag or container. The background suggests a cloudy sky and possible foliage. The overall tone is dark and moody.