Human Generated Data

Title

Untitled (auction, New Carlisle, Ohio)

Date

July 30, 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.587

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Chair 98.3
Furniture 98.3
Human 95.2
Person 95.2
Person 93.7
Face 92.5
Text 68.5
Photo 68.4
Photography 68.4
Piano 68.4
Leisure Activities 68.4
Musical Instrument 68.4
Table 67.5
Portrait 66
Sitting 65.6
Phone 64
Mobile Phone 64
Cell Phone 64
Electronics 64
Wood 63.4
Art 61.1
Drawing 58.7

Clarifai
created on 2018-03-23

people 99.5
furniture 98.7
one 98.3
no person 97.6
room 96.9
adult 96.8
seat 96.6
two 96.3
indoors 93.8
chair 92.9
wear 92.1
man 89.9
music 88.8
recreation 88.6
group 88.3
vehicle 88
home 85.8
piano 83.5
woman 83.2
leader 82.3

Imagga
created on 2018-03-23

chair 100
seat 79.9
support 51.8
device 38.9
barrow 24.8
handcart 24.7
wheeled vehicle 18.8
grand piano 18.3
business 17
piano 16.2
computer 13.6
outdoors 13.4
empty 12.9
keyboard instrument 12.6
percussion instrument 12.5
furniture 12.4
stringed instrument 12.3
office 12
people 11.7
wood 11.7
vehicle 11.3
work 11
bench 11
city 10.8
man 10.7
sitting 10.3
black 10.2
sky 10.2
laptop 10.1
relax 10.1
water 10
technology 9.6
outside 9.4
musical instrument 9.2
room 9.2
outdoor 9.2
modern 9.1
container 9
working 8.8
wooden 8.8
armchair 8.5
male 8.5
relaxation 8.4
summer 8.4
old 8.4
person 8.3
box 8.1
sun 8
metal 8
interior 8
nobody 7.8
travel 7.7
telephone 7.6
building 7.5
vacation 7.4
scenery 7.2
holiday 7.2

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

floor 92
indoor 89

Face analysis

AWS Rekognition

Age 23-38
Gender Male, 55%
Confused 45.1%
Surprised 45%
Happy 45%
Angry 45.1%
Disgusted 45%
Sad 45.1%
Calm 54.6%

Feature analysis

Amazon

Person 95.2%
Piano
Mobile Phone

Captions

Anthropic Claude

Created by claude-3-haiku-20240307 on 2024-12-31

This black and white image shows a collection of framed photographs on a table or surface. The photographs appear to be portraits of two people, likely a man and a woman, though I cannot identify them specifically. The table or surface also holds some other objects, though they are not clearly visible in the image. The background is somewhat blurred, but appears to be a wall or other surface. The overall scene has a somber, reflective tone.

Created by claude-3-opus-20240229 on 2024-12-31

The black and white image shows two framed photographs sitting on a lace-covered table or shelf. The photographs appear to be portraits of a man and woman from an earlier time period, possibly the 1940s or 1950s based on their clothing and hairstyles. The portraits are displayed in simple, dark colored frames. The background has a soft, slightly unfocused look, giving the scene an aged, nostalgic quality.

Created by claude-3-5-sonnet-20241022 on 2024-12-31

This is a black and white photograph showing a vintage interior setting. On what appears to be a dark, worn table or bench with ornate turned legs, there are two framed photographs displayed. The table has what looks like a velvet or similar textured upholstery that shows signs of wear. The scene has a somewhat melancholic feel, with floral wallpaper visible in the background and what might be a curtain or window covering visible at the edge of the frame. The lighting creates strong shadows and the overall composition has a documentary-style quality typical of mid-20th century photography.