Human Generated Data

Title

Untitled (South Street pier, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4239

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-01-08

Clarifai
created on 2023-10-25

art 98.5
retro 98.2
vintage 98
movie 97.7
old 97.3
filmstrip 96.5
people 96.3
sepia 95.7
sepia pigment 94.9
illustration 94.7
antique 94.4
wear 94.1
man 92.2
picture frame 91.4
adult 90.6
dirty 90.4
slide 90.1
group 88.6
ancient 87.3
artistic 85.5

Imagga
created on 2022-01-08

hole 100
architecture 46.1
old 44.6
ancient 41.5
stone 32.1
building 31.5
brick 30.2
landmark 28.9
history 27.7
fortress 27.7
historic 26.6
tourism 26.4
travel 25.4
wall 24
city 22.5
famous 20.5
sky 19.1
antique 19.1
culture 18.8
tower 17.9
castle 17.4
monument 16.8
ruins 16.6
historical 16
ruin 15.6
construction 15.4
grunge 15.3
roman 14.9
vintage 14.9
structure 14.8
texture 14.6
tourist 14.5
town 13.9
religion 13.4
retro 13.1
archeology 12.8
brown 12.5
vacation 12.3
aged 11.8
rock 11.3
place 11.2
art 11.2
church 11.1
negative 11
building material 10.3
fort 9.8
film 9.8
minaret 9.7
pattern 9.6
column 9.4
window 9.2
material 9.1
landscape 8.9
textured 8.8
past 8.7
sand 8.6
stones 8.5
dwelling 8.5
clouds 8.5
frame 8.3
house 8.3
rough 8.2
dirty 8.1
detail 8
paper 7.8
arch 7.8
heritage 7.7
medieval 7.7
weathered 7.6
cliff dwelling 7.6
destination 7.5
temple 7.2
palace 7.1
day 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 92.7
indoor 85.5
drawing 64.3
decorated 34.9
stone 4.1
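
The tag lists above are flat label/confidence pairs (for example, "art 98.5" means the label "art" at 98.5% confidence) returned by each vision service. As a rough illustration of how such tags are produced, the sketch below shows a label-detection request against Amazon Rekognition, one of the services named in this record. It is a minimal sketch only: the local filename, MaxLabels, and MinConfidence values are assumptions, not values taken from the record, and the museum's actual pipeline is not documented here.

```python
# Minimal sketch: request image labels from Amazon Rekognition with boto3.
# Assumes AWS credentials are configured and that the photograph has been
# saved locally as "shahn_pier.jpg" (a hypothetical filename).
import boto3

rekognition = boto3.client("rekognition")

with open("shahn_pier.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap the number of returned labels (assumed value)
    MinConfidence=80.0,  # drop low-confidence guesses (assumed value)
)

# Print label/confidence pairs in the same "name score" form used above.
for label in response["Labels"]:
    print(f'{label["Name"].lower()} {label["Confidence"]:.1f}')
```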

Color analysis

Face analysis

Amazon

AWS Rekognition

Age 50-58
Gender Male, 98.3%
Calm 51%
Sad 34%
Angry 4.6%
Surprised 3%
Disgusted 2.4%
Fear 2.3%
Confused 1.9%
Happy 1%
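
The age range, gender, and emotion estimates above match the shape of Amazon Rekognition's face-detection output. As a hedged illustration, a call along the following lines returns that structure; the local filename is a placeholder, and this record does not say how the museum's pipeline actually invokes the API.

```python
# Minimal sketch: face attributes (age range, gender, emotions) via
# Amazon Rekognition's DetectFaces. The local filename is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("shahn_pier.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {"Low": 50, "High": 58}
    gender = face["Gender"]   # e.g. {"Value": "Male", "Confidence": 98.3}
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)

    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in emotions:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```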

Feature analysis

Amazon

Poster 96.9%

Categories

Imagga

paintings art 99.9%

Captions

Text analysis

Amazon

ONEHUE

Google

UNEHUE ALL
UNEHUE
ALL
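
The strings under Text analysis ("ONEHUE", "UNEHUE ALL") read as OCR-style detections of lettering visible in the photograph. For context, the sketch below shows how comparable detections could be requested from Amazon Rekognition's DetectText; the filename is again a placeholder, and the pipeline that produced this record's values is not documented here.

```python
# Minimal sketch: detect printed text in the image with Amazon Rekognition.
# Only LINE-level detections are printed, mirroring the short strings above.
import boto3

rekognition = boto3.client("rekognition")

with open("shahn_pier.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')
```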