Human Generated Data

Title

Untitled (New York City)

Date

c. 1930

People

Artist: Joseph Kaplan, American, 1900-1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, by exchange, P2000.24

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Landscape 97
Nature 97
Outdoors 97
Scenery 88.9
Bird 86.6
Animal 86.6
Person 86.4
Human 86.4
Water 86.2
Person 85
Bird 84.4
Person 83.9
Bird 83.8
Building 82.5
Person 76.9
Bird 75.9
Person 74.9
Aerial View 74.1
Person 73.7
Airplane 70
Transportation 70
Vehicle 70
Aircraft 70
Bird 68.8
Architecture 68.2
Person 66.6
River 65.7
Waterfront 65.6
Office Building 63.8
Text 58.5
Person 47.3
Person 43.1

Imagga
created on 2021-12-14

ship 61.9
liner 57
passenger ship 42.6
architecture 42.4
vessel 39.5
tourism 33.8
travel 33.8
warship 32.6
building 31.8
obelisk 30
rule 29.3
column 28.4
structure 28.2
sky 28.1
measuring instrument 27
city 26.6
historic 26.6
aircraft carrier 26.6
history 25.9
old 24.4
measuring stick 23.6
military vehicle 22.9
landmark 21.7
ancient 21.6
tower 21.5
stone 19.5
instrument 18.8
town 17.6
craft 17.2
historical 16.9
monument 16.8
church 16.7
tourist 16.3
sundial 15.5
wall 15.4
religion 15.2
vacation 14.7
culture 14.5
exterior 13.8
medieval 13.4
buildings 13.2
construction 12.8
timepiece 12.5
vehicle 12.4
destination 12.2
castle 11.7
battleship 11
house 11
landscape 10.4
famous 10.2
sea 10.2
holiday 10
traditional 10
cathedral 9.8
port 9.6
statue 9.5
water 9.3
window 9.2
outdoors 9
night 8.9
harbor 8.7
east 8.4
transportation 8.1
detail 8
temple 7.8
antique 7.8
sightseeing 7.8
st 7.8
arch 7.8
place 7.4
device 7.4
art 7.2
palace 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 92.8
ship 88.5

Feature analysis

Amazon

Bird 86.6%
Person 86.4%
Airplane 70%

Captions

Microsoft

a close up of a box 54.8%
close up of a box 48.2%
a box on a table 31.8%

Text analysis

Amazon

AMERICAN
AHMILAR
SHMILER