Human Generated Data

Title

Looking North from the Sherburne Bldg.

Date

April 1904

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the Massachusetts Bay Transportation Authority, Boston Transit Collection, 5.2002.636

Machine Generated Data

Tags

Amazon
created on 2022-06-04

Building 93.9
Architecture 92
Nature 83.4
Outdoors 80.2
Urban 79.9
Road 78.4
Handrail 78
Banister 78
Tarmac 76.3
Asphalt 76.3
Intersection 75.6
Spire 75.4
Steeple 75.4
Tower 75.4
Person 69.8
Human 69.8
City 69.3
Town 69.3
Landscape 68.8
Office Building 65.1
Waterfront 64.9
Water 64.9
Pier 63.9
Dock 63.9
Port 63.9
Train 63.4
Transportation 63.4
Vehicle 63.4
Person 52
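
The label/confidence pairs above are typical output from an automated image-labeling service. The record does not state which API produced them, so the following is only a minimal sketch assuming Amazon Rekognition's DetectLabels, with "photo.jpg" standing in for a local copy of the photograph and AWS credentials already configured.

```python
# Minimal sketch: label/confidence pairs via Amazon Rekognition DetectLabels.
# Assumes configured AWS credentials; "photo.jpg" is a placeholder file name.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # drop low-confidence labels, similar to the cutoff seen above
)

for label in response["Labels"]:
    # Prints pairs such as "Building 93.9"
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```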

Imagga
created on 2022-06-04

city 33.3
ship 32.1
architecture 31.1
urban 25.4
river 24.9
water 23.4
structure 22.8
cityscape 22.7
building 22.3
sky 21.1
bridge 19.6
town 19.5
submarine 18.7
travel 18.3
vessel 18.1
tower 17.9
skyline 17.1
warship 16.2
landscape 15.6
construction 15.4
sea 14.9
liner 14.8
submersible 14.2
tourism 14
landmark 13.6
harbor 13.5
port 12.5
night 12.4
industry 12
passenger ship 11.9
old 11.9
boat 11.8
waterfront 11.7
panorama 11.4
buildings 11.4
famous 11.2
industrial 10.9
center 10.4
military vehicle 10.1
summer 9.7
downtown 9.6
scene 9.5
winter 9.4
ocean 9.4
craft 8.8
support 8.8
device 8.8
cloud 8.6
cold 8.6
traffic 8.6
pier 8.5
exterior 8.3
steel 8.2
reflection 8.1
transportation 8.1
boats 7.8
pollution 7.7
skyscraper 7.7
power 7.6
church 7.4
new 7.3
road 7.2
negative 7.2
crane 7.2
deck 7.1
modern 7
marina 7
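
Tags of this kind can be requested from Imagga's tagging service. The record does not include the request that produced them; the sketch below assumes the documented v2 /tags REST endpoint, with the API key, secret, and image URL all placeholders, and the response structure assumed to follow Imagga's published format.

```python
# Minimal sketch: tag/confidence pairs from Imagga's v2 tagging endpoint.
# API_KEY, API_SECRET, and IMAGE_URL are placeholders.
import requests

API_KEY = "<imagga-api-key>"
API_SECRET = "<imagga-api-secret>"
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP Basic auth with key and secret
)
response.raise_for_status()

for entry in response.json()["result"]["tags"]:
    # Prints pairs such as "city 33.3"
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```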

Google
created on 2022-06-04

Microsoft
created on 2022-06-04

black and white 95.9
ship 87.2
text 79.8
building 68.4
city 50.7

Color Analysis

Feature analysis

Amazon

Person 69.8%
Train 63.4%

Captions

Microsoft
created on 2022-06-04

a large ship in the background 45.5%
a ship in the rain 40%
a boat in the rain 36.4%
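
Ranked captions like these can be produced by Azure's Computer Vision "describe" operation. The record does not give the request details, so this is a hedged sketch against the v3.2 REST endpoint; the endpoint host, subscription key, and image URL are placeholders.

```python
# Minimal sketch: ranked image captions from the Azure Computer Vision
# v3.2 "describe" endpoint. ENDPOINT, KEY, and IMAGE_URL are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<subscription-key>"
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    params={"maxCandidates": 3},  # request several candidate captions
    json={"url": IMAGE_URL},
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    # Prints lines such as "a large ship in the background 45.5%"
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```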

Text analysis

Amazon

NAVIR
APR.1904.
dreamstime
written
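
Detected text fragments such as "APR.1904." can be extracted with an OCR-style text-detection call. The record does not name the exact API, so the sketch below assumes Amazon Rekognition's DetectText on a local copy of the photograph ("photo.jpg" is a placeholder).

```python
# Minimal sketch: text detection via Amazon Rekognition DetectText.
# Assumes configured AWS credentials; "photo.jpg" is a placeholder file name.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip word-level duplicates of each line
        print(detection["DetectedText"])  # e.g. "APR.1904."
```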

Google

пррі904. Tit Seleas с. лimuf ON ONIYOO ATED NIAVN AUDI
NIAVN
пррі904
.
Tit
Seleas
с
лimuf
ON
ONIYOO
ATED
AUDI