Human Generated Data

Title

Untitled (waterfront and docks, New Orleans)

Date

c. 1935

People

Artist: C. Bennette Moore, American (1879–1939)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22091

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Human 98.7
Person 98.7
Building 93.9
Architecture 92.7
Asphalt 92.3
Tarmac 92.3
Road 92
Urban 91.5
Town 88.8
City 88.8
Downtown 85.8
Nature 77.7
Person 76.9
Intersection 76.6
Metropolis 74.5
Outdoors 71.4
Landscape 63.1
Water 59.6
Waterfront 59.6
Office Building 59.6
Tower 58.6
Spire 58.6
Steeple 58.6
Airport 56.4
Airfield 56.4

Imagga
created on 2022-03-11

track 100
submarine 36.2
city 34.9
submersible 29
travel 28.2
transportation 27.8
urban 25.4
bridge 25.2
road 23.5
architecture 23.5
sky 23
warship 22.4
traffic 21.9
water 21.4
ship 21.1
transport 20.1
landscape 19.4
night 18.7
sea 17.2
structure 17.1
car 17
river 16.9
building 16
ocean 15.8
military vehicle 15
station 14.5
cityscape 14.2
town 13.9
tower 13.4
vehicle 12.9
marina 12.5
train 12.5
scene 12.1
line 12
vessel 12
street 12
railway 11.8
highway 11.6
tourism 11.6
bay 11.3
industry 11.1
speed 11
sunset 10.8
harbor 10.6
journey 10.4
cloud 10.3
winter 10.2
waterfront 9.9
cars 9.8
pier 9.6
downtown 9.6
dusk 9.5
skyline 9.5
buildings 9.5
construction 9.4
clouds 9.3
beach 9.3
old 9.1
landmark 9
metal 8.9
scenic 8.8
boats 8.7
way 8.7
support 8.3
industrial 8.2
tourist 8.2
light 8
steel 8
railroad 7.9
summer 7.7
motion 7.7
outdoor 7.7
england 7.6
capital 7.6
electric 7.5
plant 7.5
lights 7.4
lake 7.3
reflection 7.3
sun 7.3
black 7.2
history 7.2

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

ship 96.6
text 90.9
black and white 88.9
outdoor 87
white 84
black 70.5
city 63.2
sky 50.2

Feature analysis

Amazon

Person 98.7%

Captions

Microsoft

a train is parked on the side of a building 52.3%
a person sitting at a train station 46%
a train that is parked on the side of a building 44.5%