Human Generated Data

Title

Untitled (waterfront and docks, New Orleans)

Date

c. 1935

People

Artist: C. Bennette Moore, American, 1879–1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22094

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 96.9
Human 96.9
Person 96.1
Person 95.7
Pedestrian 93.6
Person 93.5
Road 91.3
Person 90.8
Person 89.4
Water 87.7
Handrail 87.2
Banister 87.2
Tarmac 86.9
Asphalt 86.9
Person 85.8
Person 85.7
Waterfront 84.8
Person 77.8
Pier 77.8
Port 77.8
Dock 77.8
Outdoors 70.6
Person 69.4
Person 68
Person 66
Vehicle 64.8
Transportation 64.8
Person 64.2
Person 63.2
Person 62.3
People 62
Person 59.7
Intersection 59.6
Building 59.1
Nature 58.2
Railing 55.2

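The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's detect_labels API. The following is a minimal sketch of how such output could be produced; the S3 bucket, object key, and region are placeholder assumptions, not the museum's actual storage.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.22094.jpg"}},  # placeholder bucket/key
    MaxLabels=50,
    MinConfidence=55,
)

for label in response["Labels"]:
    # Prints lines like "Person 96.9", matching the tag list above
    print(f"{label['Name']} {label['Confidence']:.1f}")
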
Clarifai
created on 2023-10-22

people 99.8
many 99.1
group 98.3
group together 97.5
vehicle 96.2
street 96
adult 94.5
city 93.9
monochrome 93.8
no person 93.5
man 93.3
transportation system 92.5
crowd 91.4
administration 89.2
railway 89
architecture 88
building 87.8
travel 86.6
light 84.3
theater 83.1

Imagga
created on 2022-03-11

city 57.4
architecture 42.4
cityscape 37.9
waterfront 37.1
night 35.5
skyline 35.2
urban 34.1
stage 31.6
building 30.7
buildings 28.4
landmark 28
travel 27.5
downtown 26
pier 25.3
tower 25.1
panorama 24.8
town 24.1
platform 24
river 23.1
tourism 22.3
sky 21.8
famous 21.4
skyscraper 21.2
marina 18.5
tourist 18.3
light 18.1
water 17.4
business district 16
structure 15.8
bridge 15.7
landscape 15.6
scene 15.6
harbor 15.4
support 14.7
ship 14.4
bay 14.4
panoramic 14.4
sea 14.1
old 13.9
lights 13.9
aerial 13.6
port 13.5
center 12.8
boats 12.6
liner 12.3
boat 12.2
church 12
street 12
device 12
modern 11.9
dark 11.7
dusk 11.5
illuminated 11.4
business 10.9
district 10.7
office 10.6
capital 10.4
monument 10.3
evening 10.3
construction 10.3
passenger ship 9.8
history 9.8
metropolitan 9.8
skyscrapers 9.8
reflection 9.8
tall 9.4
destination 9.4
ocean 9.3
new 8.9
culture 8.6
historic 8.3
vessel 7.9
dome 7.8
high 7.8
twilight 7.8
pacific 7.7
roof 7.7
england 7.6
clouds 7.6
track 7.5
square 7.2
horizon 7.2
sunset 7.2
coast 7.2
transportation 7.2
scenic 7

Microsoft
created on 2022-03-11

ship 98.9
text 98.7
black and white 96.7
outdoor 91.9
watercraft 87.4
boat 87.1
white 84.1
monochrome 84
vehicle 83
black 80.3
water 77.1
people 56.7
old 53
city 52.6
vintage 26.9

Face analysis

Amazon

AWS Rekognition

Age 27-37
Gender Male, 88.6%
Calm 94.6%
Sad 2.4%
Angry 1.8%
Happy 0.6%
Disgusted 0.2%
Fear 0.2%
Surprised 0.2%
Confused 0.1%

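The age range, gender, and emotion estimates above match the shape of output from Rekognition's detect_faces call with all facial attributes requested. A minimal sketch, under the same placeholder assumptions as before:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.22094.jpg"}},  # placeholder bucket/key
    Attributes=["ALL"],  # request age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
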
Feature analysis

Amazon

Person
Person 96.9%
Person 96.1%
Person 95.7%
Person 93.5%
Person 90.8%
Person 89.4%
Person 85.8%
Person 85.7%
Person 77.8%
Person 69.4%
Person 68%
Person 66%
Person 64.2%
Person 63.2%
Person 62.3%
Person 59.7%

Categories

Imagga

cars vehicles 99.7%

Text analysis

Amazon

OBELISM
OR R

Google

ORELIS:
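
The readings above are OCR-style text detections; the Amazon values are the kind returned by Rekognition's detect_text, which reports detected lines and words with confidence scores. A minimal sketch, again with placeholder bucket and key:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.22094.jpg"}}  # placeholder bucket/key
)

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip individual WORD detections
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")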