Human Generated Data

Title

Port Jefferson

Date

1978

People

Artist: Larry White, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Apeiron Workshops, 2.2002.1647

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Watercraft 99.8
Transportation 99.8
Vessel 99.8
Vehicle 99.8
Person 99.3
Human 99.3
Person 99
Boat 98.3
Wood 84
Water 66.5
Apparel 65.5
Clothing 65.5
Waterfront 58
Plywood 57.1
Cruiser 57.1
Military 57.1
Ship 57.1
Navy 57.1
Port 56.7
Pier 56.7
Dock 56.7

Imagga
created on 2022-01-09

boat 53.6
marina 49.6
water 44.1
ship 43
pier 42
sea 40.7
ocean 33.7
boats 33.1
rigging 32.4
yacht 31.4
vessel 31.2
travel 31
harbor 28.9
port 27
gear 26
sail 24.3
sky 23.7
bay 21.7
support 21.6
dock 21.4
device 21.2
sailboat 20.3
bridge 19.9
vacation 19.7
equipment 19.2
tourism 19
river 18.7
sailing 18.5
transport 17.4
mast 16.7
deck 16.1
nautical 15.6
transportation 15.3
summer 14.8
industry 14.5
sunset 14.4
coast 14.4
city 14.2
tourist 13.6
reflection 13
holiday 12.9
landscape 12.7
leisure 12.5
marine 12.4
landmark 11.8
luxury 11.2
beach 11.1
cloud 10.3
shore 10.2
vehicle 10.1
island 10.1
industrial 10
moored 9.9
tower 9.9
ships 9.8
cruise 9.7
craft 9.6
waterfront 9.6
architecture 9.4
clouds 9.3
end 9.2
steel 9.2
scenery 9
building 9
yachts 8.9
sun 8.9
wharf 8.8
helm 8.8
float 8.8
navigation 8.7
fishing 8.7
rope 8.5
destination 8.4
famous 8.4
sport 8.3
steering system 8
urban 7.9
scene 7.8
sunny 7.8
skyline 7.6
fisherman 7.6
outdoors 7.5
town 7.4
lake 7.3
calm 7.3
recreation 7.2

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

ship 99.7
sky 98.6
outdoor 98.4
watercraft 97.4
water 97.3
boat 97.1
text 94.1
black and white 82.9
vehicle 71.1

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 87.5%
Sad 69.6%
Confused 6%
Disgusted 5.3%
Happy 5%
Calm 4.7%
Angry 4.4%
Fear 4.3%
Surprised 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 99.3%
Boat 98.3%

Captions

Microsoft

a boat is docked next to a body of water 78.6%
a boat docked next to a body of water 78.5%
a large ship in a body of water 78.4%