Human Generated Data

Title

Taking Up The Eel-Net

Date

1885

People

Artist: Peter Henry Emerson, British, English 1856 - 1936

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Richard and Ronay Menschel, P2001.12

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 99.3
Person 99.3
Person 99.1
Watercraft 96.7
Vessel 96.7
Vehicle 96.7
Transportation 96.7
Boat 84.2
Wood 74.3
Art 65.9
Photography 58.3
Photo 58.3
Ship 56.2

Clarifai
created on 2023-10-15

water 99.7
boat 99.6
lake 98.9
people 98.7
fisherman 98.6
river 98.5
canoe 98
watercraft 98
reflection 97.8
man 97.3
sea 96.2
beach 95.4
rowboat 95.4
sunset 94.2
two 93.4
woman 92.9
monochrome 92.5
travel 92.3
ocean 92.1
child 91.8

Imagga
created on 2021-12-14

ship 100
vessel 100
wreck 100
craft 71.8
water 43.4
sea 42.3
ocean 39.2
vehicle 36.3
boat 34.8
sky 29.4
shipwreck 27.8
beach 25.5
coast 25.2
travel 22.6
harbor 21.2
landscape 20.1
sunset 18.9
river 18.7
shore 17.7
tourism 17.3
coastline 16.9
port 16.4
fishing 16.4
waves 14.9
dock 14.6
bay 14.2
lake 13.8
boats 13.6
clouds 13.5
summer 13.5
sand 13.2
sunrise 13.1
transport 12.8
fisherman 12.5
silhouette 12.4
outdoor 12.2
outdoors 12
nautical 11.7
transportation 11.7
tourist 10.9
vacation 10.7
old 10.5
wave 10.4
cloud 10.3
pier 10.3
power 10.1
island 10.1
industrial 10
wharf 9.8
reflection 9.8
rock 9.6
sport 9.1
sailing 8.8
sail 8.8
surf 8.7
industry 8.5
relaxation 8.4
people 8.4
scenery 8.1
sun 8.1
tide 7.8
dawn 7.7
seascape 7.7
bridge 7.6
evening 7.5
architecture 7

Microsoft
created on 2021-12-14

water 96.9
lake 94.2
boat 92.6
indoor 91.3
watercraft 90.4
ship 90.1
text 89.7
gallery 84.8
person 81.4
vehicle 79.4
room 47.9
picture frame 11.6

Face analysis

Amazon

AWS Rekognition

Age 9-19
Gender Male, 52.3%
Calm 58.8%
Sad 39.3%
Confused 1%
Happy 0.5%
Angry 0.2%
Surprised 0.1%
Fear 0.1%
Disgusted 0%

AWS Rekognition

Age 32-48
Gender Female, 61.3%
Calm 89.5%
Sad 4.7%
Happy 2.2%
Fear 1.1%
Confused 1%
Angry 0.6%
Surprised 0.5%
Disgusted 0.2%

Feature analysis

Amazon

Person 99.3%
Boat 84.2%
