Human Generated Data

Title

Untitled (view of crates and fisherman on beach, Nazaré, Portugal)

Date

1967

People

Artist: Gordon W. Gahan, American, 1945–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.547.2

Machine Generated Data

Tags

Amazon
created on 2019-08-09

Human 99.3
Person 99.3
Person 98.9
Outdoors 78.5
Water 73.9
Nature 66.7
Portrait 61.4
Photography 61.4
Photo 61.4
Face 61.4
Building 55.8
Apparel 55.2
Clothing 55.2
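The Amazon labels above are (tag, confidence) pairs. As a minimal sketch, filtering them by a confidence threshold might look like this; the label names and scores are copied from the record, while the threshold value of 70.0 is an illustrative assumption:

```python
# Machine-generated tags from the Amazon section above, as (label, confidence) pairs.
labels = [
    ("Human", 99.3), ("Person", 99.3), ("Person", 98.9),
    ("Outdoors", 78.5), ("Water", 73.9), ("Nature", 66.7),
    ("Portrait", 61.4), ("Photography", 61.4), ("Photo", 61.4),
    ("Face", 61.4), ("Building", 55.8), ("Apparel", 55.2),
    ("Clothing", 55.2),
]

def filter_labels(pairs, threshold):
    """Keep only labels whose confidence meets or exceeds the threshold."""
    return [(name, score) for name, score in pairs if score >= threshold]

# Keep only the high-confidence tags (threshold is an arbitrary example value).
high_confidence = filter_labels(labels, 70.0)
print(high_confidence)
# [('Human', 99.3), ('Person', 99.3), ('Person', 98.9), ('Outdoors', 78.5), ('Water', 73.9)]
```

Lowering the threshold admits the more speculative tags (e.g. "Building" at 55.8), which is the usual trade-off when consuming this kind of machine-generated metadata.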

Clarifai
created on 2019-08-09

people 99.7
adult 98.3
one 96.8
vehicle 95.4
watercraft 95.1
man 94.7
group together 93.1
monochrome 92.7
transportation system 91
woman 90.2
two 88.9
water 87.6
ocean 82.8
wear 82
sea 81.6
group 80.9
military 79.6
beach 78.7
child 75.4
athlete 75.2

Imagga
created on 2019-08-09

sea 57.9
ship 51.7
ocean 47.9
water 46
vessel 36.1
beach 35.5
boat 35.5
sunset 32.3
sky 30.7
travel 28.9
sun 24.7
coast 23.3
container ship 22
cargo ship 21
landscape 20.8
tourism 19.8
fishing 19.2
silhouette 19
harbor 18.3
summer 18
sand 18
reflection 17.9
sunrise 17.8
shore 17.7
waves 17.6
island 17.4
dusk 17.2
horizon 17.1
bay 16
coastline 16
transportation 14.3
evening 14
craft 13.6
shipping 13.1
fisherman 12.9
tourist 12.7
pier 12.5
vacation 12.3
holiday 12.2
shoreline 11.8
boats 11.6
port 11.6
light 11.4
wreck 11.2
cloud 11.2
yacht 11.1
transport 11
relax 10.9
tranquil 10.9
recreation 10.8
orange 10.7
seascape 10.5
marina 10.2
lake 10.1
people 10
dock 9.7
sail 9.7
outdoors 9.7
dawn 9.7
wave 9.5
seaside 9.4
man 9.4
shipwreck 9.2
calm 9.1
leisure 9.1
fish 9.1
landmark 9
nautical 8.7
scene 8.7
golden 8.6
clouds 8.5
black 8.4
city 8.3
morning 8.1
sunlight 8
clear 7.8
sailing 7.8
cruise 7.8
pacific 7.7
device 7.7
tropical 7.7
outdoor 7.6
famous 7.4
relaxing 7.3
scenery 7.2
line 7.2
romantic 7.1
day 7.1
wooden 7
scenic 7

Microsoft
created on 2019-08-09

text 99.2
black and white 92.3
person 81.5
water 72.5
clothing 50.1

Face analysis

Amazon

AWS Rekognition

Age 33-49
Gender Male, 54%
Angry 47.5%
Happy 45.2%
Surprised 50.2%
Disgusted 45.2%
Calm 45.8%
Fear 46%
Confused 45.1%
Sad 45.1%

Feature analysis

Amazon

Person 99.3%

Text analysis

Google

SVE
SVE