Human Generated Data

Title

Untitled (family on bicycle-powered raft on lake)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17359

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.6
Human 99.6
Person 98.8
Water 96.6
Vehicle 93.1
Transportation 93.1
Vessel 92.7
Watercraft 92.7
Person 92.6
Waterfront 88
Outdoors 84.3
Boat 84.3
Rowboat 83.9
Person 73.4
Pier 73.1
Port 73.1
Dock 73.1
Photography 61
Photo 61
People 59.3
Canoe 55.1
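
The label/confidence pairs above are the kind of output returned by Amazon Rekognition's object and scene detection. A minimal sketch of such a request with the boto3 SDK follows; the file name, confidence threshold, and credential setup are illustrative assumptions, not part of this record.

# Sketch: fetch scene/object labels for a local image with Amazon Rekognition.
# "photo.jpg" and the 55% threshold are illustrative; AWS credentials are
# assumed to be configured in the environment.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # roughly the lowest confidence shown above (Canoe 55.1)
    )

for label in response["Labels"]:
    # Each label carries a name and a confidence score in percent.
    print(f"{label['Name']} {label['Confidence']:.1f}")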

Imagga
created on 2022-02-26

water 46.1
ocean 43.1
beach 37.3
paddle 36.2
sea 32.3
lake 32
oar 30.1
reflection 26
sunset 25.2
landscape 23.8
silhouette 23.2
sky 22.5
device 20.7
sun 20.5
sand 20.4
coast 19.8
outdoors 18.7
shore 18
river 17.4
pond 17.3
travel 16.9
outdoor 16.1
park 15.9
bird 15.9
scull 15.8
boat 15.5
horizon 15.3
sunrise 15
waves 14.9
recreation 14.4
dawn 13.6
clouds 13.5
summer 13.5
dusk 13.4
coastline 13.2
vacation 13.1
wave 13
fisherman 12.8
man 12.8
fishing 12.5
island 11.9
calm 11.9
outrigger 11.8
morning 11.8
scenic 11.4
male 11.4
blade 11.2
sport 11
tree 11
relax 11
scenery 10.8
holiday 10.8
wildlife 10.7
trees 10.7
pier 10.3
stabilizer 10
swim 9.6
black 9.6
light 9.4
evening 9.3
fly 9.3
tourism 9.1
active 9
people 8.9
birds 8.8
support 8.7
wild 8.7
natural 8.7
surf 8.7
cloud 8.6
seascape 8.6
rotating mechanism 8.6
pelican 8.5
tropical 8.5
bay 8.5
shoreline 8.5
relaxation 8.4
leisure 8.3
peaceful 8.2
tranquil 8.2
sunlight 8
outside 7.7
quiet 7.7
winter 7.7
trip 7.5
holidays 7.5
body of water 7.2
activity 7.2
wet 7.2
romantic 7.1
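
The tags above are auto-tagging output from Imagga. A minimal sketch of the equivalent REST call is below, assuming the v2 /tags endpoint and placeholder credentials; the response field names reflect Imagga's documented shape but should be verified against the current API reference.

# Sketch: request auto-tags for a local image from Imagga's v2 tagging API.
# The API key/secret and file name are placeholders.
import requests

with open("photo.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
        files={"image": f},
    )

for item in response.json()["result"]["tags"]:
    # Each entry pairs an English tag with a confidence score.
    print(f"{item['tag']['en']} {item['confidence']:.1f}")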

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 99.7
water 99.5
lake 96.7
outdoor 96.7
boat 87.9
watercraft 84.7
black and white 83
person 52.8

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 97.5%
Calm 99.3%
Sad 0.3%
Surprised 0.1%
Angry 0.1%
Happy 0.1%
Disgusted 0.1%
Fear 0%
Confused 0%

AWS Rekognition

Age 19-27
Gender Female, 89.8%
Calm 81.5%
Happy 12.8%
Sad 2.3%
Surprised 0.9%
Disgusted 0.7%
Fear 0.7%
Confused 0.6%
Angry 0.5%
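
The two face records above (estimated age range, gender, and an emotion distribution) match the shape of Amazon Rekognition's face analysis output. A minimal sketch, assuming a local file and the full attribute set:

# Sketch: per-face age range, gender, and emotion estimates from Rekognition.
# "photo.jpg" is illustrative; Attributes=["ALL"] requests the full detail set.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort to list the dominant emotion first.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")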

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
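
The Google Vision blocks above report likelihood buckets (Very unlikely, Unlikely, ...) rather than percentages. A minimal sketch using the google-cloud-vision client; the file name is illustrative, credentials are assumed to be configured, and the enum access assumes the 2.x (proto-plus) client.

# Sketch: face detection with Google Cloud Vision, printing the same
# likelihood buckets (VERY_UNLIKELY ... VERY_LIKELY) shown above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)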

Feature analysis

Amazon

Person 99.6%
Boat 84.3%

Captions

Microsoft

a group of people on a boat in the water 85.2%
a group of people in a boat on the water 85.1%
a group of people in a boat in the water 85%
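
The three ranked captions above are the kind of output returned by Azure Computer Vision's Describe Image operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are placeholders.

# Sketch: ranked image captions from Azure Computer Vision's Describe operation.
# Endpoint, key, and file name are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

with open("photo.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    # Confidence is returned as a 0-1 fraction; the record above shows percentages.
    print(f"{caption.text} {caption.confidence * 100:.1f}%")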

Text analysis

Amazon

NAOOY
DMS
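
The two strings above are OCR output from the photograph; Amazon Rekognition's text detection returns such results as TextDetections. A minimal sketch, with the file name illustrative:

# Sketch: text (OCR) detection with Amazon Rekognition.
# "photo.jpg" is illustrative; LINE-level detections correspond to the
# strings listed above.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])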