Human Generated Data

Title

Untitled (search for drowning victim)

Date

1959

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18862

Machine Generated Data

Tags

Amazon
created on 2022-03-05 (tag, confidence score out of 100)

Person 99.4
Human 99.4
Person 99.3
Person 99.3
Person 99
Person 97.9
Person 96.7
Watercraft 95.4
Vessel 95.4
Vehicle 95.4
Transportation 95.4
Boat 95.2
Person 95.1
Person 94.5
Person 94.4
Rowboat 93.3
Person 93
Person 92.5
Person 89.5
Outdoors 86.5
Person 84.9
Person 82.2
Water 76.8
Person 75.3
Person 74
Person 71.7
Person 67.2
Person 64.8
Nature 63.3
Canoe 55.5
Person 44.5
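
The tag list above is the kind of output Amazon Rekognition's label detection returns: one label per line with a confidence score. As a rough illustration only (not the actual pipeline behind this record), a call like the following could produce such a list; the file name is a placeholder and configured AWS credentials are assumed.

```python
# Illustrative sketch: label detection with Amazon Rekognition via boto3.
# Assumes AWS credentials are configured; the file name is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder image file
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=40,  # the list above includes labels down to ~44
)

# Print one "Label Confidence" line per detection, as in the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```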

Imagga
created on 2022-03-05 (tag, confidence score out of 100)

boat 52.3
water 50.1
fountain 42
structure 32.5
river 29.4
sea 25.8
gondola 23.4
travel 22.5
lake 21.3
fishing 21.2
ocean 20.7
tourism 20.6
village 19.4
landscape 19.3
boats 18.4
vessel 17.3
city 15.8
architecture 15.8
vacation 15.5
building 15
summer 14.8
canal 14.8
ship 14.3
sky 14
reflection 13.9
outdoors 13.6
park 13.6
pond 13.5
old 13.2
house 12.9
tree 12.7
port 12.5
trees 12.5
town 12.1
pier 12
transport 11.9
shore 11
tourist 10.9
coast 10.8
dock 10.7
people 10.6
harbor 10.6
fisherman 10.5
fish 10.1
scenic 9.7
holiday 9.3
transportation 9
wet 8.9
sunny 8.6
cityscape 8.5
destination 8.4
sport 8.4
traditional 8.3
historic 8.3
island 8.2
tranquil 8.1
landmark 8.1
channel 8
shoreline 7.9
outdoor 7.6
paddle 7.6
leisure 7.5
famous 7.4
craft 7.3
calm 7.3
sun 7.2
recreation 7.2
colorful 7.2
day 7.1
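
Imagga exposes image tagging through a REST API, and a request along the following lines could yield a tag/confidence list like the one above. This is a hedged sketch: the API key, secret, and image URL are placeholders, and the response fields should be checked against Imagga's current v2 documentation.

```python
# Hedged sketch of Imagga's v2 tagging endpoint; key, secret, and URL are
# placeholders, and field names should be verified against Imagga's docs.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth with key/secret
    timeout=30,
)
resp.raise_for_status()

# Each entry carries an English tag name and a 0-100 confidence score.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```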

Microsoft
created on 2022-03-05 (tag, confidence score out of 100)

outdoor 98.6
text 97.4
water 91.6
ship 86.3
watercraft 83.9
black and white 67.1
white 63.6
old 56.3
boat 18.3
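
The Microsoft tags resemble output from Azure Computer Vision's analyze operation with the Tags visual feature. The sketch below is illustrative only: the endpoint, subscription key, and image URL are placeholders, and Azure reports confidence on a 0-1 scale, so the values are scaled to match the 0-100 style used in this record.

```python
# Illustrative sketch of Azure Computer Vision v3.2 "analyze" with the Tags
# feature. Endpoint, key, and image URL are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                     # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},  # placeholder image URL
    timeout=30,
)
resp.raise_for_status()

# Azure reports confidence in [0, 1]; scale by 100 to match this record.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```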

Face analysis

Amazon

AWS Rekognition (one block per detected face)

Age 25-35
Gender Male, 97.5%
Happy 63.3%
Calm 21.5%
Sad 5.9%
Angry 5.9%
Fear 1.3%
Surprised 1%
Disgusted 0.8%
Confused 0.3%

AWS Rekognition

Age 26-36
Gender Female, 56.1%
Calm 42.1%
Happy 25%
Sad 23%
Disgusted 3.7%
Angry 2.5%
Confused 1.5%
Surprised 1.3%
Fear 0.8%

AWS Rekognition

Age 18-24
Gender Female, 91.3%
Happy 83.7%
Fear 10.5%
Surprised 2.4%
Sad 1.1%
Calm 0.7%
Confused 0.6%
Angry 0.5%
Disgusted 0.5%

AWS Rekognition

Age 39-47
Gender Male, 87.7%
Sad 78.6%
Calm 13.9%
Confused 3.7%
Happy 1.1%
Disgusted 0.8%
Fear 0.7%
Angry 0.6%
Surprised 0.5%

AWS Rekognition

Age 28-38
Gender Male, 83.8%
Calm 92.1%
Happy 3.1%
Surprised 1.5%
Fear 1%
Sad 0.8%
Disgusted 0.7%
Confused 0.5%
Angry 0.4%

AWS Rekognition

Age 39-47
Gender Female, 96.1%
Happy 96.9%
Calm 2%
Sad 0.5%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%
Confused 0.1%
Surprised 0%
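
Each block above corresponds to one face detected by AWS Rekognition, with an estimated age range, a gender guess, and emotion scores. A minimal sketch of such a call is shown below; the file name is a placeholder and configured AWS credentials are assumed.

```python
# Illustrative sketch of per-face analysis with AWS Rekognition DetectFaces.
# Assumes configured AWS credentials; the file name is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder image file
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unsorted; sort descending to mirror the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```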

Feature analysis

Amazon

Person 99.4%
Boat 95.2%

Captions

Microsoft

a group of people standing next to a body of water 95.5%
a group of people standing in a body of water 94.3%
a group of people standing in front of a body of water 94.2%
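
Captions like these can be generated with Azure Computer Vision's describe operation, which returns ranked caption candidates with confidences. The sketch below is an assumption-laden illustration: endpoint, key, and image URL are placeholders, and the exact response shape should be verified against the v3.2 documentation.

```python
# Hedged sketch of Azure Computer Vision v3.2 "describe", which returns
# ranked caption candidates. Endpoint, key, and URL are placeholders; the
# response is parsed defensively in case the "description" wrapper differs.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                     # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},  # this record lists three candidates
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},  # placeholder image URL
    timeout=30,
)
resp.raise_for_status()

data = resp.json()
for caption in data.get("description", data).get("captions", []):
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}')
```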

Text analysis

Amazon

M.M.117
M.M.117 YT33AS
YT33AS
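
The detected strings above are consistent with OCR-style text detection such as AWS Rekognition's DetectText, which returns both full lines and their individual words. A minimal, illustrative sketch (placeholder file name, configured credentials assumed):

```python
# Illustrative sketch of text detection with AWS Rekognition DetectText.
# Assumes configured AWS credentials; the file name is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder image file
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# Rekognition returns both LINE and WORD detections, which is why the list
# above repeats "M.M.117" and "YT33AS" as parts of a longer line.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```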