Human Generated Data

Title

Untitled (group on river bank watching diver, missing child)

Date

c.1970, from 1960 negative

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18754

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 98.3
Person 98.3
Person 95.8
Person 95.7
Person 95.6
Person 95.3
Person 92.7
Person 90.8
Person 88.3
Person 88.3
Person 88.2
Person 87.3
Person 86.1
Person 85.3
Outdoors 83.8
Person 83.5
Person 80.5
Person 79.1
Apparel 78.6
Clothing 78.6
Person 77.9
Person 77.5
Tree 76
Plant 76
Nature 76
People 74.8
Water 74.3
Art 73.5
Vegetation 71.4
Person 68.9
Person 63.8
Land 60.4
Military 59.3
Military Uniform 59.3
Crowd 59.3
Army 56.2
Armored 56.2
Coat 56.1
Pedestrian 55.8
Female 55

Imagga
created on 2022-03-05

structure 45.3
fountain 39.9
landscape 35
river 29.6
water 27.4
park 26.4
travel 24
tourism 23.9
rock 23.5
forest 22.6
tree 22.4
stone 19.6
trees 16.9
scenic 16.7
sky 16.6
scenery 16.2
old 16
waterfall 15.6
mountain 15.1
natural 14.7
lake 14.7
national 14.5
fall 14.5
wall 14
dam 13.5
barrier 13.3
rural 13.2
environment 13.2
canyon 12.6
land 12.5
autumn 12.3
summer 12.2
mountains 12
wet 11.6
bridge 11.4
light 11.4
ancient 11.2
grass 11.1
architecture 11.1
falls 10.8
country 10.5
sun 10.5
outdoors 10.5
building 10.3
swamp 10.3
flow 10.2
peaceful 10.1
colorful 10
tourist 10
outdoor 9.9
landmark 9.9
geology 9.7
vista 9.7
obstruction 9.7
woods 9.6
scene 9.5
rocks 9.4
season 9.4
clouds 9.3
countryside 9.1
road 9
texture 9
vacation 9
colors 8.8
erosion 8.8
leaves 8.8
stream 8.7
branches 8.7
hiking 8.7
wetland 8.3
bright 7.9
sand 7.9
spring 7.9
wild 7.8
high 7.8
orange 7.7
valley 7.7
grunge 7.7
serene 7.5
pattern 7.5
dark 7.5
wood 7.5
earth 7.3
cave 7.3
tranquil 7.2
fence 7.2
black 7.2
snow 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

outdoor 94.6
text 92.7
grave 90.8
cemetery 86.3
tree 78.5
nature 75.6
black and white 57.2

Face analysis

Amazon

AWS Rekognition

Age 31-41
Gender Female, 60.9%
Calm 89.2%
Happy 3.6%
Fear 2.9%
Surprised 2.4%
Sad 0.8%
Disgusted 0.5%
Angry 0.4%
Confused 0.3%

AWS Rekognition

Age 25-35
Gender Male, 99.1%
Surprised 96.2%
Happy 3.3%
Calm 0.2%
Angry 0.1%
Disgusted 0.1%
Sad 0.1%
Fear 0%
Confused 0%

AWS Rekognition

Age 20-28
Gender Female, 95.1%
Calm 39.9%
Happy 39.4%
Disgusted 8.3%
Sad 3.5%
Fear 3.3%
Angry 2.2%
Surprised 2.2%
Confused 1.2%

AWS Rekognition

Age 24-34
Gender Female, 87.5%
Happy 77.2%
Calm 21.4%
Sad 0.8%
Surprised 0.2%
Fear 0.1%
Angry 0.1%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 21-29
Gender Male, 92.1%
Calm 58.6%
Angry 13.3%
Sad 9%
Fear 4.4%
Surprised 4%
Confused 3.7%
Disgusted 3.5%
Happy 3.4%

AWS Rekognition

Age 22-30
Gender Male, 98.8%
Happy 92.9%
Calm 5%
Fear 0.6%
Confused 0.5%
Disgusted 0.4%
Sad 0.2%
Angry 0.2%
Surprised 0.2%

AWS Rekognition

Age 30-40
Gender Male, 98.2%
Calm 70.3%
Happy 26.6%
Sad 0.9%
Angry 0.7%
Confused 0.5%
Surprised 0.4%
Fear 0.3%
Disgusted 0.3%

AWS Rekognition

Age 23-33
Gender Female, 82.6%
Calm 76.2%
Happy 20%
Sad 2.1%
Angry 0.5%
Surprised 0.3%
Fear 0.3%
Confused 0.3%
Disgusted 0.3%

AWS Rekognition

Age 30-40
Gender Male, 92.7%
Happy 61.8%
Angry 24.5%
Calm 5.4%
Sad 4%
Disgusted 1.6%
Confused 1.2%
Surprised 0.9%
Fear 0.6%

Feature analysis

Amazon

Person 98.3%

Captions

Microsoft

a group of people standing next to a waterfall 29.2%
a person standing next to a waterfall 27.5%