Human Generated Data

Title

Untitled (men at funeral in open space, Albuquerque, New Mexico)

Date

c. 1950, printed later

People

Artist: Jack Rodden Studio, American, 1914–2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.359

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.5
Person 99.5
Person 99.3
Nature 99.2
Outdoors 99
Person 98.9
Person 98.3
Person 97.4
Person 97.1
Ground 94.6
Soil 93.8
Person 93.6
Person 88
Countryside 88
Person 85.5
Person 81.8
Person 80.8
Person 79.7
Person 78.7
Person 78.2
Rural 77.6
Tent 60
Sand 59.8
People 58.8
Shelter 57.4
Building 57.4
Field 55.8
Road 55.7
Hut 55.2
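Label lists like the Amazon block above are the typical output shape of AWS Rekognition's DetectLabels API: each label carries a name and a confidence score, sorted descending. The museum's actual pipeline is not documented here, so the sketch below is an assumption; the API call is shown only in comments, and the sample response simply echoes a few values from the record above.

```python
# Sketch: how a Rekognition-style label response maps onto the
# "Name Confidence" lines shown above. A real call would use boto3
# (assumption -- not part of this record):
#   client = boto3.client("rekognition")
#   resp = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)

def format_labels(response: dict) -> list[str]:
    """Sort labels by confidence and render one 'Name Confidence' line each."""
    labels = sorted(response["Labels"], key=lambda l: l["Confidence"], reverse=True)
    # ":g" drops a trailing ".0", matching lines like "Outdoors 99" above.
    return [f"{l['Name']} {round(l['Confidence'], 1):g}" for l in labels]

# Illustrative response echoing a few values from the record above.
sample = {"Labels": [
    {"Name": "Person", "Confidence": 99.5},
    {"Name": "Nature", "Confidence": 99.2},
    {"Name": "Outdoors", "Confidence": 99.0},
    {"Name": "Tent", "Confidence": 60.0},
]}
print(format_labels(sample))  # → ['Person 99.5', 'Nature 99.2', 'Outdoors 99', 'Tent 60']
```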

Imagga
created on 2022-01-23

shovel 33.7
sky 31.3
landscape 28.3
tool 26.6
hand tool 25.3
beach 22.1
tree 22
sea 18
cemetery 18
field 17.6
grass 17.4
scenery 16.2
sand 16.2
rural 15
summer 14.8
travel 14.8
sunset 14.4
clouds 14.4
fence 14.2
outdoors 13.6
farm 13.4
countryside 12.8
water 12.7
ocean 12.4
agriculture 12.3
cloud 12.1
horizon 10.8
outdoor 10.7
pole 10.7
weapon 10.7
sun 10.5
palm 10.3
day 10.2
people 10
tourism 9.9
vacation 9.8
sunny 9.5
outside 9.4
island 9.2
park 9.1
old 9.1
meadow 9
mountain 8.9
trees 8.9
picket fence 8.9
country 8.8
scenic 8.8
missile 8.7
scene 8.7
land 8.5
tropical 8.5
sunrise 8.4
relax 8.4
leisure 8.3
environment 8.2
coast 8.1
river 8
holiday 7.9
fishing 7.7
desert 7.6
paradise 7.5
coastline 7.5
wind 7.5
evening 7.5
silhouette 7.4
peaceful 7.3
spring 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

tree 99.3
sky 99.3
outdoor 99.1
person 92.2
text 87.4

Face analysis

Amazon

AWS Rekognition

Age 27-37
Gender Male, 99.4%
Happy 69.6%
Angry 19.2%
Confused 4.3%
Surprised 2.4%
Sad 1.7%
Calm 1.4%
Disgusted 1%
Fear 0.4%

AWS Rekognition

Age 21-29
Gender Male, 98.8%
Sad 81.4%
Fear 7.2%
Angry 3.4%
Disgusted 3.4%
Calm 2.4%
Happy 0.8%
Confused 0.8%
Surprised 0.5%

AWS Rekognition

Age 29-39
Gender Male, 98.5%
Sad 56.8%
Angry 24.8%
Confused 8.4%
Calm 5.1%
Disgusted 1.8%
Happy 1.3%
Surprised 1.1%
Fear 0.7%

AWS Rekognition

Age 14-22
Gender Male, 98.9%
Calm 48.6%
Sad 36%
Surprised 7.6%
Confused 3.1%
Angry 1.6%
Disgusted 1.3%
Fear 1%
Happy 0.9%

AWS Rekognition

Age 11-19
Gender Male, 99%
Calm 48.3%
Sad 27.8%
Disgusted 6.3%
Surprised 5%
Confused 4.8%
Fear 4.5%
Angry 1.8%
Happy 1.6%

Feature analysis

Amazon

Person 99.5%
Tent 60%

Captions

Microsoft

a group of giraffe standing next to a palm tree 62.5%
a herd of giraffe standing next to a palm tree 59.9%
a group of people walking down a dirt road 59.8%