Human Generated Data

Title

Untitled (group portrait of Boy Scouts and adults posing in front of cabins)

Date

1937

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4023

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 96.2
Human 96.2
Person 96.1
Person 95.9
Person 93.9
Nature 93.8
Person 93.5
Outdoors 93.2
Person 90.7
Land 87.9
Woodland 87.9
Tree 87.9
Forest 87.9
Vegetation 87.9
Plant 87.9
Person 80.8
Grove 77.5
Person 76.1
Person 74.1
Person 73.9
Person 69.5
Person 66.1
Person 65.4
Yard 65.4
Countryside 62.6
Fence 62.4
Snow 59.2
Person 56.7
Person 56
Person 46.7

Clarifai
created on 2019-06-01

monochrome 96.5
tree 96.5
cemetery 95.4
war 92.3
people 91.9
sepia 91.8
black and white 91.8
nature 91.5
outdoors 89.6
snow 88.6
winter 88.5
old 87.8
fall 86
no person 85.8
travel 85.3
funeral 84.8
building 84.3
tombstone 84.3
park 84.2
architecture 83.6

Imagga
created on 2019-06-01

fence 100
picket fence 100
barrier 91.1
obstruction 61.1
structure 36.2
landscape 30.5
snow 28.7
trees 24
tree 22.3
old 22.3
travel 20.4
sky 20.4
architecture 20.3
grunge 17.9
vintage 17.4
building 16.5
winter 16.2
tourism 14
antique 13.9
cemetery 13.4
history 12.5
rural 12.3
scenic 12.3
forest 12.2
weather 12
house 11.8
scenery 11.7
park 11.5
ancient 11.3
texture 11.1
beach 11
mountain 10.7
country 10.5
sun 10.5
rock 10.4
scene 10.4
resort 10.3
wood 10
city 10
road 9.9
retro 9.8
vacation 9.8
summer 9.7
construction 9.4
grain 9.2
art 9.1
aged 9.1
sand 8.9
light 8.7
artistic 8.7
cold 8.6
season 8.6
weathered 8.6
culture 8.6
historical 8.5
clouds 8.5
frame 8.3
outdoors 8.2
landmark 8.1
stone 7.9
grass 7.9
holiday 7.9
paper 7.8
outdoor 7.7
old fashioned 7.6
grungy 7.6
field 7.5
hill 7.5
ocean 7.5
style 7.4
land 7.4
brown 7.4
lake 7.3
peaceful 7.3
countryside 7.3
rough 7.3
religion 7.2
farm 7.1
sea 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

tree 99.4
outdoor 98.4
ground 98.1
black and white 77.9
cemetery 75.3
grave 75.3
dirt 20.4
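
Each tag list above pairs a label with a model confidence score on a 0–100 scale. A minimal sketch of filtering such a list to its high-confidence tags, assuming the scores have already been parsed into (label, confidence) pairs (the sample values here are taken from the Amazon list above):

```python
# Sample (label, confidence) pairs from the Amazon tag list above.
labels = [
    ("Person", 96.2), ("Human", 96.2), ("Nature", 93.8),
    ("Outdoors", 93.2), ("Tree", 87.9), ("Snow", 59.2),
]

def high_confidence(pairs, threshold=90.0):
    """Keep only tags at or above the confidence threshold, highest first."""
    return sorted(
        [(tag, score) for tag, score in pairs if score >= threshold],
        key=lambda pair: pair[1],
        reverse=True,
    )

print(high_confidence(labels))
# -> [('Person', 96.2), ('Human', 96.2), ('Nature', 93.8), ('Outdoors', 93.2)]
```

Raising the threshold narrows the list further; at 95.0 only the two Person/Human tags survive, which matches the services' general agreement that people are the most confident detection in this photograph.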

Face analysis

Amazon

AWS Rekognition

Age 23-38
Gender Female, 50.5%
Happy 49.5%
Disgusted 49.5%
Angry 49.5%
Surprised 49.5%
Sad 49.6%
Calm 50.2%
Confused 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50%
Surprised 49.6%
Sad 49.6%
Angry 49.7%
Disgusted 49.6%
Calm 49.9%
Happy 49.6%
Confused 49.6%

AWS Rekognition

Age 23-38
Gender Female, 50.3%
Calm 49.9%
Surprised 49.5%
Disgusted 49.5%
Happy 49.6%
Sad 49.8%
Confused 49.6%
Angry 49.6%

AWS Rekognition

Age 35-52
Gender Female, 50.4%
Confused 49.6%
Disgusted 49.6%
Calm 49.7%
Angry 49.6%
Sad 49.7%
Happy 49.7%
Surprised 49.6%

AWS Rekognition

Age 23-38
Gender Female, 50.2%
Happy 49.5%
Calm 49.6%
Angry 49.7%
Confused 49.6%
Surprised 49.6%
Sad 49.9%
Disgusted 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50.3%
Surprised 49.6%
Confused 49.6%
Disgusted 49.6%
Happy 49.6%
Sad 49.9%
Calm 49.7%
Angry 49.6%

AWS Rekognition

Age 35-55
Gender Female, 50.1%
Disgusted 49.5%
Surprised 49.6%
Angry 49.6%
Confused 49.5%
Sad 49.8%
Calm 49.8%
Happy 49.6%

AWS Rekognition

Age 35-52
Gender Female, 50.1%
Happy 49.6%
Confused 49.6%
Angry 49.7%
Sad 49.6%
Calm 49.9%
Surprised 49.5%
Disgusted 49.7%

AWS Rekognition

Age 20-38
Gender Female, 50.3%
Calm 49.6%
Disgusted 49.7%
Sad 49.8%
Surprised 49.6%
Happy 49.6%
Confused 49.5%
Angry 49.7%

AWS Rekognition

Age 26-43
Gender Female, 50.2%
Happy 49.6%
Surprised 49.6%
Angry 49.6%
Confused 49.6%
Calm 49.8%
Sad 49.8%
Disgusted 49.6%

AWS Rekognition

Age 20-38
Gender Male, 50.3%
Surprised 49.6%
Sad 49.8%
Angry 49.6%
Disgusted 49.8%
Calm 49.6%
Happy 49.6%
Confused 49.6%

AWS Rekognition

Age 14-25
Gender Female, 50.4%
Angry 49.6%
Calm 49.6%
Sad 49.6%
Surprised 49.5%
Disgusted 50.1%
Happy 49.5%
Confused 49.5%

AWS Rekognition

Age 12-22
Gender Male, 50.5%
Angry 49.6%
Sad 49.8%
Happy 49.5%
Confused 49.6%
Surprised 49.6%
Calm 49.6%
Disgusted 49.8%

AWS Rekognition

Age 20-38
Gender Male, 50.5%
Disgusted 49.7%
Calm 49.8%
Sad 49.6%
Confused 49.7%
Angry 49.6%
Surprised 49.6%
Happy 49.6%

AWS Rekognition

Age 35-52
Gender Female, 50.3%
Confused 49.6%
Disgusted 49.6%
Calm 49.6%
Angry 49.6%
Sad 49.7%
Happy 49.9%
Surprised 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50.3%
Surprised 49.6%
Happy 49.8%
Disgusted 49.7%
Calm 49.8%
Sad 49.6%
Confused 49.5%
Angry 49.6%

AWS Rekognition

Age 35-52
Gender Female, 50.5%
Sad 49.6%
Angry 49.6%
Calm 50.1%
Confused 49.6%
Disgusted 49.6%
Happy 49.5%
Surprised 49.5%

AWS Rekognition

Age 17-27
Gender Male, 50.1%
Disgusted 49.5%
Surprised 49.5%
Angry 50%
Confused 49.6%
Sad 49.7%
Calm 49.6%
Happy 49.5%

AWS Rekognition

Age 23-38
Gender Female, 50.4%
Confused 49.6%
Happy 49.6%
Calm 49.7%
Disgusted 49.7%
Sad 49.6%
Angry 49.7%
Surprised 49.6%

AWS Rekognition

Age 20-38
Gender Female, 50.5%
Disgusted 49.5%
Calm 49.6%
Surprised 49.5%
Angry 49.6%
Happy 49.5%
Sad 50.3%
Confused 49.5%

AWS Rekognition

Age 15-25
Gender Male, 50.1%
Angry 49.6%
Happy 49.5%
Calm 50%
Surprised 49.6%
Sad 49.7%
Confused 49.7%
Disgusted 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Angry 49.6%
Disgusted 49.6%
Confused 49.6%
Calm 49.6%
Sad 49.8%
Surprised 49.6%
Happy 49.8%
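
In each face record above, the seven emotion scores cluster tightly around 49–50%, so the model's "dominant" emotion is only nominally dominant. A minimal sketch, assuming one record has been parsed into a dict of percentages (the sample values come from the first AWS Rekognition face above), of picking the top emotion and reporting its margin over the runner-up:

```python
# One face record from the AWS Rekognition output above (values are percentages).
face = {
    "Happy": 49.5, "Disgusted": 49.5, "Angry": 49.5, "Surprised": 49.5,
    "Sad": 49.6, "Calm": 50.2, "Confused": 49.5,
}

def dominant_emotion(emotions):
    """Return the highest-scoring emotion and its margin over the runner-up."""
    ranked = sorted(emotions.items(), key=lambda kv: kv[1], reverse=True)
    top, runner_up = ranked[0], ranked[1]
    return top[0], round(top[1] - runner_up[1], 1)

print(dominant_emotion(face))
# -> ('Calm', 0.6)
```

The sub-1% margin is a useful caveat when reading these tables: the model leans "Calm" for this face, but the near-uniform distribution means the emotion estimate carries little real signal.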

Feature analysis

Amazon

Person 96.2%

Captions

Microsoft

a tree in front of a building 87.4%
a white horse in front of a building 81.3%
a horse in front of a building 81.2%