Human Generated Data

Title

Untitled (Bogota)

Date

1978

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5159

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Person 99.6
Human 99.6
Person 99.4
Person 99.2
Person 99
Person 98.7
Person 97.5
Soil 96.9
Nature 94.6
Outdoors 92.3
Archaeology 88.8
Rock 72.3
Clothing 71.9
Apparel 71.9
People 70.1
Land 60.8
Helmet 55.9
Hardhat 55.9
Person 42.3

Clarifai
created on 2019-11-15

people 99.6
group 97.7
man 97.5
adult 95
group together 93.8
many 89.5
military 86.5
child 86.4
wear 85.1
war 85
woman 84.6
street 84.6
recreation 82.5
several 79.3
soldier 78.4
illustration 76
print 74.9
art 74.2
two 73.8
one 72.4

Imagga
created on 2019-11-15

travel 23.2
landscape 23.1
rock 20.8
structure 20.8
balcony 20.3
architecture 19.8
thatch 19.5
tourist 19.2
sky 19.2
roof 18.1
stone 17.8
city 17.5
mountain 17.4
history 17
snow 16.8
tree 16.4
old 16
water 15.3
billboard 14.8
park 14.1
outdoor 13.8
tourism 13.2
summer 12.9
protective covering 12.2
signboard 12
river 11.6
scenic 11.4
winter 11.1
building 10.9
landmark 10.8
sand 10.8
covering 10.7
traveler 10.4
black 10.2
outdoors 10
person 10
vacation 9.8
urban 9.6
forest 9.6
season 9.4
mountains 9.3
house 9.2
wall 9.1
weather 9
scene 8.7
ancient 8.6
day 8.6
cold 8.6
road 8.1
sun 8
trees 8
rural 7.9
cloud 7.7
construction 7.7
beach 7.7
relaxation 7.5
famous 7.4
street 7.4
world 7.3
national 7.2
scenery 7.2

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

text 96.8
person 90.4
mountain 88
outdoor 86.8
people 74.2
hiking 73.1
tree 72.4
black and white 71.2
man 57
old 40.7
picture frame 16.4

Face analysis

Amazon

AWS Rekognition

Age 42-60
Gender Female, 50%
Surprised 49.7%
Sad 49.6%
Confused 49.6%
Happy 49.5%
Disgusted 49.5%
Fear 49.9%
Angry 49.5%
Calm 49.6%

AWS Rekognition

Age 51-69
Gender Male, 54.1%
Surprised 45%
Calm 45.1%
Fear 45.3%
Disgusted 45%
Happy 45%
Angry 45.1%
Sad 54.4%
Confused 45%

AWS Rekognition

Age 16-28
Gender Female, 50.2%
Surprised 49.5%
Fear 49.5%
Happy 49.5%
Sad 50.4%
Calm 49.5%
Disgusted 49.5%
Angry 49.5%
Confused 49.5%

AWS Rekognition

Age 23-35
Gender Male, 50.2%
Disgusted 49.5%
Confused 49.5%
Angry 49.5%
Sad 50.5%
Happy 49.5%
Surprised 49.5%
Calm 49.5%
Fear 49.5%

AWS Rekognition

Age 17-29
Gender Male, 50.4%
Happy 49.5%
Angry 50.2%
Confused 49.5%
Calm 49.6%
Disgusted 49.5%
Fear 49.7%
Surprised 49.5%
Sad 49.5%

AWS Rekognition

Age 33-49
Gender Male, 50.5%
Calm 49.7%
Angry 49.5%
Disgusted 49.5%
Happy 49.5%
Sad 50.3%
Confused 49.5%
Fear 49.5%
Surprised 49.5%

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a group of people standing in front of a mountain 83.8%
a group of people in front of a mountain 83.7%
a group of people sitting in front of a mountain 78.8%