Human Generated Data

Title

Untitled (Panama)

Date

1978

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5161

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Person 99.4
Human 99.4
Person 99.3
Outdoors 92.8
Tree 92.7
Plant 92.7
Garden 82.7
Apparel 78.1
Clothing 78.1
Skateboard 72.6
Sport 72.6
Sports 72.6
Arbour 71.6
Pedestrian 70.8
Asphalt 68.6
Tarmac 68.6
Arecaceae 68.5
Palm Tree 68.5
Person 66.5
People 64.7
Coat 62.6
Overcoat 62.6
Nature 62
Person 55.2
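
The label/confidence pairs above follow the shape of an AWS Rekognition `detect_labels` response (a list of `Labels`, each with a `Name` and a `Confidence`). A minimal sketch of flattening such a response into the tag list shown here; the sample response values are illustrative, not the actual API output for this photograph:

```python
# Flatten a Rekognition-style detect_labels response into "Name Confidence"
# pairs like the ones listed above, highest confidence first.

def flatten_labels(response, min_confidence=55.0):
    """Return (name, confidence) pairs sorted by descending confidence."""
    pairs = [
        (label["Name"], round(label["Confidence"], 1))
        for label in response.get("Labels", [])
        if label["Confidence"] >= min_confidence
    ]
    return sorted(pairs, key=lambda p: p[1], reverse=True)

# Illustrative sample shaped like a detect_labels response.
sample = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.44},
        {"Name": "Tree", "Confidence": 92.71},
        {"Name": "Garden", "Confidence": 82.66},
    ]
}

for name, conf in flatten_labels(sample):
    print(name, conf)
```

The `min_confidence` cutoff mirrors how such dumps typically omit low-scoring labels; the real threshold used for this record is not stated.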

Clarifai
created on 2019-11-15

people 99.7
man 96.7
adult 96.6
street 96.5
woman 95.9
group 95.1
one 93.6
two 93
monochrome 91.8
child 89.3
tree 88.1
family 83.8
music 83.6
wear 83
city 83
silhouette 82.7
portrait 81.5
group together 79.8
shadow 79.6
art 79.1

Imagga
created on 2019-11-15

structure 32.7
snow 31.4
billboard 29.5
building 25.1
architecture 23.2
signboard 21.7
city 20.8
old 19.5
street 16.6
winter 16.2
tree 16.1
sky 15.9
trees 14.2
landscape 14.1
weather 13.8
university 13.6
stone 13.6
park 13.5
backboard 13.1
cold 12.9
history 12.5
travel 12
grunge 11.9
house 11.7
fountain 11.3
facade 11.2
season 10.9
religion 10.8
lamp 10.5
monument 10.3
black 10.2
light 10
frame 10
outdoor 9.9
tourism 9.9
fence 9.7
forest 9.6
antique 9.5
ancient 9.5
gas pump 9.5
brick 9.4
church 9.2
window 9.2
memorial 9.2
equipment 9.1
statue 8.4
sign 8.3
vintage 8.3
balcony 8.2
historic 8.2
outdoors 8.2
road 8.1
picket fence 7.9
design 7.9
art 7.9
day 7.8
lantern 7.8
scene 7.8
color 7.8
wall 7.7
frozen 7.6
pump 7.5
town 7.4
ice 7.4
exterior 7.4
landmark 7.2

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

tree 100
text 97.7
street 97.3
black and white 95
monochrome 85.5
person 79.2
clothing 77.1
gallery 72.1
house 53
statue 51.8
picture frame 9.5

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Female, 53%
Calm 45.9%
Sad 45.4%
Happy 45%
Angry 53.1%
Fear 45.4%
Confused 45.1%
Surprised 45.1%
Disgusted 45%

AWS Rekognition

Age 29-45
Gender Female, 52.4%
Confused 45%
Angry 45%
Surprised 45%
Calm 45%
Disgusted 45%
Fear 45%
Sad 54.9%
Happy 45%

AWS Rekognition

Age 35-51
Gender Female, 50.1%
Happy 49.8%
Angry 49.5%
Disgusted 49.5%
Confused 49.7%
Sad 49.7%
Calm 49.7%
Surprised 49.6%
Fear 49.6%
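
Each block above follows the shape of one `FaceDetail` record from AWS Rekognition's `detect_faces`: an `AgeRange` with `Low`/`High`, a `Gender` with `Value` and `Confidence`, and a list of `Emotions` scored by confidence. A minimal sketch of rendering one such record into the lines shown here; the sample record is illustrative, not the actual output for this photograph:

```python
# Render a Rekognition-style FaceDetail record into age/gender/emotion
# summary lines like the ones listed above.

def summarize_face(detail):
    """Return summary lines: age range, gender, then emotions by confidence."""
    lines = [
        "Age {Low}-{High}".format(**detail["AgeRange"]),
        "Gender {Value}, {Confidence:.1f}%".format(**detail["Gender"]),
    ]
    emotions = sorted(detail["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    lines += ["{Type} {Confidence:.1f}%".format(**e) for e in emotions]
    return lines

# Illustrative sample shaped like one FaceDetail record.
sample = {
    "AgeRange": {"Low": 23, "High": 35},
    "Gender": {"Value": "Female", "Confidence": 53.0},
    "Emotions": [
        {"Type": "Calm", "Confidence": 45.9},
        {"Type": "Angry", "Confidence": 53.1},
    ],
}

for line in summarize_face(sample):
    print(line)
```

Note that the emotion scores in such records are independent confidences, not probabilities that sum to 100, which is why several near-45% values can appear alongside one dominant emotion.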

Feature analysis

Amazon

Person 99.4%
Skateboard 72.6%

Captions

Microsoft

a person standing in front of a window 60.7%
a person that is standing in front of a window 58.4%
a person standing in front of a window 58.3%