Human Generated Data

Title

Untitled (L. A.)

Date

1983

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5258

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Human 97.8
Person 96.9
Person 94.5
Soil 92.4
Person 89.7
Nature 87.4
Outdoors 80.5
Painting 80.2
Art 80.2
Military 60.6
Army 60.6
Armored 60.6
Military Uniform 60.6
Archaeology 56.7

Clarifai
created on 2019-11-15

people 99.6
group 97.5
adult 95.3
man 93.8
canine 89.9
wear 89.3
dog 89.1
art 88
many 87.2
mammal 87.2
one 85.7
child 85.1
military 84.4
group together 83.4
interaction 83.2
two 83
old 81.6
illustration 81.1
leader 78.7
print 77.8

Imagga
created on 2019-11-15

beach 39
sand 29.5
water 28.7
ocean 25.9
sea 25.8
man 24.2
people 22.3
television 20.2
summer 19.3
vacation 18
relax 17.7
male 17.3
outdoor 15.3
outdoors 15.1
person 15.1
adult 14.9
sunset 14.4
telecommunication system 13.7
coast 13.5
sky 13.4
landscape 13.4
shore 13
fun 12.7
kin 12.3
walking 12.3
lifestyle 12.3
couple 12.2
leisure 11.6
silhouette 11.6
holiday 11.5
sun 11.3
travel 11.3
child 11.2
men 11.2
outside 11.1
tropical 11.1
women 10.3
love 10.3
family 9.8
walk 9.5
happy 9.4
sport 9.3
relaxation 9.2
lake 9.2
recreation 9
river 8.9
seashore 8.8
wave 8.6
happiness 8.6
two 8.5
black 8.4
portrait 8.4
attractive 8.4
alone 8.2
danger 8.2
dirty 8.1
active 8.1
color 7.8
sunny 7.7
adventure 7.6
tourist 7.6
coastline 7.5
free 7.5
waves 7.4
exercise 7.3
sexy 7.2
snow 7.1
sunlight 7.1
kid 7.1
country 7
season 7
together 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

drawing 96.2
sketch 96
person 93.3
text 90.1
gallery 88.1
old 87.3
man 83.8
window 81.4
clothing 78.9
room 78.3
posing 49.2
vintage 26.1
picture frame 20.9

Face analysis

Amazon

AWS Rekognition

Age 43-61
Gender Male, 50.4%
Fear 45.1%
Calm 45.2%
Angry 45%
Happy 45%
Confused 45%
Disgusted 45%
Sad 54.6%
Surprised 45%

AWS Rekognition

Age 26-42
Gender Male, 53.1%
Fear 45.1%
Surprised 45%
Confused 45%
Sad 54.8%
Disgusted 45%
Happy 45%
Calm 45%
Angry 45%

Feature analysis

Amazon

Person 96.9%
Painting 80.2%