Human Generated Data

Title

Untitled (Marine World)

Date

1979

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5182

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Person 99.5
Human 99.5
Person 99.3
Apparel 97.4
Clothing 97.4
Outdoors 85.6
Coat 62.4
Garden 57.7

Clarifai
created on 2019-11-15

people 99.7
man 98.2
street 97.2
adult 96.1
monochrome 95
woman 93.7
child 93.6
group 92.9
two 90.3
couple 89.1
group together 88.4
boy 87.7
walk 86.8
recreation 85.8
girl 83
love 79.4
one 78.7
actor 78.2
shadow 78.2
portrait 76.2

Imagga
created on 2019-11-15

man 27.5
person 25
people 23.4
male 19.4
outdoors 18.6
silhouette 17.4
outdoor 16.8
park 16.5
adult 16.3
winter 16.2
snow 15.6
black 15.3
walking 13.3
shovel 12.8
human 12.7
clothing 12.7
one 11.9
landscape 11.9
alone 11.9
tree 11.5
walk 11.4
forest 11.3
cold 11.2
sport 10.8
sunset 10.8
device 10.5
megaphone 10.5
portrait 10.3
business 10.3
life 9.9
trees 9.8
sky 9.6
women 9.5
adventure 9.5
sitting 9.4
umbrella 9.2
scholar 9.1
mountain 8.9
businessman 8.8
couple 8.7
love 8.7
lonely 8.7
lifestyle 8.7
solitude 8.7
happiness 8.6
men 8.6
acoustic device 8.5
serene 8.5
lady 8.1
suit 8.1
active 8.1
sun 8
hat 8
autumn 7.9
together 7.9
work 7.8
travel 7.7
tool 7.7
hiking 7.7
old 7.7
hand tool 7.6
happy 7.5
fun 7.5
single 7.4
vacation 7.4
water 7.3
peace 7.3
intellectual 7.2
building 7.2
worker 7.2
backpack 7.2
activity 7.2
weather 7.1
covering 7.1

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

tree 99.9
text 98.9
outdoor 98.6
clothing 96.6
person 91.1
footwear 89.9
black and white 73
street 70.5
man 66.4
posing 38.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Male, 54.3%
Sad 45.2%
Angry 45.8%
Calm 53.6%
Happy 45%
Surprised 45.3%
Confused 45.1%
Disgusted 45%
Fear 45.1%

Feature analysis

Amazon

Person 99.5%

Captions