Human Generated Data

Title

Untitled (North Calif.)

Date

1980

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5211

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-18

Human 99.7
Person 99.7
Person 99.5
Wine 97.9
Alcohol 97.9
Beverage 97.9
Drink 97.9
Bottle 93.4
Liquor 73.3
Bar Counter 73.3
Pub 73.3
Glass 72.4
Home Decor 71.3
Wine Bottle 68.7
Linen 60.9
Tablecloth 59.3
Red Wine 58.1
Furniture 58.1
Table 58.1
Wine Glass 56.5
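The Amazon tag list above has the shape of an AWS Rekognition `DetectLabels` response flattened into name/confidence pairs. A minimal sketch of that flattening, using an illustrative response fragment rather than a live API call (the field names `Labels`, `Name`, and `Confidence` follow the Rekognition response format; the sample values mirror a few tags above):

```python
def flatten_labels(response, min_confidence=50.0):
    """Flatten a DetectLabels-style response into (name, confidence) pairs,
    rounded to one decimal and sorted by descending confidence."""
    pairs = [
        (label["Name"], round(label["Confidence"], 1))
        for label in response.get("Labels", [])
        if label["Confidence"] >= min_confidence
    ]
    return sorted(pairs, key=lambda p: -p[1])

# Illustrative fragment mirroring a few of the tags listed above.
sample = {"Labels": [
    {"Name": "Person", "Confidence": 99.74},
    {"Name": "Wine", "Confidence": 97.91},
    {"Name": "Liquor", "Confidence": 73.3},
]}
print(flatten_labels(sample))
```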

Clarifai
created on 2019-11-18

people 99.6
monochrome 99.1
group together 96.9
adult 96.2
group 96.1
street 95.9
man 95.7
woman 89.5
many 87
two 86.4
child 85.5
black and white 85.1
one 84.9
vehicle 83.8
war 80.4
chair 79.8
tree 79.6
boy 79.2
nature 79.2
furniture 78

Imagga
created on 2019-11-18

cemetery 31.3
bench 27.8
park 24.7
snow 21.1
park bench 20.1
old 19.5
wheeled vehicle 18.5
winter 17.9
man 16.8
landscape 15.6
tree 15
black 14.4
outdoors 14.2
city 14.1
vehicle 13.9
skateboard 13.9
building 13.8
cold 13.8
trees 13.3
street 12.9
love 12.6
seat 12.6
people 12.3
outdoor 12.2
fence 12.2
couple 12.2
male 12.1
garden 11.6
board 11.1
forest 10.4
structure 10.4
musical instrument 10.4
water 10
grunge 9.4
light 9.4
season 9.4
person 9.3
house 9.2
silhouette 9.1
conveyance 8.8
snowy 8.8
architecture 8.6
stone 8.5
sky 8.3
vintage 8.3
lake 8.2
memorial 8.2
life 8
picket fence 7.9
urban 7.9
day 7.8
scene 7.8
antique 7.8
frost 7.7
furniture 7.6
wood 7.5
monument 7.5
ice 7.4
adult 7.2
fall 7.2
art 7.2
portrait 7.1
night 7.1
together 7

Google
created on 2019-11-18

Microsoft
created on 2019-11-18

tree 99.8
outdoor 97.9
text 84.8
black and white 84.7
person 83.2
grave 79
cemetery 68.5
clothing 60.7
old 57.4
funeral 54.1
man 52.2
posing 44.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 42-60
Gender Male, 51.8%
Happy 45%
Angry 45.1%
Disgusted 45%
Surprised 45%
Sad 54.6%
Fear 45.3%
Calm 45%
Confused 45%
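The emotion percentages above follow the shape of a Rekognition `DetectFaces` face detail, where each emotion type carries its own confidence score. A small sketch of selecting the dominant emotion, again with illustrative local data rather than a live API call:

```python
def top_emotion(face_detail):
    """Return the (type, confidence) of the most confident emotion
    in a DetectFaces-style FaceDetail dict."""
    emotions = face_detail.get("Emotions", [])
    best = max(emotions, key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

# Illustrative FaceDetail fragment echoing the scores listed above.
face = {"Emotions": [
    {"Type": "SAD", "Confidence": 54.6},
    {"Type": "FEAR", "Confidence": 45.3},
    {"Type": "CALM", "Confidence": 45.0},
]}
print(top_emotion(face))
```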

Feature analysis

Amazon

Person 99.7%

Categories

Imagga

paintings art 97.8%

Captions

Microsoft
created on 2019-11-18

a vintage photo of a man 94.8%
a black and white photo of a man 92.2%
an old photo of a man 92.1%
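The Microsoft captions above are the kind of ranked candidates returned in the `description.captions` field of an Azure Computer Vision describe/analyze response, where each caption carries a confidence between 0 and 1. A hedged sketch of picking the best caption from an illustrative response fragment (not a live API call):

```python
def best_caption(describe_response):
    """Return the highest-confidence caption text and its score as a
    percentage, from an Azure-style description.captions list."""
    captions = describe_response.get("description", {}).get("captions", [])
    best = max(captions, key=lambda c: c["confidence"])
    return best["text"], round(best["confidence"] * 100, 1)

# Illustrative fragment mirroring the captions listed above.
sample = {"description": {"captions": [
    {"text": "a vintage photo of a man", "confidence": 0.948},
    {"text": "a black and white photo of a man", "confidence": 0.922},
]}}
print(best_caption(sample))
```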