Human Generated Data

Title

Untitled (San Diego)

Date

1978

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5138

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Human 99.9
Person 99.9
Person 98.6
Handrail 88.6
Banister 88.6
Outdoors 87
Garden 68
Building 67.4
Tree 63.5
Plant 63.5
Architecture 60.9
Porch 55.9
Person 45.6

Clarifai
created on 2019-11-15

people 99.7
monochrome 99.6
street 98.9
man 97.8
adult 97.4
woman 96.1
group together 95.7
black and white 95.2
girl 92.8
railway 91.3
two 91.2
one 91.2
locomotive 90.4
city 89.9
nature 89.8
vehicle 89.2
group 89.1
light 89
transportation system 88.8
portrait 88.3

Imagga
created on 2019-11-15

gymnastic apparatus 62.7
horizontal bar 61.5
sports equipment 47.3
barrier 44.5
obstruction 32.7
equipment 30.6
structure 23.3
people 22.3
outdoor 22.2
sport 20.5
outdoors 18.7
person 16.6
park 15.6
portrait 14.9
lifestyle 14.5
swing 14.3
fun 14.2
man 14.1
track 14.1
adult 13.6
black 13.2
action 13
active 12.6
summer 12.2
sky 12.1
male 12.1
outside 12
freedom 11.9
city 11.6
lady 11.4
happy 11.3
parallel bars 11.3
pretty 11.2
women 11.1
casual 11
leisure 10.8
activity 10.7
fashion 10.6
attractive 10.5
old 10.4
sexy 10.4
exercise 10
travel 9.9
human 9.7
boy 9.6
jeans 9.6
motion 9.4
mechanical device 9.4
life 9.4
cute 9.3
tree 9.2
vacation 9
recreation 9
healthy 8.8
urban 8.7
light 8.7
day 8.6
youth 8.5
plaything 8.4
teenager 8.2
posing 8
hair 7.9
forest 7.8
model 7.8
play 7.8
jumping 7.7
joy 7.5
fence 7.5
one 7.5
street 7.4
fitness 7.2
road 7.2
sunset 7.2
trees 7.1
cool 7.1
face 7.1
happiness 7.1
autumn 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

tree 99.9
text 97.8
outdoor 90.7
black and white 76.1
person 74.9
clothing 74.5
window 18.7
picture frame 13.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 14-26
Gender Female, 51.3%
Disgusted 45.3%
Sad 52.7%
Calm 46.1%
Fear 45.3%
Happy 45.1%
Confused 45.3%
Angry 45.2%
Surprised 45.1%

Feature analysis

Amazon

Person 99.9%

Categories

Imagga

paintings art 86.4%
nature landscape 11.7%

Captions