Human Generated Data

Title

Untitled (Oakland)

Date

1979

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5202

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Door 99.9
Human 99.5
Person 99.5
Person 99.5
Person 99
Folding Door 94.6
Sliding Door 87.8
Outdoors 79.1
Clothing 66
Apparel 66
Patio 59.3

Clarifai
created on 2019-11-15

people 99.7
street 99.4
monochrome 98.7
man 96.7
adult 95.7
woman 94.7
group together 93.5
black and white 92.7
group 92.6
wedding 91.5
road 87.7
city 86.9
child 85.1
shadow 84.4
two 84.2
administration 83.2
many 82.2
transportation system 81.9
analogue 81.1
one 80.5

Imagga
created on 2019-11-15

sidewalk 23.2
sliding door 22.5
city 21.6
door 21.1
silhouette 20.7
people 20.1
urban 19.2
man 18.1
dark 17.5
light 16.7
street 15.6
walking 15.2
walk 14.3
travel 14.1
sun 13.7
movable barrier 13.6
passenger 12.9
black 12.6
sunset 12.6
window 12.4
swing 11.9
world 11.8
transportation 11.7
adult 11.6
person 11
architecture 10.9
sky 10.8
tourism 10.7
couple 10.4
plaything 10.4
barrier 10.3
child 10.2
water 10
mechanical device 10
building 9.9
night 9.8
old 9.8
scene 9.5
evening 9.3
transport 9.1
outdoors 9.1
dawn 8.7
male 8.6
dusk 8.6
wall 8.5
business 8.5
beach 8.4
seller 8.4
landscape 8.2
road 8.1
men 7.7
journey 7.5
ocean 7.5
mechanism 7.4
pedestrian 7.4
alone 7.3
tourist 7.2
dirty 7.2
summer 7.1

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

street 99.4
text 98.1
black and white 94.1
clothing 88.9
person 85.5
city 85.5
tree 83.8
monochrome 82.6
footwear 66.7
people 65.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 30-46
Gender Female, 50.2%
Angry 49.5%
Disgusted 49.5%
Surprised 49.5%
Confused 49.5%
Fear 49.5%
Calm 49.5%
Sad 50.4%
Happy 49.5%

Feature analysis

Amazon

Person 99.5%

Categories