Human Generated Data

Title

Untitled (Bay Area)

Date

1980

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5225

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Human 99.3
Person 99.3
Clothing 99.1
Apparel 99.1
Hat 98.9
Person 82.6
Text 66.3
Person 64.8
Overcoat 61
Coat 61
Worker 55.9
Sleeve 55.3

Clarifai
created on 2019-11-15

people 99.8
adult 97.3
one 97.3
man 95.7
veil 95.6
wear 94.5
two 93.9
lid 93.8
elderly 93
woman 92.2
group 89.5
merchant 88.3
street 86.8
commerce 85
newspaper 78.8
furniture 78.5
group together 77.8
administration 77.8
monochrome 77.5
vehicle 77

Imagga
created on 2019-11-15

shop 40
barbershop 38.8
man 30.9
hairdresser 30.6
mercantile establishment 27.3
people 26.8
newspaper 24.6
male 22.7
person 22
adult 20.8
business 19.4
place of business 18.2
product 17.4
men 17.2
lifestyle 16.6
work 16.5
worker 15.7
happy 15.7
job 15
city 15
indoors 14.9
urban 14.9
building 14.6
creation 14.4
home 14.4
working 14.1
portrait 13.6
office 13.6
casual 13.6
smile 13.5
professional 12.9
salon 12.5
fashion 12.1
attractive 11.9
businessman 11.5
modern 11.2
industry 11.1
two 11
occupation 11
equipment 11
room 10.9
interior 10.6
clothing 10.5
pretty 10.5
computer 10.4
black 10.2
phone 10.1
safety 10.1
smiling 10.1
inside 10.1
one 9.7
technology 9.6
chair 9.6
women 9.5
house 9.2
establishment 9.1
industrial 9.1
mall 8.8
brunette 8.7
helmet 8.7
customer 8.6
clothes 8.4
life 8
happiness 7.8
architecture 7.8
corporate 7.7
youth 7.7
communication 7.6
barber chair 7.4
20s 7.3
cheerful 7.3
alone 7.3
indoor 7.3
new 7.3
group 7.3
looking 7.2
box 7.2
center 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

text 99.9
person 99.3
black and white 95.3
street 93.4
man 93.4
newspaper 89.8
clothing 88.4
book 81.3
monochrome 74.1
old 63.2
piano 56
poster 50.4

Feature analysis

Amazon

Person 99.3%
Hat 98.9%

Captions

Microsoft

an old photo of a man 80.4%
old photo of a man 77.8%
a man holding a gun 55%

Text analysis

Amazon

MARSH
RUBY
CE
2EFRUIT
OF
OF U.SA
U.SA
CLEAN
TUMBLE

Google

FFRUIT
TUBCLEA CE OF USA MARSH RUBY FFRUIT
TUBCLEA
OF
RUBY
MARSH
CE
USA