Human Generated Data

Title

Untitled (L. A.)

Date

1982

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5229

Copyright

© Bill Dane

Machine Generated Data

Tags (each tag is followed by a confidence score, 0-100)

Amazon
created on 2019-11-15

Person 99.5
Human 99.5
Restaurant 81.1
Indoors 65.7
Food 64
Meal 64
Cafeteria 63.4
Home Decor 58.3
Shelf 57.2
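
These labels match the output format of Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags could be generated with boto3 follows; the local file name is hypothetical, and AWS credentials are assumed to be configured in the environment.

import boto3

# Create a Rekognition client (credentials/region come from the boto3 environment).
client = boto3.client("rekognition")

# Read the photograph; "untitled_la_1982.jpg" is a hypothetical local file name.
with open("untitled_la_1982.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence scores (0-100),
# matching the "Person 99.5" style of the list above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')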

Clarifai
created on 2019-11-15

people 99.5
room 97.9
adult 96.7
furniture 96.2
indoors 95.2
woman 94.5
window 94
monochrome 93.4
group 92.4
wear 88.1
man 87.9
one 86.4
two 85.4
chair 83
home 81.3
child 78.8
administration 78.5
table 78.4
family 78
desk 76.5
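
Concept tags like these can be produced by Clarifai's general-purpose image recognition model. A hedged sketch against the Clarifai v2 REST API follows; the API key, image URL, and model id are assumptions for illustration, not values from this record.

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"                       # hypothetical credential
MODEL_ID = "general-image-recognition"                  # assumed public general model id
IMAGE_URL = "https://example.org/untitled_la_1982.jpg"  # hypothetical image URL

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Clarifai reports concept values on a 0-1 scale; multiplying by 100
# gives the percentage-style scores shown above ("people 99.5").
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')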

Imagga
created on 2019-11-15

room 38.6
classroom 30.5
business 29.2
man 28.9
people 26.2
businessman 24.7
office 23.7
male 22.8
person 22.4
adult 22
interior 19.5
indoors 19.3
laptop 19.1
building 18.8
computer 18.4
indoor 18.3
women 18.2
corporate 18
modern 17.5
businesswoman 16.4
meeting 16
smiling 15.2
work 14.9
sitting 14.6
lifestyle 14.5
professional 13.6
screen 13.3
working 13.3
executive 13.1
group 12.9
men 12.9
window 12.5
job 12.4
businesspeople 11.4
standing 11.3
kitchen 11.3
happy 11.3
communication 10.9
table 10.8
team 10.8
chair 10.7
urban 10.5
home 10.4
smile 10
suit 9.9
attractive 9.8
cheerful 9.8
looking 9.6
notebook 9.3
casual 9.3
student 9.3
teamwork 9.3
coffee 9.3
house 9.2
pretty 9.1
portrait 9.1
fashion 9
musical instrument 8.9
together 8.8
worker 8.6
technology 8.2
style 8.2
couple 7.8
window shade 7.8
covering 7.7
two 7.6
teacher 7.6
career 7.6
city 7.5
manager 7.4
holding 7.4
inside 7.4
black 7.3
alone 7.3
desk 7.2
color 7.2
windowsill 7.2
hall 7.2
life 7.2
happiness 7.1
architecture 7
glass 7
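
Scores of this kind come from Imagga's tagging endpoint. A minimal sketch, assuming an Imagga key/secret pair and a publicly reachable image URL (all hypothetical placeholders):

import requests

API_KEY = "YOUR_IMAGGA_API_KEY"                         # hypothetical credentials
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/untitled_la_1982.jpg"  # hypothetical image URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Imagga returns tags with confidence scores on a 0-100 scale,
# matching the "room 38.6" style of the list above.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')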

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

window 94.5
black and white 93.3
indoor 90.7
person 90.1
clothing 88.5
monochrome 85.5
furniture 80.3
woman 75.7
text 74.7
table 69.6

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a person standing in front of a window 94.3%
a person standing in front of a window 89.1%
a person standing next to a window 89%
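
The Microsoft tags and candidate captions above match the shape of Azure Computer Vision's Tag Image and Describe Image operations. A minimal sketch with the Python SDK; the endpoint, key, and image URL are hypothetical placeholders.

from msrest.authentication import CognitiveServicesCredentials
from azure.cognitiveservices.vision.computervision import ComputerVisionClient

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com/"  # hypothetical
KEY = "YOUR_AZURE_KEY"                                           # hypothetical
IMAGE_URL = "https://example.org/untitled_la_1982.jpg"           # hypothetical

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Tags come back with 0-1 confidences; scaled to percent they match
# the "window 94.5" style of the Microsoft list above.
for tag in client.tag_image(IMAGE_URL).tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")

# Describe Image returns ranked candidate captions with confidences,
# matching the Captions section above.
for caption in client.describe_image(IMAGE_URL, max_candidates=3).captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")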

Text analysis

Amazon

Baco
7
B2
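
The fragments above are OCR output; Amazon Rekognition's DetectText operation returns short words and lines like these from signage in a photograph. A minimal sketch, again assuming a hypothetical local file name:

import boto3

client = boto3.client("rekognition")

# "untitled_la_1982.jpg" is a hypothetical local file name.
with open("untitled_la_1982.jpg", "rb") as f:
    image_bytes = f.read()

# DetectText returns the words and lines found in the image; short
# fragments such as "Baco", "7", and "B2" are typical of partially
# legible signage in a street photograph.
response = client.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])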