Human Generated Data

Title

Untitled (San Francisco)

Date

1949

People

Artist: Minor White, American, 1908 - 1976

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous Loan, 3.1994.31

Copyright

© The Trustees of Princeton University

Machine Generated Data

Tags

Amazon
created on 2022-01-30

Wheel 98.7
Machine 98.7
Wheel 91.5
Wheel 87.7
Text 87.4
Wheel 76.5
Outdoors 64.5
Page 61.8
Vehicle 60.2
Transportation 60.2
Soil 59.9
Chair 59.5
Furniture 59.5

Clarifai
created on 2023-10-29

no person 99.7
people 97.4
one 96.5
retro 95.6
paper 95.3
dirty 94.8
wear 94.1
war 92.1
indoors 92
art 90.8
worn 89.6
administration 89.5
monochrome 88.7
antique 86.6
two 86.4
man 86.3
military 85.3
adult 84.9
architecture 84.5
empty 84.1

Imagga
created on 2022-01-30

file 46.2
furniture 41.3
office furniture 35.5
old 32.7
paper 25.9
safe 25.7
vintage 24
grunge 21.3
antique 20.3
texture 20.1
box 18.6
book jacket 18.5
strongbox 18
retro 18
jacket 17.3
empty 16.3
frame 15
ancient 14.7
book 14.7
wall 14.5
aged 14.5
covering 14.4
design 13.5
blank 12.8
art 12.6
container 12.6
binder 12.3
brown 11.8
wood 11.7
interior 11.5
office 11.2
wrapping 10.9
rough 10.9
business 10.9
space 10.9
decoration 10.8
textured 10.5
home 10.4
cover 10.2
letter 10.1
border 9.9
black 9.6
grungy 9.5
protective covering 9.4
document 9.3
house 9.2
dirty 9
color 8.9
door 8.8
locker 8.8
device 8.7
room 8.3
open 8.1
material 8
decor 7.9
indoors 7.9
text 7.9
modern 7.7
worn 7.6
word 7.5
close 7.4
refrigerator 7.3
new 7.3
religion 7.2
history 7.1
fastener 7.1

Google
created on 2022-01-30

Wheel 85.2
Tree 83.7
Plant 83.5
Door 78.7
Font 76.9
Tints and shades 76.7
Rectangle 75.2
Wood 72.3
Handwriting 68.6
Tire 66.9
Paper product 66.1
Room 65.2
Paper 65
History 61.9
Visual arts 61.3
Motor vehicle 60.6
Illustration 59.8
Publication 55.6
Chair 54.1
Art 53.1

Microsoft
created on 2022-01-30

text 99.6
indoor 92.9

Feature analysis

Amazon

Wheel
Wheel 98.7%
Wheel 91.5%
Wheel 87.7%
Wheel 76.5%

Captions

Microsoft
created on 2022-01-30

a close up of a box 41%
close up of a box 34.9%
a close up of a door 34.8%

Text analysis

Amazon

have
the
apart,
at
where
that
for
When the
(Most
are
When
of
to
two
tones
least
small
42
two neighboring areas
areas
establish
tones of adjacent
neighboring
are for apart, to
adjacent
establish the tonal
a
picture.)
(Most photographs have at least a small section where the tones of
section
tonal
areas are unlike --
climax for that
photographs
unlike --
climax

Google

42 (Most photographe have at least a small section where the tones of two neighboring areas are fer apart, to establish the tonal climax" for that picture.) When the tones of ad jacent areas are unlike --
42
(Most
photographe
have
at
least
a
small
section
where
the
tones
of
two
neighboring
areas
are
fer
apart,
to
establish
tonal
climax"
for
that
picture.)
When
ad
jacent
unlike
--