Human Generated Data

Title

Untitled (Fillmore District, San Francisco)

Date

December 29, 1949

People

Artist: Minor White, American 1908 - 1976

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous Loan, 3.1994.148

Copyright

© The Trustees of Princeton University

Machine Generated Data

Tags

Amazon
created on 2022-01-30

Person 99.5
Human 99.5
Person 99.3
Clothing 82.4
Apparel 82.4
Building 71.3
Home Decor 68.2
Door 67.4
Architecture 65.2
Floor 61.7
Door 50.8

Clarifai
created on 2023-10-28

no person 99.6
one 99.2
people 98.7
business 97.1
woman 97.1
man 96.9
two 96.6
adult 96.3
outdoors 96
architecture 95.2
wear 95
retro 94.8
travel 94.6
paper 91.3
indoors 90.2
graphic design 89.3
opportunity 87.7
portrait 85.8
banking 85.8
writing 85.7

Imagga
created on 2022-01-30

cash machine 100
machine 92.9
device 70.2
telephone 33.8
pay-phone 30.2
old 25.1
wall 23.1
call 20.4
electronic equipment 19.8
building 18.5
equipment 17.5
architecture 15.6
office 14.9
door 13.5
house 13.4
empty 12.9
computer 12.8
vintage 12.4
retro 12.3
home 12
street 12
television 11.9
frame 11.6
city 11.6
interior 11.5
window 11.2
business 10.9
urban 10.5
blank 10.3
one 9.7
people 9.5
town 9.3
room 9.2
wood 9.2
ancient 8.6
grunge 8.5
person 8.5
box 8.5
brick 8.5
travel 8.4
laptop 8.2
technology 8.2
working 7.9
design 7.9
smile 7.8
antique 7.8
outdoor 7.6
texture 7.6
pump 7.6
phone 7.4
dirty 7.2
black 7.2
open 7.2
transportation 7.2
work 7.1

Google
created on 2022-01-30

Rectangle 88.2
Sleeve 82.1
Font 81.6
Tints and shades 73.5
Room 66
Paper product 62.7
Visual arts 61.6
Art 59.3
Paper 57.1
Pattern 56
History 55
Handwriting 50.9
Illustration 50.6

Microsoft
created on 2022-01-30

text 98.6
clothing 69.5
person 63

Face analysis

Amazon

AWS Rekognition

Age 50-58
Gender Male, 99.9%
Calm 46.1%
Surprised 18.6%
Angry 17.6%
Disgusted 10.4%
Sad 2.6%
Confused 2.1%
Happy 1.7%
Fear 0.9%

Feature analysis

Amazon

Person
Door
Person 99.5%

Captions

Microsoft
created on 2022-01-30

text 64.7%

Text analysis

Amazon

at
If
they
two
the
in
all
146
They
moment
he
of
They meet in the mind of the spectator st the moment he
and
not
If they did, where would they meet?
did,
meet
that
mind
where
BERSTADT
spectator
st
photographs
would
meet?
and not at all in accordance with facts.)
with
HAL BERSTADT
1509
mean
HAL
facts.)
same.
RENT
1511
recognizes that these two photographs ultimately mean the same.
recognizes
accordance
these
FOR
ultimately
RUROOM FOR
507
RUROOM
ON
WEITEMAN ON
WEITEMAN

Google

146 and not at all in acoordance with facte.) If they did, where would they meet? They meet in the mind of the spectator at the moment he recogni zes that these two photographs ultimetely mean the same. HALBERSTADT 507 1509
146
and
not
at
all
in
acoordance
with
facte.)
If
they
did,
where
would
meet?
They
meet
the
mind
of
spectator
moment
he
recogni
zes
that
these
two
photographs
ultimetely
mean
same.
HALBERSTADT
507
1509