
Human Generated Data

Title

Untitled (Montgomery Street, San Francisco)

Date

July 21, 1950

People

Artist: Minor White, American, 1908–1976

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous Loan, 3.1994.201

Copyright

© The Trustees of Princeton University

Machine Generated Data

Tags

Amazon
created on 2022-01-30

Person 99.8
Human 99.8
Person 99.4
Clothing 99.2
Apparel 99.2
Collage 99.2
Poster 99.2
Advertisement 99.2
Person 97.4
Person 97.4
Face 82.3
Suit 81.1
Overcoat 81.1
Coat 81.1
Female 59.7
Man 58.4
Shirt 57.9
Bag 57.2
Head 56.5
Flyer 56.4
Paper 56.4
Brochure 56.4
Person 53.5

Clarifai
created on 2023-10-27

people 99.9
two 99.1
adult 98.2
wear 98
woman 97.9
man 97.7
street 97.5
one 96.4
portrait 95.2
doorway 95.1
group 95.1
administration 92.9
leader 92.6
room 91.3
facial expression 89.8
child 89.7
three 89.6
monochrome 89.5
outerwear 88.9
actress 84.5

Imagga
created on 2022-01-30

locker 32.1
telephone 28.1
fastener 26.5
pay-phone 26
wall 23.9
device 23.8
old 23.7
architecture 21.1
door 20.1
restraint 19.9
window 19.6
call 19.4
building 18.6
electronic equipment 18.4
house 17.5
cell 17.2
equipment 16.9
vintage 14.9
home 12.8
city 12.5
interior 11.5
ancient 11.2
medicine chest 11.1
texture 11.1
art 10.4
brick 10.4
structure 10.2
shop 10
religion 9.9
history 9.8
furniture 9.8
room 9.7
product 9.6
pattern 9.6
glass 9.3
town 9.3
historic 9.2
sign 9
cabinet 8.9
metal 8.9
urban 8.7
antique 8.7
business 8.5
religious 8.4
elevator 8.2
retro 8.2
aged 8.1
design 7.9
black 7.8
windows 7.7
newspaper 7.5
frame 7.5
close 7.4
symbol 7.4
security 7.3
rough 7.3
paper 7.1
travel 7
indoors 7
facade 7

Google
created on 2022-01-30

Microsoft
created on 2022-01-30

clothing 98.7
person 98.1
text 97.3
indoor 91.9
man 91.8
poster 87.2
drawing 82
gallery 73.6
human face 64.8
room 55

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 99.9%
Calm 49%
Sad 17.7%
Disgusted 8.6%
Surprised 7.6%
Happy 6.3%
Fear 6%
Angry 2.8%
Confused 2%

Feature analysis

Amazon

Person
Person 99.8%

Text analysis

Amazon

222
HOP
ANCIAL
MOKE
BACCO
UORS
&

Google

222 ANCIAL MOKE HOP UORS
222
ANCIAL
MOKE
HOP
UORS