Human Generated Data

Title

Untitled (Webster Street, Between Geary and O'Farrell Streets, San Francisco)

Date

September 30, 1948

People

Artist: Minor White, American, 1908-1976

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous Loan, 3.1994.172

Copyright

© The Trustees of Princeton University

Machine Generated Data

Tags

Amazon
created on 2022-01-30

Person 99.5%
Human 99.5%
Clothing 92.4%
Apparel 92.4%
Person 82.5%
Sitting 64.5%
Door 62.2%
Floor 58.7%
Alloy Wheel 55.9%
Machine 55.9%
Wheel 55.9%
Spoke 55.9%
Furniture 55.4%
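
These labels are the kind of output produced by Amazon Rekognition's DetectLabels API. A minimal sketch in Python with boto3 (the image filename, region, and 55% threshold are assumptions; the threshold is chosen to match the lowest confidence listed above):

```python
# Minimal sketch: label detection with Amazon Rekognition via boto3.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("minor_white_untitled.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55.0,  # assumption: cutoff matching the list above
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}%")
```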

Clarifai
created on 2023-10-28

street 99.9%
people 99.8%
one 98.8%
monochrome 98.7%
man 96.7%
adult 95.7%
city 94.8%
pavement 93.6%
portrait 93.5%
newspaper 92.9%
sit 91.7%
fatigue 91.5%
wait 90.8%
woman 89.2%
wear 88.2%
dog 87.9%
child 86.4%
loneliness 82.8%
furniture 81.7%
sitting 81.6%
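
Clarifai's general image-recognition model returns concepts scored on a 0-1 scale. A minimal sketch against its v2 REST API (the API key, model ID, and image filename are assumptions):

```python
# Minimal sketch: concept tagging via Clarifai's v2 REST API.
import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # assumption: app-scoped key
MODEL_ID = "general-image-recognition"  # assumption: general model

with open("minor_white_untitled.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
response.raise_for_status()

# Concept values are 0-1; convert to percentages to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}%")
```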

Imagga
created on 2022-01-30

shoe shop 87.7%
shop 84.5%
mercantile establishment 63.2%
place of business 42.2%
building 25.1%
wall 21.4%
establishment 21.1%
door 20.2%
chair 19.9%
window 19.2%
architecture 18.7%
old 18.1%
city 17.5%
barbershop 16.8%
house 16.7%
barber chair 15.3%
people 14.5%
urban 14%
adult 12.9%
seat 12.3%
man 11.4%
casual 11%
black 10.8%
person 10.6%
travel 10.6%
concrete 10.5%
one 10.4%
stone 10.1%
shoe 9.7%
indoors 9.7%
home 9.6%
floor 9.3%
street 9.2%
dirty 9%
stairs 8.8%
ancient 8.6%
construction 8.6%
portrait 8.4%
town 8.3%
footwear 8.3%
furniture 8.3%
hairdresser 8.2%
business 7.9%
abandoned 7.8%
brick 7.5%
wood 7.5%
structure 7.4%
style 7.4%
male 7.1%
wooden 7%
modern 7%
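
Imagga's tagging endpoint uses HTTP basic auth with an API key/secret pair. A minimal sketch (credentials and the image filename are assumptions; /v2/tags also accepts an image_url parameter for images that are already online):

```python
# Minimal sketch: tagging via Imagga's /v2/tags endpoint.
import requests

auth = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")  # assumption

with open("minor_white_untitled.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=auth,
        files={"image": f},
    )
response.raise_for_status()

# Imagga reports confidences on a 0-100 scale.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}%")
```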

Google
created on 2022-01-30

(no labels recorded)

Microsoft
created on 2022-01-30

text 97%
clothing 91.6%
person 88.3%
footwear 78%
street 59.7%
black and white 58.2%
man 58%
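
Microsoft's tags look like output from the Azure Computer Vision tagging API. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK (the endpoint, key, and image filename are assumptions):

```python
# Minimal sketch: tagging with the Azure Computer Vision SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",  # assumption
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),        # assumption
)

with open("minor_white_untitled.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

# Azure confidences are 0-1; convert to percentages as above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}%")
```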

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 98%
Sad 33%
Fear 20%
Calm 17%
Confused 12.2%
Angry 6.9%
Surprised 4.7%
Disgusted 3.6%
Happy 2.6%
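
The age range, gender, and emotion scores above are the shape of output returned by Rekognition's DetectFaces API when all facial attributes are requested. A minimal sketch (image filename and region are assumptions):

```python
# Minimal sketch: face analysis with Rekognition's DetectFaces API.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("minor_white_untitled.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```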

Google Vision (face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision (face 3)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
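
Google Vision returns one annotation per detected face, expressing each attribute as a likelihood bucket (VERY_UNLIKELY through VERY_LIKELY) rather than a numeric score, which is presumably why three blocks appear above. A minimal sketch with the google-cloud-vision client (the image filename is an assumption):

```python
# Minimal sketch: face detection with the google-cloud-vision client.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("minor_white_untitled.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# One annotation per detected face; each attribute is a Likelihood enum.
for face in client.face_detection(image=image).face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```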

Feature analysis

Amazon

Person
Person 99.5%
Person 82.5%

Categories

Imagga

paintings art 97.7%
interior objects 1.2%
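
Imagga's category scores come from a separate categorization endpoint that takes a categorizer ID. A minimal sketch (the personal_photos categorizer is an assumption, chosen because its category set includes "paintings art" and "interior objects"; credentials and the image filename are assumptions as well):

```python
# Minimal sketch: categorization via Imagga's /v2/categories endpoint.
import requests

auth = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")  # assumption

with open("minor_white_untitled.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/categories/personal_photos",  # assumption
        auth=auth,
        files={"image": f},
    )
response.raise_for_status()

for category in response.json()["result"]["categories"]:
    print(f"{category['name']['en']} {category['confidence']:.1f}%")
```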

Text analysis

Amazon

crippled
get
195
Or get crippled
Or

Google

195 Or get crippled
195
Or
get
crippled
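
Both text readings can be reproduced with the services' text-detection calls. Rekognition's DetectText returns WORD and LINE detections as separate entries, while Google Vision's first annotation is the full detected string followed by the individual words, which explains the two orderings above. A minimal sketch (image filename and region are assumptions):

```python
# Minimal sketch: text detection with Amazon Rekognition and Google Vision.
import boto3
from google.cloud import vision

with open("minor_white_untitled.jpg", "rb") as f:
    image_bytes = f.read()

# Amazon Rekognition: words and line groupings arrive as separate entries.
rekognition = boto3.client("rekognition", region_name="us-east-1")
for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])

# Google Cloud Vision: the first annotation is the full string,
# followed by one annotation per word.
client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print(annotation.description)
```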