Human Generated Data

Title

Untitled (Bleecker Street, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2832

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Person 99.8
Alcohol 98.5
Beer 98.5
Beverage 98.5
Tin 97.4
Face 94.7
Head 94.7
Photography 88.1
Portrait 88.1
Shelf 74
Can 72.9
Can 66.6
Architecture 57.6
Building 57.6
Factory 57.6
Pub 56.9
Closet 55
Cupboard 55
Furniture 55
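
The labels above match the shape of Amazon Rekognition's DetectLabels output (label name plus confidence). As a hedged sketch only, not the museum's actual pipeline, and with a hypothetical image file, region, and confidence cutoff, comparable tags could be generated with boto3 like this:

    import boto3

    # Hypothetical client setup and image path; credentials are assumed to be configured.
    rekognition = boto3.client("rekognition", region_name="us-east-1")
    with open("bleecker_street.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # assumed cutoff, roughly matching the lowest scores listed above
        )

    # Print each label with its confidence, e.g. "Person 99.8"
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")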

Clarifai
created on 2018-05-10

people 99.7
adult 97.7
one 95.6
man 95.5
indoors 92.3
portrait 90
education 89.5
sit 89
woman 88
child 83.5
display 78.8
group 76.8
room 76.7
classroom 76.2
science 75.6
administration 75.6
school 74.5
wear 70.9
teacher 68.3
monochrome 68.3

Imagga
created on 2023-10-07

call 34.9
telephone 29.1
equipment 23.9
technology 22.2
electronic equipment 22.1
man 21.5
person 19.6
male 19.1
people 17.3
computer 16.1
looking 16
dial telephone 15.3
device 13.6
business 13.4
black 13.2
adult 13
child 12.9
home 12.8
interior 12.4
office 12.1
happy 11.9
old 11.8
working 11.5
indoors 11.4
close 11.4
cellular telephone 11.4
work 11
smiling 10.8
one 10.4
television 10.4
elevator 10.2
communication 10.1
hand 9.9
look 9.6
dial 9.2
radiotelephone 9.1
button 8.8
lifestyle 8.7
cute 8.6
sitting 8.6
smile 8.5
vintage 8.5
face 8.5
laptop 8.5
portrait 8.4
attractive 8.4
room 8.3
phone 8.3
human 8.2
pay-phone 8.2
indoor 8.2
lifting device 8.2
music 8.1
shop 8
job 8
control 7.6
finance 7.6
closeup 7.4
retro 7.4
alone 7.3
digital 7.3
bartender 7.2
handsome 7.1
to 7.1
businessman 7.1
happiness 7
medicine 7
modern 7

Microsoft
created on 2018-05-10

person 99.1
man 95.4
window 92.7
black 68.4
old 57.7

Face analysis

AWS Rekognition

Age 41-49
Gender Female, 98.9%
Angry 53.4%
Calm 38.2%
Surprised 6.8%
Fear 6%
Confused 3.4%
Sad 3%
Disgusted 0.7%
Happy 0.2%
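
The age range, gender, and emotion percentages above follow the structure of Amazon Rekognition's DetectFaces response. A minimal sketch under the same assumptions as the label example (hypothetical image file and configured credentials):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")
    with open("bleecker_street.jpg", "rb") as f:  # hypothetical file name
        response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions are returned unsorted; sort by confidence to match the listing above
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")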

Microsoft Cognitive Services

Age 38
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
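
The ratings above ("Very unlikely", "Unlikely", and so on) are how the Google Cloud Vision face-detection API reports emotions and image properties, as likelihood enums rather than percentages. A hedged sketch using the google-cloud-vision client, with the same hypothetical image file and assuming application default credentials:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("bleecker_street.jpg", "rb") as f:  # hypothetical file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each attribute is a Likelihood enum value such as VERY_UNLIKELY or UNLIKELY
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)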

Feature analysis

Amazon

Person 99.8%
Can 72.9%

Categories

Imagga

people portraits 99.4%

Text analysis

Amazon

OLIO
La Preferita
LaPreferita
P
CONVER
INVE
0
you
CONVER INVE BY
0 PURP DO you OLIO D'IDUZIO D'OLIVE
DO
PURP
succa
OLIVAI
oil
succa GLIVE oil
D'OLIVE
GLIVE
Preferita
OLIVAI VENTO
BY
VENTO
HOTEL
NOTA
DIA
HOTEL DATE DIA
ITALIA
DATE
SASTALE
D'IDUZIO
asics
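
The fragments above are raw OCR detections (words and partial lines from the olive-oil tins and signage visible in the photograph), in the form returned by Amazon Rekognition's DetectText operation. A minimal sketch under the same assumptions as the earlier examples:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")
    with open("bleecker_street.jpg", "rb") as f:  # hypothetical file name
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Rekognition returns both LINE and WORD detections; print each detected string
    for detection in response["TextDetections"]:
        print(detection["DetectedText"], f"({detection['Type']})")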