Human Generated Data

Title

Untitled (Bleecker Street, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2887

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Clothing 100
Coat 100
Jacket 100
Adult 99.6
Male 99.6
Man 99.6
Person 99.6
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Face 97.6
Head 97.6
Photography 97.6
Portrait 97.6
Cap 96.7
Person 92.9
Hat 82.4
Body Part 72.8
Hand 72.8
Finger 70.3
Overcoat 58
Smoke 55.8
Electrical Device 55.6
Microphone 55.6
Gun 55.6
Weapon 55.6
Baseball Cap 55.4

Clarifai
created on 2018-05-10

people 99.9
adult 97
administration 95
man 93.4
leader 91.1
war 90.5
group together 90.5
one 90.2
three 89.6
group 89.3
retro 87.6
two 87.3
four 85.3
offense 84.2
portrait 83.1
military 82.6
several 82.6
woman 81.4
street 78.6
lid 76.4

Imagga
created on 2023-10-07

shop 40.7
man 39.6
old-timer 39.5
male 34
old 32.7
mercantile establishment 30.8
tobacco shop 28.2
senior 28.1
person 25.3
portrait 25.2
hat 24.5
face 23.4
place of business 20.5
black 17.4
grandfather 16.8
people 16.7
serious 16.2
adult 15.5
men 15.4
elderly 15.3
head 15.1
barroom 14.6
one 13.4
looking 12.8
mature 12.1
barbershop 11.3
hair 11.1
sculpture 10.8
vintage 10.7
hand 10.6
human 10.5
establishment 10.3
close 10.3
glasses 10.2
guy 10.1
work 9.4
culture 9.4
expression 9.4
art 9.2
beard 9.1
handsome 8.9
retired 8.7
building 8.7
couple 8.7
jacket 8.7
lifestyle 8.7
sign 8.3
happy 8.1
gray 8.1
worker 8.1
history 8
eye 8
job 8
business 7.9
architecture 7.8
antique 7.8
ancient 7.8
model 7.8
thoughtful 7.8
mysterious 7.8
western 7.7
retirement 7.7
money 7.6
casual 7.6
skin 7.6
statue 7.6
city 7.5
shirt 7.5
pensioner 7.4
closeup 7.4
occupation 7.3
smiling 7.2
dirty 7.2
home 7.2
smile 7.1
interior 7.1
cowboy 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.4
outdoor 98.9
newspaper 82.7

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 98.4%
Calm 86.6%
Surprised 7.3%
Fear 6.3%
Sad 3.9%
Angry 1.7%
Happy 1.5%
Confused 1.4%
Disgusted 1.3%

AWS Rekognition

Age 38-46
Gender Male, 100%
Angry 77.4%
Calm 9.4%
Surprised 7.5%
Fear 6.1%
Sad 3.8%
Disgusted 3.7%
Confused 1.9%
Happy 0.7%

AWS Rekognition

Age 13-21
Gender Male, 97.4%
Calm 84.2%
Sad 7.9%
Surprised 6.5%
Fear 6.1%
Confused 3.2%
Disgusted 1.3%
Angry 0.5%
Happy 0.2%

Microsoft Cognitive Services

Age 29
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.6%
Male 99.6%
Man 99.6%
Person 99.6%
Hat 82.4%

Text analysis

Amazon

OF
TABLES
R
CUTTING
ERS
FIXT
ICE FIXT
ICE
AND
S
S OF BLOC
BLOC
TIN
ERS AND 0
R CUTTING TIN -TABLE TABLES CIDE
-TABLE
0
CIDE

Google

CUTTING TABLE C ERSAOIC FIX
CUTTING
C
ERSAOIC
TABLE
FIX