Human Generated Data

Title

Untitled (New York City)

Date

1932–1934

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3021

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Adult 99
Male 99
Man 99
Person 99
Adult 98.7
Male 98.7
Man 98.7
Person 98.7
Person 98.6
Person 98.6
Adult 98.1
Male 98.1
Man 98.1
Person 98.1
Person 95.5
Person 94.6
Face 92
Head 92
Outdoors 89.5
Person 87.6
Smoke 87.4
Cap 86.7
Person 84.3
War 81.5
Nature 78.8
Overcoat 78.4
Coat 72.6
City 70.7
Person 70.2
Hat 69.2
Hat 64.5
Snow 63.4
Fog 57.3
Smog 57.3
Weather 57.3
Road 56.4
Street 56.4
Urban 56.4
Jacket 55.5
Pollution 55
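
These labels and confidence scores have the shape of an Amazon Rekognition label-detection response. A minimal sketch of how such tags could be generated, assuming Python with boto3, configured AWS credentials, and a hypothetical local filename standing in for the museum's image:

    import boto3

    client = boto3.client("rekognition")

    # Hypothetical filename; stands in for the museum's image file.
    with open("untitled_new_york_city.jpg", "rb") as f:
        image_bytes = f.read()

    # MinConfidence=55 mirrors the lowest score in the list above.
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=55,
    )

    # Emit "Name Confidence" pairs in the same shape as the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")

The repeated Person, Adult, Male, and Man rows are consistent with Rekognition reporting a separate confidence for each detected figure: the API returns each label once with a list of per-instance detections, which the flat list above unrolls.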

Clarifai
created on 2018-05-10

people 99.8
group 97.4
adult 96.5
war 96
many 94.8
military 94.7
group together 94.3
administration 93.6
man 92.8
soldier 89.3
wear 86.5
uniform 85.4
woman 84.9
child 82.9
outfit 82.3
offense 81.6
police 81.4
weapon 80.4
skirmish 80
street 76.9

Imagga
created on 2023-10-06

newspaper 31.9
daily 24.6
product 24.3
old 20.2
man 19.5
creation 19
shop 17.3
male 14.9
vintage 14
person 13.6
city 13.3
people 12.8
business 12.7
history 12.5
architecture 11.7
art 11.7
building 10.8
mercantile establishment 10.8
ancient 10.4
fire 9.4
finance 9.3
portrait 9
black 9
light 8.7
stall 8.7
antique 8.6
sign 8.3
historic 8.2
financial 8
work 7.8
construction 7.7
wall 7.7
industry 7.7
grunge 7.7
money 7.6
god 7.6
safety 7.4
dirty 7.2
place of business 7.2
religion 7.2
working 7.1

Microsoft
created on 2018-05-10

person 99.9
people 95.9
outdoor 95.3
group 94.3
standing 85.4
crowd 1.3

Face analysis

AWS Rekognition

Age 14-22
Gender Male, 98.8%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 1.6%
Confused 1.6%
Angry 0.3%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 13-21
Gender Male, 98.2%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 5.4%
Angry 0.3%
Confused 0.2%
Happy 0.2%
Disgusted 0.1%

AWS Rekognition

Age 13-21
Gender Female, 62.6%
Sad 43.3%
Calm 41%
Fear 10.6%
Surprised 9.4%
Angry 6.4%
Happy 5.4%
Disgusted 4.4%
Confused 3.2%
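
The three blocks above have the shape of Amazon Rekognition face-detection output: one block per detected face, each with an estimated age range, a gender guess with confidence, and independently scored emotions (which is why the percentages do not sum to 100). A minimal sketch, reusing the boto3 client and image bytes from the label-detection sketch above:

    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions are scored per face; print highest confidence first.
        for emotion in sorted(face["Emotions"],
                              key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")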

Microsoft Cognitive Services

Age 34
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
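
Unlike Rekognition, Google Vision reports face attributes as likelihood buckets rather than percentages, which is why this block reads "Very unlikely" through "Possible". A minimal sketch, assuming Python with the google-cloud-vision client library, credentials configured via GOOGLE_APPLICATION_CREDENTIALS, and the same hypothetical filename as above:

    from google.cloud import vision

    gcv_client = vision.ImageAnnotatorClient()

    with open("untitled_new_york_city.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = gcv_client.face_detection(image=image)

    # The Likelihood enum's six values map onto the buckets shown above.
    buckets = ("Unknown", "Very unlikely", "Unlikely",
               "Possible", "Likely", "Very likely")

    for face in response.face_annotations:
        print("Surprise", buckets[face.surprise_likelihood])
        print("Anger", buckets[face.anger_likelihood])
        print("Sorrow", buckets[face.sorrow_likelihood])
        print("Joy", buckets[face.joy_likelihood])
        print("Headwear", buckets[face.headwear_likelihood])
        print("Blurred", buckets[face.blurred_likelihood])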

Feature analysis

Amazon

Adult 99%
Male 99%
Man 99%
Person 99%
Coat 72.6%
Hat 69.2%

Text analysis

Amazon

12
RATE
STORE
CUT RATE DRUG STORE
CUT
DA
DRUG
LUNCHETTE
AUSTIN'S
SALE
EON
35
STERY
IS EON 35 12 SHOA SALE
CUT RATE
DA TUNCHLONETTE CUT RATE BACK STORE
TUNCHLONETTE
SPEAR A D
SPAZE
ALL
il
IT SPAZE
SHOA
the
BACK
HIS
IT
IS
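
The mix of whole phrases ("CUT RATE DRUG STORE") and single tokens above is characteristic of Rekognition text detection, which returns both LINE and WORD results for the same image; garbled entries such as "TUNCHLONETTE" are likely OCR misreads of the storefront lettering. A minimal sketch, reusing the boto3 client and image bytes from the sketches above:

    response = client.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        # Type is either "LINE" or "WORD"; the flat list above mixes both.
        print(detection["Type"], detection["DetectedText"])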

Google

SPEAR UT RATE DRIUG STORE IU İLUNCHETTE
SPEAR
UT
RATE
DRIUG
STORE
IU
İLUNCHETTE