Human Generated Data

Title

Untitled (Bowery, New York City)

Date

April 1936

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3023

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Advertisement 97.1
Human 93.2
Person 93.2
Paper 90.7
Brochure 90.7
Flyer 90.7
Person 90.4
Apparel 76.1
Clothing 76.1
Suit 76
Coat 76
Overcoat 76
Person 65.5
Word 65.1
Military 62
Military Uniform 62
Officer 62
Billboard 56.6
Poster 51.4
Person 47

Clarifai
created on 2018-03-23

people 99.9
adult 98.6
group 98.5
group together 97.7
war 97.1
military 95.8
one 95.5
administration 95.2
man 94.4
portrait 94.2
wear 93.1
police 92.8
two 92.3
street 92.3
vehicle 92
several 91.4
outfit 91.3
offense 90.5
woman 90.4
actress 90.1

Imagga
created on 2018-03-23

blackboard 23.2
grunge 23
book jacket 21.2
man 19.5
comic book 19.1
jacket 17.6
old 16.7
person 16.4
vintage 15.8
business 15.8
male 15.6
art 15.4
black 14.2
people 13.9
dirty 13.6
wrapping 13.5
currency 13.5
silhouette 13.2
slick 13.2
money 11.9
covering 11.6
businessman 11.5
success 11.3
education 11.3
antique 11.2
texture 11.1
stall 10.7
design 10.7
retro 10.6
portrait 10.4
newspaper 10.3
finance 10.1
symbol 10.1
financial 9.8
stamp 9.7
diagram 9.6
graphic 9.5
poster 9.4
product 9.3
letter 9.2
aged 9
pattern 8.9
postmark 8.9
ancient 8.6
mail 8.6
sketch 8.6
face 8.5
dollar 8.4
world 8.3
sign 8.3
human 8.3
cash 8.2
style 8.2
text 7.9
teaching 7.8
men 7.7
suit 7.7
print media 7.7
card 7.7
chart 7.6
damaged 7.6
bill 7.6
hand 7.6
city 7.5
one 7.5
economy 7.4
student 7.4
religion 7.2
creation 7.1
idea 7.1
market 7.1

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

sign 99.6
text 98.2
person 88.5
outdoor 88.4
standing 79.5
old 78.3
black 78.1
posing 37.3
sale 11.6

Face analysis

AWS Rekognition

Age 35-52
Gender Male, 98.3%
Happy 0.2%
Disgusted 0.6%
Angry 1.3%
Surprised 0.9%
Sad 1%
Calm 94.5%
Confused 1.5%

Microsoft Cognitive Services

Age 55
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 93.2%
Suit 76%
Poster 51.4%

Captions

Microsoft

a man standing in front of a white sign with black text 90.6%
a group of people standing in front of a sign 90.5%
a black sign with white text above a photo of a man 90.4%

Text analysis

Amazon

BOUGHT
NG
SOLD
50
US
ND
50 50 50
FRANKS
MEAT
HASH
HERRING
OND
BEEE
AND
9
C
IRN
OND w C NG BOUGHT AND
BERNS
OR
8 9
EGGS
8
FRES
URGER
ITRA
CaRNed BEEE HASH
STEAK
bagak
OMiTo HERRING
AM
AM OR bagak AL EGGS
FIRMTOURGER URGER STEAK
MEAT BAKHS SPnCHEIr
FRANKS All BERNS KRREI
BAKHS
HUo
UHA
HUo IRN C UHA
KRREI
AL
OWAY
FIRMTOURGER
OMiTo
oRTE
CofFEr oRTE
CofFEr
Riors FRES NRN
CaRNed
CONLZAY
SPnCHEIr
w
Riors
NRN
6
All

Google

FR
AND
US CONW FR BOUGHT SOLD AND OND ND 50 s 50
BOUGHT
ND
OND
s
US
CONW
SOLD
50