Human Generated Data

Title

Untitled (Lower East Side, New York City)

Date

April 1936

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2881

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (each label is followed by the service's confidence score, 0-100)

Amazon
created on 2019-04-04

Person 99.6
Human 99.6
Person 98.4
Apparel 98.2
Clothing 98.2
Advertisement 97.9
Poster 97.9
Text 95
Face 94
Hat 88.6
Person 86.3
Word 84.7
Coat 84.5
Overcoat 84.5
Suit 84.5
Hat 83.8
Hat 80.2
Beard 69.9
Person 50.8
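
These Amazon tags come from the Rekognition label-detection service. A minimal sketch of how such labels can be generated with boto3, assuming AWS credentials are configured and a local copy of the photograph is available (the file name below is a placeholder, not part of the record):

```python
# Sketch only: Amazon Rekognition label detection via boto3.
import boto3

rekognition = boto3.client("rekognition")

with open("shahn_lower_east_side.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # only return labels scored at 50 or higher
)

# Each label carries a name and a 0-100 confidence score, as listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```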

Clarifai
created on 2018-03-23

people 98.9
man 96.8
adult 96.1
bill 95.7
portrait 95.7
retro 95.5
illustration 88.3
administration 86.7
wear 85.8
group 84.3
lid 84.1
monochrome 82.7
music 80
art 79.9
vintage 79.1
text 77.7
woman 77.4
one 76.1
leader 75.6
street 75.2

Imagga
created on 2018-03-23

book jacket 100
jacket 96.2
wrapping 73.1
covering 49.2
vintage 24.8
retro 23.8
old 22.3
grunge 21.3
blackboard 19.9
sign 18.1
black 18
stamp 17.5
mail 17.2
money 17
dollar 16.7
currency 16.2
postmark 15.8
letter 15.6
message 15.5
aged 15.4
symbol 14.8
card 14.5
postage 13.8
cash 13.7
ancient 13
design 12.9
banking 12.9
finance 12.7
envelope 12.5
business 12.2
art 11.9
comic book 11.9
bank 11.7
financial 11.6
bill 11.4
texture 11.1
philately 10.9
circa 10.9
printed 10.8
collection 10.8
postal 10.8
dollars 10.6
textured 10.5
text 10.5
drawing 10.4
paper 10.2
man 10.1
shows 9.9
billboard 9.5
wall 9.4
dirty 9
painting 9
wealth 9
chalk 8.8
exchange 8.6
economy 8.4
note 8.3
investment 8.3
graphic 8
graffito 8
chalkboard 7.8
blank 7.7
decoration 7.6
savings 7.5
close 7.4
market 7.1

Google
created on 2018-03-23

poster 82.1
human behavior 75.9
album cover 71.4
art 67.3
facial hair 66.8
black and white 66.7
font 59.9
album 56.7
advertising 55.7
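
The Google tags appear to be Cloud Vision label annotations. A minimal sketch using the google-cloud-vision client, assuming application credentials are configured; the file name is a placeholder:

```python
# Sketch only: Google Cloud Vision label detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_lower_east_side.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are 0-1 floats; scaling by 100 matches the listing above.
for annotation in response.label_annotations:
    print(f"{annotation.description} {annotation.score * 100:.1f}")
```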

Microsoft
created on 2018-03-23

text 99.3
book 93.1
person 91.4
outdoor 90.6
black 81.8
old 69.7
people 56.6
posing 41.7
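
The Microsoft tags are consistent with the Azure Computer Vision tagging feature. A minimal sketch against the REST analyze endpoint, assuming a v3.2 resource; the endpoint, key, and file name are all placeholders:

```python
# Sketch only: Azure Computer Vision image tagging via the REST API.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<subscription-key>"  # placeholder

with open("shahn_lower_east_side.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

# Confidence values are 0-1 floats; scaling by 100 matches the listing above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```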

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 99.4%
Disgusted 0.8%
Confused 3.4%
Sad 6%
Surprised 3.9%
Calm 48.1%
Angry 1.8%
Happy 36%

AWS Rekognition

Age 30-47
Gender Male, 99.7%
Calm 22.8%
Happy 2.3%
Confused 1.8%
Disgusted 0.5%
Sad 70.5%
Angry 1.4%
Surprised 0.7%

AWS Rekognition

Age 38-59
Gender Male, 98.5%
Disgusted 0.4%
Calm 84.7%
Happy 1%
Surprised 8.3%
Sad 1.5%
Angry 2.7%
Confused 1.3%
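
The age ranges, gender calls, and emotion scores above are the kind of output Rekognition's DetectFaces returns when all attributes are requested. A minimal sketch with boto3; the file name is a placeholder:

```python
# Sketch only: Amazon Rekognition face attributes via boto3 DetectFaces.
import boto3

rekognition = boto3.client("rekognition")

with open("shahn_lower_east_side.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```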

Microsoft Cognitive Services

Age 87
Gender Female

Microsoft Cognitive Services

Age 81
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
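
The Google Vision rows report face attributes as likelihood buckets rather than percentages. A minimal sketch of the face_detection call that produces them, assuming google-cloud-vision is installed; the file name is a placeholder:

```python
# Sketch only: Google Cloud Vision face detection with likelihood buckets.
from google.cloud import vision

# Enum values 0-5 map to these likelihood names.
LIKELIHOODS = ["Unknown", "Very unlikely", "Unlikely",
               "Possible", "Likely", "Very likely"]

client = vision.ImageAnnotatorClient()

with open("shahn_lower_east_side.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", LIKELIHOODS[face.surprise_likelihood])
    print("Anger", LIKELIHOODS[face.anger_likelihood])
    print("Sorrow", LIKELIHOODS[face.sorrow_likelihood])
    print("Joy", LIKELIHOODS[face.joy_likelihood])
    print("Headwear", LIKELIHOODS[face.headwear_likelihood])
    print("Blurred", LIKELIHOODS[face.blurred_likelihood])
```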

Feature analysis

Amazon

Person 99.6%
Poster 97.9%
Hat 88.6%
Suit 84.5%

Text analysis

Amazon

RSETS
SERS
MAUE
SERS S
S
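
The Amazon text fragments look like Rekognition DetectText output, OCR of the poster lettering visible in the photograph. A minimal sketch with boto3; the file name is a placeholder:

```python
# Sketch only: Amazon Rekognition OCR via boto3 DetectText.
import boto3

rekognition = boto3.client("rekognition")

with open("shahn_lower_east_side.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Detections come back as whole LINEs and individual WORDs; print lines only.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')
```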

Google

RSETS S ER S
RSETS
S
ER
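
The Google fragments are consistent with Cloud Vision text detection run over the same poster lettering. A minimal sketch, using the same placeholder file name as above:

```python
# Sketch only: Google Cloud Vision OCR via text_detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_lower_east_side.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are individual words.
for annotation in response.text_annotations:
    print(annotation.description)
```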