Human Generated Data

Title

Untitled (Lower East Side, New York City)

Date

November 1935-1936

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2904

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2019-04-07

Apparel 93.3
Clothing 93.3
Human 92.9
Person 90.5
Face 83.6
Female 79.7
Photography 63.4
Portrait 63.4
Photo 63.4
Woman 61.5
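
These label/confidence pairs match the output shape of AWS Rekognition's DetectLabels operation. A minimal sketch of how such tags could be regenerated with boto3; the bucket, object key, and thresholds are placeholders, not part of the record:

```python
import boto3

# Assumes AWS credentials are configured; bucket and key are placeholders.
client = boto3.client("rekognition", region_name="us-east-1")

response = client.detect_labels(
    Image={"S3Object": {"Bucket": "my-images", "Name": "shahn-les.jpg"}},
    MaxLabels=10,
    MinConfidence=60.0,  # the lowest Amazon tag above is 61.5
)

# Each label carries a name and a confidence score in percent.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```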

Clarifai
created on 2018-03-23

people 100
two 99.5
adult 99.3
wear 99.1
one 98.8
veil 98.7
woman 96.8
man 96
group 94
child 93.5
three 93.5
portrait 93.3
headscarf 93
administration 91.7
music 91.6
elderly 91.5
four 89.4
offspring 87.5
transportation system 86.1
interaction 85.9
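
These concept/score pairs are consistent with Clarifai's general prediction model, which scores concepts on a 0-1 scale (shown above as percentages). A hedged sketch against Clarifai's v2 REST API; the API key, image URL, and model ID are placeholders, and the model that produced the 2018 tags may differ:

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder
# "general-image-recognition" is Clarifai's public general model ID;
# the 2018-era model behind these tags may have used a different ID.
URL = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}
headers = {"Authorization": f"Key {API_KEY}", "Content-Type": "application/json"}

resp = requests.post(URL, json=payload, headers=headers)
resp.raise_for_status()

# Concepts come back as name/value pairs, value in [0, 1].
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```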

Imagga
created on 2018-03-23

person 30.6
man 30.2
male 28.4
black 27.8
robe 24.4
adult 23.3
clothing 20.7
people 20.6
mask 20
garment 18.6
portrait 16.2
danger 15.4
face 14.9
criminal 13.7
covering 13.5
wind instrument 13.2
musical instrument 12.9
fashion 12.8
dark 12.5
call 12.5
holding 12.4
crime 11.7
looking 11.2
men 11.2
model 10.9
weapon 10.7
free-reed instrument 10.4
professional 10.3
disguise 10.3
device 10.2
dress 9.9
disk jockey 9.9
old 9.7
evil 9.7
hat 9.6
costume 9.5
sitting 9.4
security 9.2
protection 9.1
handsome 8.9
coat 8.8
death 8.7
happiness 8.6
eyes 8.6
expression 8.5
business 8.5
clothes 8.4
hand 8.3
gun 8.3
safety 8.3
alone 8.2
laptop 8.2
music 8.2
happy 8.1
worker 8.1
broadcaster 7.9
armed 7.9
soldier 7.8
army 7.8
attractive 7.7
mystery 7.7
human 7.5
one 7.5
outdoors 7.5
emotion 7.4
uniform 7.3
sax 7.3
computer 7.2
communicator 7.2
building 7.2
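
Imagga returns an equivalent tag list through its v2 tagging endpoint, authenticated with HTTP Basic auth. A sketch; the key, secret, and image URL are placeholders:

```python
import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),  # Imagga uses HTTP Basic auth
)
resp.raise_for_status()

# Each entry has the form {"confidence": float, "tag": {"en": str}}.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```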

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

person 96.8
man 93.9
black 85.3
white 74
old 67.7
older 30.2
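
The Microsoft tags correspond to the Tags feature of Azure Computer Vision's image-analysis endpoint, which scores tags on a 0-1 scale. A hedged sketch against a more recent REST version (v3.2) than the 2018-era API that produced these tags; the endpoint and key are placeholders:

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.com/photo.jpg"},
)
resp.raise_for_status()

# Confidences are 0-1; the record above prints them as percentages.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```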

Color Analysis

Face analysis

AWS Rekognition

Age 35-52
Gender Male, 77.6%
Happy 0.2%
Disgusted 0.7%
Calm 46.6%
Surprised 2.1%
Angry 2.2%
Sad 46.4%
Confused 1.9%
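
The age range, gender estimate, and per-emotion confidences above are the standard fields of AWS Rekognition's DetectFaces response when all facial attributes are requested. A sketch with placeholder bucket and key:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

response = client.detect_faces(
    Image={"S3Object": {"Bucket": "my-images", "Name": "shahn-les.jpg"}},
    Attributes=["ALL"],  # needed for age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are a list of type/confidence pairs, as in the record above.
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```

Note that the two services disagree on both age and gender for this photograph, a reminder that these attributes are model estimates, not ground truth.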

Microsoft Cognitive Services

Age 76
Gender Female

Feature analysis

Amazon

Person 90.5%

Captions

Text analysis

Amazon

TANDARD
WITTLI
TWELVES
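
The fragments above are raw OCR hits, presumably from signage in the photograph, and match the shape of AWS Rekognition's DetectText output. A sketch with placeholder bucket and key:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

response = client.detect_text(
    Image={"S3Object": {"Bucket": "my-images", "Name": "shahn-les.jpg"}}
)

# LINE detections group words; WORD detections are individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")
```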

Google

WELVES TANDARD
WELVES
TANDARD