Human Generated Data

Title

Untitled (New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3685

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label confidence scores, 0-100)

Amazon
created on 2023-10-06

Clothing 100
Coat 100
Hat 100
Cap 99.9
Baby 99.5
Person 99.5
Face 96.1
Head 96.1
Overcoat 95.5
Car 92.9
Transportation 92.9
Vehicle 92.9
Photography 84.8
Portrait 84.8
Outdoors 77.5
Bonnet 69.6
Nature 62.1
Snow 62.1
Body Part 56.5
Finger 56.5
Hand 56.5
Beanie 56.1
Winter 55.4
Smoke 55.2
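
The labels above follow the output format of Amazon Rekognition's DetectLabels operation: a label name paired with a 0-100 confidence score. A minimal sketch of how comparable tags could be generated with boto3, assuming configured AWS credentials and a hypothetical local filename for the image:

    import boto3

    # Minimal sketch: assumes AWS credentials are configured and the image
    # is saved locally under a hypothetical filename.
    rekognition = boto3.client("rekognition")

    with open("untitled_new_york_city.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,
        )

    # DetectLabels returns label names with 0-100 confidence scores,
    # the same form as the list above (e.g. "Clothing 100", "Smoke 55.2").
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))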

Clarifai
created on 2018-05-10

people 99.5
child 97.2
one 96.9
adult 94.3
wear 93.3
portrait 90.7
woman 90.6
administration 90.5
man 86.7
two 83.5
monochrome 83.1
street 82.1
retro 80.9
boy 77.1
veil 76.7
indoors 76.6
transportation system 75.5
facial expression 75.1
step 74.8
war 74.4

Imagga
created on 2023-10-06

newspaper 18.5
man 18.1
person 16.7
old 16
product 14.6
statue 14.5
people 14.5
portrait 14.2
adult 13.8
male 13.6
scholar 13.1
building 12.7
black 12.6
creation 11.7
sculpture 11.7
architecture 10.9
religion 10.8
face 10.7
device 10.6
stucco 10.6
look 10.5
intellectual 10.5
wall 10.3
historic 10.1
travel 9.9
home 9.6
monument 9.3
head 9.2
art 9.2
city 9.1
dress 9
one 9
indoors 8.8
hair 8.7
ancient 8.6
stone 8.6
historical 8.5
house 8.4
tourism 8.2
light 8
work 7.8
smile 7.8
men 7.7
culture 7.7
room 7.5
human 7.5
religious 7.5
vintage 7.4
window 7.4
single 7.4
world 7.4
back 7.3
tourist 7.2
color 7.2
dirty 7.2
looking 7.2
decoration 7.1
mask 7
clothing 7
wooden 7
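
The Imagga tags likewise pair a label with a 0-100 confidence score. A minimal sketch of retrieving such tags through Imagga's v2 tagging endpoint, assuming the requests library, placeholder API credentials, and a hypothetical image URL:

    import requests

    # Placeholder credentials and image URL; substitute real values.
    IMAGGA_KEY = "your_api_key"
    IMAGGA_SECRET = "your_api_secret"
    IMAGE_URL = "https://example.org/shahn_untitled_nyc.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    response.raise_for_status()

    # Each entry carries an English tag name and a 0-100 confidence score.
    for entry in response.json()["result"]["tags"]:
        print(entry["tag"]["en"], round(entry["confidence"], 1))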

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

window 83.6
posing 40.1

Feature analysis

Amazon

Baby 99.5%
Person 99.5%
Car 92.9%

Categories

Imagga

paintings art 99.8%

Text analysis

Amazon

a