Human Generated Data

Title

Untitled (Ezra Shahn, New York City)

Date

1934

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4234

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.1
Human 98.1
Person 96.9
Person 96.5
Person 92.6
Person 91.3
People 83.6
Porch 68.2
Military 58
Clothing 55.2
Apparel 55.2
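
The Amazon tags above are the kind of label-and-confidence pairs returned by AWS Rekognition's DetectLabels operation. A minimal sketch with boto3, assuming the photograph is available locally as photo.jpg (the file name is illustrative):

import boto3

rekognition = boto3.client("rekognition")

# Read the image bytes and ask Rekognition for labels above 50% confidence.
with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,
    )

# Prints pairs such as "Person 98.1", matching the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")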

Clarifai
created on 2023-10-25

people 99.5
movie 99.4
negative 98.7
slide 98.2
vintage 98.2
filmstrip 97.7
group 97.6
wear 97.3
collage 97.3
retro 96.8
adult 96.8
picture frame 96.2
margin 95.4
old 95.2
two 94.7
art 93.8
woman 92.5
sepia 92
cinematography 91.2
man 90.8
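
Clarifai exposes comparable concept predictions through its v2 predict endpoint. A minimal sketch with the requests library, assuming the public general-image-recognition model and a personal access token (the token, the image URL, and the exact endpoint path are illustrative and may vary by account setup):

import requests

PAT = "YOUR_CLARIFAI_PAT"  # placeholder; substitute a real personal access token

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)

# Clarifai reports concept values in [0, 1]; scale to match the list above
# (e.g. "people 99.5").
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")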

Imagga
created on 2022-01-08

musical instrument 38.8
brass 37
architecture 35.1
wind instrument 31.9
history 31.3
sculpture 30.8
building 27.9
old 27.9
monument 26.1
statue 25.1
ancient 25.1
stone 24.5
historic 23.8
tourism 23.1
culture 23.1
travel 22.5
art 21.2
military uniform 20.2
landmark 19.9
religion 19.7
facade 18.8
historical 18.8
uniform 18.4
column 16.5
city 15.8
wall 15.5
window 15
famous 14.9
temple 13.9
church 13.9
marble 13.7
clothing 11.9
cornet 11.5
exterior 11.1
tourist 11
antique 10.6
percussion instrument 10.4
sky 10.2
town 10.2
balcony 9.7
detail 9.6
god 9.6
symbol 9.4
structure 9.4
palace 9.4
accordion 9.2
carving 9.2
device 9.1
memorial 8.8
house 8.8
marimba 8.7
arch 8.7
roof 8.6
keyboard instrument 8.4
covering 8.1
consumer goods 8.1
day 7.8
heritage 7.7
classical 7.6
capital 7.6
buildings 7.6
destination 7.5
people 7.2
decoration 7
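
Imagga's tagging endpoint returns the same shape of data. A minimal sketch, assuming Imagga v2 credentials and a publicly reachable image URL (all placeholders):

import requests

# Placeholders; substitute real Imagga API credentials.
API_KEY, API_SECRET = "YOUR_KEY", "YOUR_SECRET"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),  # Imagga uses HTTP basic auth
)

# Prints pairs such as "musical instrument 38.8", matching the list above.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")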

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

clothing 97.1
person 96.7
indoor 94.2
text 87.2
man 84.5
old 70.2

Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 97.8%
Calm 63.1%
Sad 25.2%
Fear 3.2%
Angry 2.2%
Confused 2.1%
Disgusted 2%
Happy 1.4%
Surprised 0.8%
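
The age range, gender, and emotion scores above match the FaceDetails structure returned by Rekognition's DetectFaces operation. A minimal sketch, again assuming the image is available locally as photo.jpg:

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

# Report the first detected face, mirroring the fields shown above.
face = response["FaceDetails"][0]
print(f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}")
print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
for emotion in face["Emotions"]:
    print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")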

Feature analysis

Amazon

Person
Person 98.1%

Categories

Imagga

interior objects 99.8%

Captions

Microsoft
created on 2022-01-08

an old photo of a train 49.5%
old photo of a train 45.2%
an old photo of a train station 45.1%
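
Both the Microsoft tag list earlier and these captions look like output from the Azure Computer Vision analyze endpoint with the Tags and Description features. A minimal sketch, assuming a v3.2 resource endpoint and key (both placeholders):

import requests

# Placeholders; substitute a real Azure Computer Vision endpoint and key.
ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_KEY"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Description,Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},
)
analysis = response.json()

# Captions such as "an old photo of a train 49.5" ...
for caption in analysis["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}")
# ... and tags such as "clothing 97.1", as in the Microsoft list above.
for tag in analysis["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")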

Text analysis

Google

BEJ 3 0
BEJ
3
0
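
This layout, the full string followed by each token, mirrors Google Cloud Vision text detection, where the first annotation holds the complete detected text and the remaining annotations hold the individual words. A minimal sketch with the google-cloud-vision client, assuming application default credentials and a local photo.jpg (the file name is illustrative):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected string ("BEJ 3 0");
# the remaining annotations are the individual tokens ("BEJ", "3", "0").
for annotation in response.text_annotations:
    print(annotation.description)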