Human Generated Data

Title

Untitled (people outside under a striped tent near long table)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5332

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-03-12

Person 99.5
Human 99.5
Market 99.4
Person 99.1
Person 98.6
Person 98.3
Person 98.1
Person 97.7
Person 97.4
Bazaar 96.8
Shop 96.8
Person 95.6
Person 90.1
Home Decor 72.5
Plant 71
People 62
Tent 58.3
Suit 56.2
Clothing 56.2
Coat 56.2
Overcoat 56.2
Apparel 56.2

Clarifai
created on 2025-01-09

people 99.9
many 98.9
man 98.6
group together 97.5
adult 97.2
group 95.9
umbrella 94.8
street 93.4
military 93.3
monochrome 93.2
merchant 92.1
two 90.3
war 89.1
woman 88.9
market 88.2
commerce 87.8
crowd 84.6
wear 84.5
one 81.1
four 81.1

Imagga
created on 2022-03-12

musical instrument 31
accordion 19.3
man 18.8
keyboard instrument 16.4
newspaper 15.4
travel 14.8
old 14.6
people 14.5
stall 14.3
male 14.2
person 14
work 13.4
job 13.3
building 12.6
house 12.5
vacation 12.3
wind instrument 12
home 12
product 11.9
car 11.8
umbrella 10.9
outdoor 10.7
architecture 10.1
city 10
worker 9.8
business 9.7
summer 9.6
parasol 9.6
holiday 9.3
creation 9.2
hand 9.1
tourism 9.1
music 9
working 8.8
men 8.6
construction 8.6
outside 8.5
adult 8.5
industry 8.5
seller 8.4
vehicle 8.3
transport 8.2
religion 8.1
scholar 8.1
machine 8
urban 7.9
automobile 7.7
holding 7.4
looking 7.2
transportation 7.2
to 7.1
sky 7

Google
created on 2022-03-12

Black 89.6
Coat 89.3
Black-and-white 86.1
Style 83.9
Adaptation 79.3
Motor vehicle 78.6
Tints and shades 76.9
Monochrome photography 74.2
Suit 73.7
Hawker 72.3
Monochrome 71.9
Food 71.4
Tent 70.7
Table 69.8
Shade 69.3
Selling 68.3
Plant 66.4
Market 65.7
History 65.5
Tablecloth 61.9

Microsoft
created on 2022-03-12

black and white 96.1
funeral 88
clothing 85.8
outdoor 85.5
person 83.9
monochrome 81.1
man 75.4
white 71.6
tent 67
people 58
clothes 22.6

Face analysis

AWS Rekognition

Age 19-27
Gender Male, 70.4%
Surprised 99.2%
Confused 0.3%
Calm 0.3%
Happy 0.1%
Sad 0.1%
Fear 0%
Angry 0%
Disgusted 0%

Feature analysis

Amazon

Person
Tent
Person 99.5%

Text analysis

Amazon

17526.
1752.5

Google

.S 5 ר 2 5 רא כYTEAQ--XAMTeA -- AG אo א"
.S
5
ר
2
רא
כYTEAQ--XAMTeA
--
AG
אo
א"