Human Generated Data

Title

Untitled (man with gun leaning on fence)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7715

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2022-01-09

Person 99.3
Human 99.3
Face 90.6
Skin 84.2
Fence 78.6
Portrait 71
Photography 71
Photo 71
Outdoors 70.6
Hunting 65
Nature 64.4
Yard 58.2
Weapon 56.3
Weaponry 56.3

Clarifai
created on 2023-10-25

people 99.9
adult 98.5
group 98.1
group together 97.5
two 97.3
weapon 96.3
gun 96
one 95.9
man 95.4
war 94.6
wear 94.5
rifle 94
outfit 93.5
three 93.2
military 92.9
soldier 91.5
street 91.2
several 89.7
portrait 89.5
uniform 88.8

Imagga
created on 2022-01-09

man 28.9
male 20.6
people 19.5
old 18.8
sculpture 18.2
person 17.2
passenger 16.5
statue 16.3
newspaper 16.1
men 15.4
world 15.3
picket fence 14.1
musical instrument 13.9
hair 12.7
product 12.4
face 12.1
fence 12
architecture 11.7
art 11.7
portrait 11.6
happy 11.3
ancient 11.2
adult 11
vintage 10.7
building 10.7
looking 10.4
home 10.4
grandfather 10.3
love 10.3
black 10.2
historic 10.1
creation 9.6
antique 9.5
wind instrument 9.5
head 9.2
leisure 9.1
lady 8.9
barrier 8.9
family 8.9
lifestyle 8.7
child 8.6
monument 8.4
joy 8.3
city 8.3
human 8.2
religion 8.1
business 7.9
urban 7.9
smile 7.8
happiness 7.8
outdoor 7.6
casual 7.6
females 7.6
house 7.5
outdoors 7.5
one 7.5
window 7.3
dress 7.2
history 7.2
travel 7

Microsoft
created on 2022-01-09

text 97.9
statue 97.2
person 93.4
man 92.7
black and white 85.9
clothing 79.8
human face 69.3
sculpture 56.8

Feature analysis

Amazon

Person 99.3%

Text analysis

Amazon

KODAKAA
in