Human Generated Data

Title

Untitled (men and a woman standing on the road with rifles)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5132

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99
Person 99
Person 98.8
Person 98.4
Outdoors 94.4
Nature 93.8
Person 91.3
Water 90.6
Person 90.5
Person 89.9
Ice 78.3
Person 71.7
Snow 66.7

Imagga
created on 2022-01-23

snow 97.5
weather 53.8
structure 32.3
negative 31.9
winter 29.8
forest 28.7
landscape 28.3
fountain 27.2
tree 25.8
grunge 25.6
film 25.5
cold 23.3
ice 21.3
season 19.5
texture 19.5
frost 19.2
photographic paper 18.3
trees 17.8
vintage 17.4
pattern 17.1
old 16.7
sky 16.7
frozen 15.3
cool 15.1
natural 14.7
design 14.6
outdoors 14.2
wood 14.2
wallpaper 13.8
art 13.7
light 13.4
retro 13.1
scene 13
fog 12.6
river 12.5
photographic equipment 12.2
antique 12.1
water 12
dirty 11.8
frame 11.7
park 11.6
country 11.4
artistic 11.3
aged 10.9
border 10.9
mountain 10.8
scenery 10.8
black 10.8
outdoor 10.7
mist 10.6
rural 10.6
textured 10.5
architecture 10.2
dark 10
city 10
frosty 9.8
backgrounds 9.7
freeze 9.7
scenic 9.7
grungy 9.5
paper 9.4
space 9.3
travel 9.2
rough 9.1
environment 9.1
morning 9
seasonal 8.8
day 8.6
woods 8.6
ecology 8.3
peaceful 8.2
branch 8.2
fall 8.2
wall 8.1
drawing 8
bright 7.9
misty 7.9
rock 7.8
ancient 7.8
snowy 7.8
scenics 7.7
weathered 7.6
canvas 7.6
crystal 7.5
graphic 7.3
paint 7.2
color 7.2
fantasy 7.2
history 7.2
building 7.1
sprinkler 7.1
surface 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.9
tree 99.1
outdoor 93.2
black and white 69.3

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Male, 94.5%
Sad 62.7%
Calm 20.7%
Confused 10.9%
Disgusted 3.7%
Angry 1.1%
Fear 0.4%
Happy 0.3%
Surprised 0.2%

Feature analysis

Amazon

Person 99%

Captions

Microsoft

a sign on a dirt road 65.5%
an old photo of a person 39%
a sign on the side of a dirt field 38.9%

Text analysis

Amazon

15188
MJ17
MJ17 A70A
A70A

Google

15188.
MJ17
15)88. 15188. MJ17 YT33A2 A73 A
15)88.
A73
A
YT33A2