Human Generated Data

Title

Untitled (woman on dirt road with baby carriage and dog carrying bucket)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, 1916–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15742

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Nature 99.8
Person 99.5
Human 99.5
Outdoors 99.4
Blizzard 97.8
Snow 97.8
Winter 97.8
Storm 97.8
Weather 65.9
Wilderness 59.6
Brick 55.6
Ice 55.3
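
Label lists like the Amazon tags above are the kind of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch using boto3 is shown below; the file name is hypothetical and AWS credentials are assumed to be configured.

    import boto3

    # Request label predictions with confidence scores, similar to the
    # Amazon tags listed above. The file name here is a placeholder.
    client = boto3.client("rekognition")

    with open("tiers_untitled_photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=50,
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")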

Clarifai
created on 2023-10-29

people 99
monochrome 98.9
snow 98.1
winter 98.1
tree 97.2
child 96.6
black and white 96.5
wood 96.1
man 95.4
no person 93.7
cold 93.4
infrared 92.7
retro 92.1
vintage 91.9
alone 91.8
vehicle 91.5
park 91.3
landscape 91.2
sepia 89.8
fog 88.4

Imagga
created on 2022-02-05

grunge 49.4
old 39.7
vintage 37.2
television 36.4
texture 36.1
antique 32.9
negative 29.8
film 29.6
blackboard 29.2
retro 27.9
aged 27.1
frame 26.7
grungy 26.6
telecommunication system 26
pattern 23.9
border 23.5
damaged 22.9
monitor 22.7
art 22.1
rough 21.9
material 21.4
graphic 21.2
textured 21
space 20.9
structure 19.1
weathered 19
design 18.6
ancient 18.2
paint 18.1
dirty 18.1
wall 18
black 17.4
forest 17.4
paper 17.3
screen 16.4
dark 15.9
landscape 15.6
snow 15
tree 14.7
empty 14.6
messy 14.5
rust 14.4
edge 14.4
text 14
aging 13.4
rusty 13.3
decoration 13.1
noise 12.7
dirt 12.4
backdrop 12.4
old fashioned 12.4
wallpaper 12.3
digital 12.2
grain 12
photographic paper 11.7
decay 11.6
scene 11.3
blank 11.1
color 11.1
noisy 10.8
element 10.7
scratch 10.7
fracture 10.7
movie 10.7
surface 10.6
obsolete 10.5
electronic equipment 10.3
winter 10.2
photographic 9.8
frames 9.8
grime 9.8
mottled 9.7
stains 9.7
crumpled 9.7
collage 9.6
canvas 9.5
cold 9.5
light 9.4
equipment 9.2
computer 9.1
designed 8.9
layered 8.8
mess 8.8
burned 8.8
slide 8.8
country 8.8
ragged 8.8
faded 8.8
crack 8.7
your 8.7
layer 8.7
fog 8.7
detailed 8.7
stain 8.6
mask 8.6
spot 8.6
wood 8.3
sky 8.3
trees 8
rural 7.9
highly 7.9
broad 7.9
photographic equipment 7.8
tracery 7.8
season 7.8
succulent 7.8
photograph 7.8
torn 7.7
stained 7.7
parchment 7.7
outdoor 7.6
outdoors 7.5
silhouette 7.4
brown 7.4
historic 7.3
road 7.2
weather 7.1
cool 7.1

Microsoft
created on 2022-02-05

text 95.8
snow 92.8
black and white 92.7
tree 90.3
white 76
monochrome 73.9
fog 64.4
grave 53.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 39-47
Gender Male, 98.3%
Calm 99.9%
Happy 0.1%
Confused 0%
Angry 0%
Sad 0%
Fear 0%
Disgusted 0%
Surprised 0%
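
Face attributes of this shape (age range, gender, and emotion confidences) match what Amazon Rekognition's DetectFaces operation returns when all attributes are requested. A minimal boto3 sketch follows; the file name is again a hypothetical placeholder.

    import boto3

    # Request age range, gender, and emotion estimates of the kind listed
    # above. The file name here is a placeholder.
    client = boto3.client("rekognition")

    with open("tiers_untitled_photo.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")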

Feature analysis

Amazon

Person
Person 99.5%

Categories

Captions

Microsoft
created on 2022-02-05

a bench in front of a window 36.6%
a sign on a window sill 36.5%
a close up of a window 36.4%
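
Captions with confidence scores like those above are the shape of output from the Azure Computer Vision Analyze Image REST API (v3.2) when the Description feature is requested. A minimal sketch follows; the endpoint, subscription key, and image URL are placeholders.

    import requests

    # Request image captions and tags similar to the Microsoft results above.
    # The endpoint, subscription key, and image URL are placeholders.
    endpoint = "https://<your-resource>.cognitiveservices.azure.com"
    key = "<subscription-key>"
    image_url = "https://example.org/tiers_untitled_photo.jpg"

    response = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Description,Tags"},
        headers={"Ocp-Apim-Subscription-Key": key},
        json={"url": image_url},
    )
    response.raise_for_status()
    analysis = response.json()

    for caption in analysis["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
    for tag in analysis["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")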