Human Generated Data

Title

Worn Place in Rail, S. Bound Track, Pleasant St.

Date

September 14, 1901

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the Massachusetts Bay Transportation Authority, Boston Transit Collection, 5.2002.2

Machine Generated Data

Tags (label and confidence score, 0-100)

Amazon
created on 2022-06-04

Person 97.4
Human 97.4
Plant 95.7
Bench 87.6
Furniture 87.6
Clothing 84.3
Apparel 84.3
Shorts 82.3
Soil 79.2
Produce 74.2
Food 74.2
Grain 68.4
Vegetable 68.4
Shoe 65.6
Footwear 65.6
Seed 61.8
Outdoors 59.2
Garden 58.5
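
These labels have the shape of output from Amazon Rekognition's DetectLabels operation, which scores each label from 0 to 100. A minimal sketch of such a call via boto3 follows; the file name worn_rail.jpg and the MinConfidence cutoff are illustrative assumptions, not details recorded with this object.

```python
# Minimal sketch: label detection with Amazon Rekognition via boto3.
# "worn_rail.jpg" is a placeholder file name; MinConfidence=55 is an
# assumed cutoff (the lowest score in the list above is 58.5).
import boto3

rekognition = boto3.client("rekognition")

with open("worn_rail.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,
    )

# Each label carries a name and a 0-100 confidence, matching the
# "name score" pairs listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```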

Imagga
created on 2022-06-04

apiary 90.6
shed 71.5
outbuilding 53.8
building 37.8
structure 23.5
track 22.7
wall 21.4
doormat 20.8
old 18.8
tie 18.6
mat 18.5
wooden 16.7
wood 16.7
texture 15.3
sill 15.1
farmer 15
support 14.8
brace 14.7
structural member 14.4
device 14.1
brown 14
pattern 13.7
weathered 13.3
floor cover 12.5
textured 12.3
person 12.2
rough 11.8
dirty 11.7
metal 11.3
construction 11.1
grunge 11.1
covering 11
strengthener 11
aged 10.8
material 10.8
vintage 10.7
train 10.6
surface 10.6
rail 9.8
steel 9.7
urban 9.6
house 9.2
backdrop 9.1
outdoors 9
people 8.9
railroad 8.8
windowsill 8.6
line 8.6
tree 8.5
city 8.3
transportation 8.1
natural 8
home 8
railway 7.8
timber 7.8
travel 7.7
outdoor 7.6
rusty 7.6
grungy 7.6
healthy 7.6
iron 7.5
row 7.4
retro 7.4
design 7.3
transport 7.3
detail 7.2
board 7.2
male 7.1
working 7.1
architecture 7
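
Imagga exposes a comparable tagging endpoint (v2 of its REST API) that reports each tag with a 0-100 confidence. A hedged sketch follows; the API key, secret, and image URL are placeholders.

```python
# Sketch of a request to Imagga's v2 tagging endpoint.
# Credentials and the image URL below are placeholders.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/worn_rail.jpg"},  # placeholder
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth
)

# Each result pairs an English tag with a 0-100 confidence,
# as in the list above.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```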

Google
created on 2022-06-04

Microsoft
created on 2022-06-04

bench 99.9
outdoor 98.8
black and white 97
person 95.7
park 89.2
monochrome 75.1
street 66.8
footwear 59
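
The Microsoft tags match the shape of Azure Computer Vision's tag operation, which returns name/confidence pairs (confidence in 0-1, shown here scaled to percent). A minimal sketch against the v3.2 REST endpoint follows; the resource endpoint, key, and image URL are placeholders.

```python
# Sketch of Azure Computer Vision's v3.2 "tag" operation.
# ENDPOINT, KEY, and the image URL are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                     # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/worn_rail.jpg"},  # placeholder
)

# Azure reports confidence in 0-1; scale to percent to match the list above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```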

Feature analysis

Amazon

Person 97.4%
Bench 87.6%
Shoe 65.6%
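
These entries repeat the Amazon labels for which per-object detections are typically available; in Rekognition's DetectLabels response, such labels include an Instances array with bounding boxes. A sketch of reading them, under the same placeholder assumptions as the earlier Rekognition example, follows.

```python
# Sketch: pull instance bounding boxes from a DetectLabels response.
# "worn_rail.jpg" is a placeholder file name.
import boto3

rekognition = boto3.client("rekognition")

with open("worn_rail.jpg", "rb") as f:
    labels = rekognition.detect_labels(Image={"Bytes": f.read()})["Labels"]

for label in labels:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # ratios of image width/height
        print(f"{label['Name']} {instance['Confidence']:.1f}% at "
              f"left={box['Left']:.2f}, top={box['Top']:.2f}, "
              f"w={box['Width']:.2f}, h={box['Height']:.2f}")
```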

Text analysis

Google

LIN
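
The single fragment above is characteristic of OCR output such as Google Cloud Vision's text detection, which returns the full detected string followed by word-level annotations. A hedged sketch follows; the file name is a placeholder, and nothing here asserts how the "LIN" fragment was actually produced.

```python
# Sketch of Google Cloud Vision text detection (OCR).
# "worn_rail.jpg" is a placeholder file name.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("worn_rail.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are
# word-level fragments (e.g. a short run such as "LIN").
for annotation in response.text_annotations:
    print(annotation.description)
```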