Human Generated Data

Title

Untitled (worker digging in excavation site)

Date

c. 1930-1940, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5745

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Urban 84.7
Nature 84.1
Building 78.5
Wood 77.3
Outdoors 66.1
Handrail 57.7
Banister 57.7

Clarifai
created on 2019-11-16

monochrome 97.8
people 97.6
transportation system 95.7
vehicle 94.7
industry 93.9
waste 93.0
train 92.7
black and white 92.4
railway 90.8
group 90.4
abandoned 88.9
no person 88.8
street 88.0
two 87.8
calamity 87.1
grinder 86.5
war 86.5
dust 85.8
broken 84.6
vintage 80.3

Imagga
created on 2019-11-16

sky 22.3
structure 19.5
architecture 19.5
building 18.7
winter 17
device 16.3
wheeled vehicle 15.6
old 15.3
handcart 15.2
wood 15
landscape 14.9
travel 14.8
tree 14.7
track 14.6
transportation 14.3
snow 14.2
transport 13.7
light 13.4
water 13.3
scene 13
construction 12.8
outdoor 12.2
industry 11.9
steel 11.8
support 11.6
river 11.6
conveyance 11.3
metal 11.3
house 10.9
barrier 10.7
chair 10.5
outdoors 10.4
forest 10.4
cold 10.3
empty 10.3
equipment 10.3
industrial 10
city 10
park 9.9
barrow 9.5
shopping cart 9.5
machine 9.4
beach 9.4
wall 9.2
business 9.1
tower 8.9
vehicle 8.9
work 8.6
sea 8.6
outside 8.6
window 8.5
exterior 8.3
road 8.1
man 8.1
trees 8
home 8
rural 7.9
urban 7.9
season 7.8
tool 7.8
cloud 7.7
seat 7.7
frozen 7.6
bridge 7.6
silhouette 7.4
tourism 7.4
street 7.4
mountain 7.1
summer 7.1
fence 7.1
day 7.1
wooden 7
factory 7
obstruction 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

ground 96.1
text 93.4
black and white 92.1

Captions

Microsoft
created on 2019-11-16

a person sitting on a bench 35.9%
a bench in front of a building 35.8%

Text analysis

Amazon

E