Human Generated Data

Title

California Ranch

Date

c. 1930

People

Artist: William Edward Dassonville, American, 1879–1957

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Fund for the Acquisition of Photographs, P2000.64

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Fence 84.1
Nature 70.4
Building 66.9
Outdoors 64.9
Picket 59
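
The Amazon tags above are the kind of label/confidence pairs returned by Amazon Rekognition's DetectLabels operation. A minimal sketch using boto3 (the local filename and the MinConfidence threshold are assumptions, not part of this record):

    import boto3

    # Rekognition client; AWS credentials are read from the environment.
    client = boto3.client("rekognition")

    # "california_ranch.jpg" is a hypothetical local copy of the photograph.
    with open("california_ranch.jpg", "rb") as f:
        response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

    # Each label carries a name and a 0-100 confidence, as listed above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))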

Clarifai
created on 2023-10-25

fence 99.2
monochrome 97.4
fog 96.4
no person 96.2
house 95
architecture 94.4
landscape 93.1
building 92.9
light 92.3
people 92.3
water 91.9
beach 91.6
street 91.3
winter 90.7
window 90.2
shadow 88.8
bridge 88.8
family 87.1
nature 86.7
sunset 86.1
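
Clarifai's general model returns concept values on a 0-1 scale, shown here scaled to 0-100. A rough sketch against Clarifai's v2 REST API (the API key and image URL are placeholders, and the public general model's ID is assumed):

    import requests

    headers = {"Authorization": "Key YOUR_CLARIFAI_API_KEY"}  # placeholder key
    payload = {"inputs": [{"data": {"image": {"url": "https://example.com/ranch.jpg"}}}]}

    # "general-image-recognition" is assumed to be the public general model's ID.
    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers=headers,
        json=payload,
    )

    # Concept values are 0-1 floats; scale to match the 0-100 figures above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))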

Imagga
created on 2022-01-08

architecture 33.9
fence 31.6
railing 29.8
building 28.7
structure 27.8
balcony 27.5
picket fence 26
wall 24.3
barrier 21.4
city 19.1
travel 18.3
sky 17.8
house 15.2
landscape 14.1
old 13.9
construction 13.7
obstruction 13.2
tourism 12.4
furniture 11.4
buildings 11.3
metal 11.3
home 11.2
brick 11.1
exterior 11.1
stone 11
wood 10.8
panel 10.7
water 10
floor 9.3
outdoor 9.2
facade 9
outdoors 9
light 8.7
ancient 8.6
street 8.3
landmark 8.1
history 8
trees 8
steel 8
wooden 7.9
urban 7.9
high 7.8
industry 7.7
hotel 7.6
room 7.6
support 7.5
window 7.5
step 7.4
business 7.3
new 7.3
coast 7.2
tower 7.2
prison 7.1
sea 7
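
Imagga's tagging endpoint returns English tag names with 0-100 confidences, matching the list above. A sketch against its v2 REST API (the key/secret pair and image URL are placeholders):

    import requests

    # Imagga uses HTTP basic auth with an API key/secret pair (placeholders here).
    auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/ranch.jpg"},
        auth=auth,
    )

    # Tags arrive sorted by confidence, highest first.
    for tag in resp.json()["result"]["tags"]:
        print(tag["tag"]["en"], round(tag["confidence"], 1))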

Google
created on 2022-01-08

Building 94.9
Sky 87.2
Fence 85.9
House 83.8
Rectangle 79
Tints and shades 76.8
Monochrome photography 71.2
Monochrome 70.9
Window 68.1
Wood 67.2
History 64.8
Picket fence 64.2
Landscape 63.7
Room 63.6
Art 62.3
Stock photography 62
Metal 60.2
Tower 60.1
Facade 59.9
Home fencing 59.6
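
The Google labels correspond to the Cloud Vision API's label detection feature; scores are 0-1 floats, shown scaled to 0-100 above. A minimal sketch with the google-cloud-vision client library (the filename is a placeholder):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # "california_ranch.jpg" is a hypothetical local copy of the photograph.
    with open("california_ranch.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # Scores are 0-1 floats; scale to match the 0-100 figures above.
    for label in response.label_annotations:
        print(label.description, round(label.score * 100, 1))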

Microsoft
created on 2022-01-08

sky 88.7
house 86.3
white 74.7
building 71
black 70.1
black and white 65.7
text 60.7
fog 59.7
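
The Microsoft tags match the Tags feature of the Azure Computer Vision Analyze endpoint. A sketch against the v3.2 REST API (the endpoint, key, and filename are placeholders):

    import requests

    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    headers = {
        "Ocp-Apim-Subscription-Key": "YOUR_KEY",  # placeholder
        "Content-Type": "application/octet-stream",
    }

    with open("california_ranch.jpg", "rb") as f:
        resp = requests.post(
            endpoint + "/vision/v3.2/analyze",
            params={"visualFeatures": "Tags"},
            headers=headers,
            data=f.read(),
        )

    # Confidences are 0-1 floats; scale to match the 0-100 figures above.
    for tag in resp.json()["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))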

Color Analysis

Categories

Imagga

interior objects 99.9%

Text analysis

Amazon

29

Google

29
29
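
The text analysis entries are OCR hits: both services detected the characters "29" in the image (Google likely reports it at two levels of granularity). A sketch of the Amazon side using Rekognition's DetectText operation (the filename is a placeholder):

    import boto3

    client = boto3.client("rekognition")

    # "california_ranch.jpg" is a hypothetical local copy of the photograph.
    with open("california_ranch.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # LINE detections give the recognized strings, e.g. "29".
    for det in response["TextDetections"]:
        if det["Type"] == "LINE":
            print(det["DetectedText"], round(det["Confidence"], 1))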