Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.859

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Machine 95.2
Wheel 95.2
Engine 93
Motor 93
Outdoors 81.1
Person 74.9
Nature 70.1
Transportation 65.6
Vehicle 65.6
Railway 62.2
Train 62.2
Wheel 60.5
Countryside 57.7
Architecture 57.7
Building 57.7
Factory 57.7
Shelter 57.4
Rural 56

Clarifai
created on 2018-05-11

people 99.2
adult 95.8
door 94.9
doorway 94.5
no person 91.5
wear 91
one 90.8
wood 90.7
war 90.4
group 90.3
room 89.3
two 88.7
group together 87
vehicle 86.8
home 86.5
military 85.9
shed 85.3
transportation system 83.1
man 82
soldier 79.2

Imagga
created on 2023-10-05

old 20.2
shopping cart 17.1
wheeled vehicle 17.1
building 16.8
device 15.3
handcart 14.2
dirty 12.6
machine 12.4
container 12
equipment 11.9
architecture 11.7
cell 11.3
wall 11.1
industry 11.1
city 10.8
work 10.5
outdoors 10.4
gas pump 10.2
forklift 10.1
locker 10.1
window 10.1
industrial 10
door 9.9
pay-phone 9.9
urban 9.6
construction 9.4
light 9.4
water 9.3
pump 9.3
street 9.2
house 9.2
outdoor 9.2
vehicle 9.1
vintage 9.1
worker 8.9
metal 8.8
factory 8.7
grunge 8.5
musical instrument 8.5
black 8.4
fastener 8.3
safety 8.3
warehouse 8.2
tool 8
chair 8
telephone 7.8
scene 7.8
empty 7.7
outside 7.7
restraint 7.2
history 7.2
steel 7.1
working 7.1
travel 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 93.4
building 82.4
wooden 78.4
black 65.7
old 59.4
house 42.9
farm building 25.9

Feature analysis

Amazon

Wheel 95.2%
Person 74.9%
Train 62.2%

Text analysis

Amazon

CASE