Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

July 1938-August 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.940

Copyright

© President and Fellows of Harvard College
Machine Generated Data

Tags

Amazon
created on 2023-10-07

Machine 99.3
Wheel 99.3
Adult 98.5
Male 98.5
Man 98.5
Person 98.5
Person 96.3
Wheel 96.1
Outdoors 95.2
Nature 94.9
Wheel 91.5
Countryside 83.9
Wheel 79.5
Face 76.5
Head 76.5
Tractor 73.5
Transportation 73.5
Vehicle 73.5
Rural 65
Tire 57.1
Farm 57

Clarifai
created on 2018-05-11

vehicle 99.8
transportation system 97.7
tractor 95.9
people 94
machine 93.6
cropland 92.8
group together 92.1
two 91.1
adult 88
one 88
military 88
war 86.8
group 80.7
three 79.5
industry 77
tank 76.7
four 76.4
farming 76.1
military vehicle 75.9
no person 75.4

Imagga
created on 2023-10-07

machine 78.8
vehicle 76
farm machine 62.1
truck 47.8
thresher 47.1
tractor 43.7
harvester 38.1
plow 32.7
device 31.3
machinery 31.2
farm 30.4
wheel 30.2
equipment 30.2
steamroller 29.4
agriculture 27.2
rural 26.5
industry 25.7
field 25.1
motor vehicle 24.9
transportation 24.2
heavy 23.9
transport 23.8
work 23.6
tool 22.7
engine 22.2
old 20.9
farming 19.9
tire 19.7
car 19.4
sky 19.2
industrial 19.1
fire engine 16.8
wheeled vehicle 16.7
farmer 16.7
agricultural 16.6
driving 16.5
conveyance 16.4
working 15.1
landscape 14.9
earth 14.6
construction 14.6
power 14.3
harvest 14.1
half track 13.4
yellow 13.3
harvesting 12.7
grass 12.7
road 12.7
antique 12.1
hay 12.1
tires 11.8
diesel 11.8
bulldozer 11.8
wheels 11.7
motor 11.6
summer 11.6
drive 11.4
crop 11.3
military vehicle 11.1
locomotive 11.1
land 11.1
tracked vehicle 10.7
farmland 10.7
countryside 10.1
vintage 9.9
job 9.7
steel 9.7
auto 9.6
dirt 9.6
outdoor 9.2
food 9.2
metal 8.9
tow truck 8.8
track 8.7
travel 8.5
site 8.5
iron 8.4
fall 8.2
dirty 8.1
building 7.9
bucket 7.8
soil 7.8
steam 7.8
outside 7.7
automobile 7.7
classic 7.4
grain 7.4
black 7.2
country 7

Microsoft
created on 2018-05-11

outdoor 99.9
grass 99.9
sky 99.8
truck 99.5
old 89.4
tractor 75
farm 55.1
farm machine 47.2
vintage 46.9

Face analysis

Amazon

AWS Rekognition

Age 43-51
Gender Female, 91.2%
Surprised 48.2%
Confused 27.3%
Disgusted 18.3%
Calm 10%
Happy 6.5%
Fear 6.5%
Angry 4.5%
Sad 2.6%

Feature analysis

Amazon

Wheel 99.3%
Adult 98.5%
Male 98.5%
Man 98.5%
Person 98.5%

Categories

Imagga

cars vehicles 100%

Text analysis

Amazon

CASE