Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

July 1938-August 1938

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1778

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Nature 99.5
Outdoors 99.5
Machine 99.2
Wheel 99.2
Countryside 98.6
Adult 98.2
Male 98.2
Man 98.2
Person 98.2
Wheel 97.9
Person 95
Rural 93.9
Wheel 88.8
Farm 84.3
Wheel 75.9
Transportation 72.9
Vehicle 72.9
Car 72.1
Face 66.4
Head 66.4
Wheel 66
Tractor 58
Farm Plow 56.5
Tire 56.3
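The Amazon list above reports some labels more than once at different confidences (for example, "Wheel" appears five times), which is common when a service scores each detected instance separately. As an illustration of how such a tag list is often post-processed, here is a short Python sketch that filters to a minimum confidence and collapses repeated labels to their highest score. The threshold value and the helper name `dedupe_tags` are assumptions for this example, not part of the record or of any vendor API.

```python
# Illustrative only: a subset of the Amazon tags above, as (label, confidence) pairs.
MIN_CONFIDENCE = 90.0  # hypothetical cutoff, chosen for this sketch

tags = [
    ("Nature", 99.5), ("Outdoors", 99.5), ("Machine", 99.2),
    ("Wheel", 99.2), ("Countryside", 98.6), ("Adult", 98.2),
    ("Male", 98.2), ("Man", 98.2), ("Person", 98.2),
    ("Wheel", 97.9), ("Person", 95.0), ("Rural", 93.9),
    ("Wheel", 88.8), ("Farm", 84.3), ("Wheel", 75.9),
]

def dedupe_tags(pairs, threshold):
    """Keep each label once, at its highest confidence at or above threshold."""
    best = {}
    for label, conf in pairs:
        if conf >= threshold and conf > best.get(label, 0.0):
            best[label] = conf
    # Return highest-confidence labels first.
    return sorted(best.items(), key=lambda kv: -kv[1])

print(dedupe_tags(tags, MIN_CONFIDENCE))
```

With the 90.0 cutoff, this keeps ten distinct labels and reports "Wheel" once at 99.2.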

Clarifai
created on 2018-05-11

vehicle 99.4
tractor 97.9
machine 97.5
transportation system 97.4
cropland 89.1
people 85.3
industry 83.6
engine 82.6
truck 80.6
soil 80.1
tank 79.9
farming 76.7
agriculture 75.3
plow 72
machinery 71.6
two 70.9
driver 69.9
war 69.2
no person 68.2
military 67.8

Imagga
created on 2023-10-06

machine 91.7
plow 80.5
farm machine 64.3
thresher 61
tool 58.3
vehicle 53.1
tractor 51.8
machinery 41
equipment 36.7
farm 35.8
device 35.4
agriculture 32.5
truck 32.1
wheel 31.2
rural 29.1
industry 29.1
heavy 27.7
transportation 26
harvester 26
field 26
work 25.2
car 24.3
transport 23.8
driving 23.2
industrial 22.7
tire 22.6
working 21.3
old 20.9
farming 20
agricultural 19.5
farmer 19.4
power 19.3
engine 19.3
construction 18.9
sky 17.9
harvest 17
wheels 16.6
earth 16.5
grass 15.8
diesel 15.7
harvesting 15.7
landscape 15.6
yellow 15.3
motor vehicle 14.6
dirt 14.3
crop 14.1
land 13.8
tires 13.8
bulldozer 13.6
motor 13.6
road 12.7
summer 12.2
hay 11.7
auto 11.5
shovel 10.8
bucket 10.8
job 10.6
countryside 10.1
dirty 10
vintage 9.9
food 9.9
scoop 9.9
soil 9.8
farmland 9.7
wheat 9.7
building 9.5
drive 9.5
site 9.4
seeder 9.2
antique 8.9
model t 8.9
track 8.7
action 8.4
grain 8.3
new 8.1
metal 8.1
steel 8
country 7.9
combine 7.9
mover 7.9
digging 7.9
hydraulic 7.9
wheeled vehicle 7.9
outdoor 7.7
mechanism 7.4
speed 7.3
mechanical device 7.3
fall 7.3
black 7.2
worker 7.1
growth 7
autumn 7

Microsoft
created on 2018-05-11

grass 100
sky 99.9
outdoor 99.9
truck 99.3
old 88.3
field 79.4
farm 56.9
vintage 42.7
farm machine 36.2
antique 25.3

Face analysis

AWS Rekognition

Age 45-53
Gender Female, 92.5%
Calm 38.7%
Surprised 28%
Angry 24.3%
Confused 7.9%
Fear 6.2%
Disgusted 5.4%
Sad 2.5%
Happy 1.2%

AWS Rekognition

Age 26-36
Gender Male, 99.2%
Sad 98.6%
Calm 39.2%
Surprised 6.7%
Fear 6.2%
Confused 1.3%
Disgusted 0.6%
Angry 0.4%
Happy 0.3%

Microsoft Cognitive Services

Age 24
Gender Male

Feature analysis

Amazon

Wheel 99.2%
Adult 98.2%
Male 98.2%
Man 98.2%
Person 98.2%
Car 72.1%

Categories

Imagga

cars vehicles 99.3%

Text analysis

Amazon

CASE