Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

July 1938 – August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1782

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Nature 99.5
Outdoors 99.5
Machine 99.1
Wheel 99.1
Countryside 98.2
Adult 98
Male 98
Man 98
Person 98
Wheel 97.7
Person 93.8
Rural 92.5
Wheel 88.8
Wheel 85.8
Farm 83.7
Car 80.2
Transportation 80.2
Vehicle 80.2
Face 71.1
Head 71.1
Tractor 67.8
Farm Plow 56.5
Tire 55.6

Clarifai
created on 2018-05-11

vehicle 99.5
tractor 98.9
machine 98.2
transportation system 97.6
cropland 87.2
industry 87.2
truck 87.1
soil 86.5
tank 85.7
machinery 84.7
agriculture 82.9
engine 82.9
farming 79.8
people 79.3
plow 77.7
two 77.2
farm 77
driver 73.7
wheel 71.6
war 70.6

Imagga
created on 2023-10-06

machine 84.8
plow 70
vehicle 61.9
farm machine 53.3
tool 51.5
tractor 51
thresher 50.8
machinery 41
equipment 37.5
truck 35.5
farm 34.9
wheel 33.1
agriculture 32.5
industry 29.9
device 29.6
rural 29.1
heavy 28.7
car 28.3
field 26.8
transportation 26
work 25.2
transport 24.7
harvester 23.7
driving 23.2
engine 23.1
tire 22.6
industrial 21.8
agricultural 21.4
old 20.9
farming 20.9
working 20.4
power 20.2
farmer 20.2
motor vehicle 20.1
sky 18.5
construction 18
harvest 17.9
harvesting 17.6
earth 16.5
diesel 15.7
land 15.7
landscape 15.6
model t 15.5
crop 15.1
yellow 14.6
grass 14.3
half track 14.1
wheels 13.7
wheeled vehicle 13.4
tires 12.8
bulldozer 12.8
road 12.7
motor 12.6
auto 12.5
dirt 12.4
job 11.5
hay 11.4
tracked vehicle 11.2
military vehicle 11.2
summer 11
vintage 10.8
farmland 10.7
countryside 10.1
dirty 10
shovel 9.9
antique 9.8
bucket 9.8
steel 9.7
wheat 9.5
building 9.5
drive 9.5
food 9.2
new 8.9
scoop 8.9
country 8.8
soil 8.8
site 8.5
grain 8.3
metal 8.1
combine 7.9
digging 7.9
track 7.7
automobile 7.7
action 7.4
business 7.3
fall 7.3
black 7.2
seeder 7.1
worker 7.1
growth 7
autumn 7
scenic 7

Microsoft
created on 2018-05-11

sky 100
outdoor 99.9
grass 99.9
truck 99.7
old 87.3
farm 55.8
farm machine 28
trailer 19.5

Face analysis

Amazon

AWS Rekognition

Age 42-50
Gender Female, 99.3%
Surprised 56.1%
Disgusted 25.7%
Calm 20%
Confused 10.4%
Fear 6.2%
Angry 4.6%
Happy 3.1%
Sad 2.5%

AWS Rekognition

Age 28-38
Gender Male, 96%
Calm 99.8%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0%
Disgusted 0%
Angry 0%
Happy 0%

Feature analysis

Amazon

Wheel 99.1%
Adult 98%
Male 98%
Man 98%
Person 98%
Car 80.2%

Categories

Imagga

cars vehicles 99.2%

Text analysis

Amazon

CASE