Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

July 1938–August 1938, printed later

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3496

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Machine 99.1
Wheel 99.1
Adult 98
Male 98
Man 98
Person 98
Wheel 97.3
Person 95.5
Wheel 93.5
Outdoors 90.4
Nature 89.7
Wheel 86.1
Transportation 77.6
Vehicle 77.6
Car 76.4
Countryside 75.2
Face 74.4
Head 74.4
Tractor 70.8
Tire 58
Rural 57.8
Spoke 56.1
Farm 55.9
Antique Car 55.9
Model T 55.9
Wheel 55.5

Clarifai
created on 2018-05-10

vehicle 99.7
tractor 97.9
transportation system 97.4
machine 96.2
tank 90.1
people 89.9
war 85.4
military 84.4
two 84.2
group together 83.8
engine 83.5
driver 81.1
cropland 79.5
adult 79.5
soil 78.9
plow 76.9
three 73.3
skirmish 73.1
one 70.3
truck 70.1

Imagga
created on 2023-10-07

thresher 100
farm machine 100
machine 100
device 55
vehicle 49.4
tractor 48.7
plow 44.2
machinery 40
equipment 36.7
farm 33.1
tool 32.4
wheel 31.2
industry 29.9
agriculture 29.9
transportation 27.8
harvester 27.8
rural 27.4
work 26.7
heavy 25.8
field 25.1
truck 24.8
industrial 23.6
transport 22.9
tire 20.7
construction 20.6
working 20.4
driving 20.3
old 20.2
farming 19
car 18.9
farmer 18.8
yellow 17.9
sky 17.9
harvest 17
wheels 16.6
agricultural 16.6
earth 16.5
landscape 16.4
engine 16.4
power 16
grass 15.8
bulldozer 15.4
dirt 15.3
crop 14.1
motor 13.6
land 12.9
tires 12.8
diesel 12.8
harvesting 12.7
hay 12.3
summer 12.2
site 12.2
shovel 11.8
scoop 10.8
bucket 10.8
food 10.7
auto 10.5
countryside 10.1
dirty 10
outdoor 10
road 10
soil 9.8
job 9.7
farmland 9.7
track 9.6
wheat 9.5
building 9.5
drive 9.5
wagon 9.3
vintage 9.1
mover 8.9
metal 8.9
sand 8.7
action 8.4
grain 8.3
fall 8.2
antique 7.9
autumn 7.9
combine 7.9
digging 7.9
hydraulic 7.9
excavator 7.9
powerful 7.8
automobile 7.7
seeder 7.6
ground 7.6
environment 7.4
black 7.2
cut 7.2
worker 7.1
steel 7.1
growth 7
country 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

grass 99.9
outdoor 99.8
truck 99.7
sky 98.9
old 89.5
parked 61.4
farm 52.9
tractor 45.6
vintage 42.3
farm machine 38.7
trailer 19.6

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 93.9%
Confused 72.4%
Calm 12.4%
Surprised 9%
Fear 6.1%
Disgusted 5.5%
Sad 2.7%
Angry 2.2%
Happy 0.7%

AWS Rekognition

Age 29-39
Gender Male, 96%
Calm 99.2%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.3%
Angry 0.1%
Disgusted 0.1%
Happy 0.1%

Microsoft Cognitive Services

Age 32
Gender Male

Feature analysis

Amazon

Wheel 99.1%
Adult 98%
Male 98%
Man 98%
Person 98%
Car 76.4%

Text analysis

Amazon

CASE