Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

July 1938–August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.951

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Worker 100
Construction 99.9
Person 99
Adult 99
Male 99
Man 99
Person 98.7
Adult 98.7
Male 98.7
Man 98.7
Outdoors 92.5
Machine 92.5
Wheel 92.5
Face 83.9
Head 83.9
Gun 76.3
Weapon 76.3
Clothing 74.3
Hat 74.3
Nature 72.6
Concrete 65.3
Sword 59
Gravel 56.9
Road 56.9
Agriculture 56.6
Countryside 56.6
Field 56.6
Oilfield 55.7
Baseball Cap 55
Cap 55

Clarifai
created on 2018-05-11

people 99.8
vehicle 99.4
adult 99.4
one 98.7
man 97.3
group together 97.1
two 96.1
transportation system 95.2
group 93.5
military 93.4
watercraft 91.9
wear 91.8
veil 89.8
war 88.1
three 87.5
bucket 85.2
recreation 84.9
driver 82.8
monochrome 81.6
weapon 80.7

Imagga
created on 2023-10-06

harp 38.7
vehicle 27.2
machine 26.4
stringed instrument 22.6
bulldozer 20.5
sky 20.4
sand 19.1
people 19
man 18.1
construction 18
person 17.8
wheeled vehicle 17.1
industry 17.1
transportation 17
equipment 16.9
adult 16.8
outdoors 16.5
work 16.5
male 16.3
working 15.9
backhoe 15.3
musical instrument 15.3
machinery 14.6
tractor 14.2
earth 14
excavator 13.8
lifestyle 13.7
skateboard 12.8
industrial 12.7
laptop 12.5
outdoor 12.2
sitting 12
men 12
one 11.9
transport 11.9
activity 11.6
summer 11.6
heavy 11.4
device 11.4
iron 11.4
car 11.3
site 11.3
outside 11.1
cannon 11.1
soil 10.8
board 10.8
water 10.7
worker 10.7
business 10.3
power 10.1
shovel 10
digger 9.9
dig 9.9
notebook 9.5
dirt 9.5
building 9.5
relax 9.3
tool 9.1
relaxing 9.1
road 9
sunset 9
fun 9
computer 9
landscape 8.9
sun 8.9
job 8.8
bucket 8.8
happy 8.8
women 8.7
track 8.6
happiness 8.6
clouds 8.4
power shovel 8.4
park 8.4
leisure 8.3
alone 8.2
active 8.1
looking 8
world 8
yellow 7.9
rural 7.9
mover 7.9
sea 7.8
gun 7.8
portrait 7.8
attractive 7.7
casual 7.6
down 7.6
technology 7.4
action 7.4
vacation 7.4
smiling 7.2
home 7.2
smile 7.1
truck 7.1
businessman 7.1
day 7.1
self-propelled vehicle 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 97.8

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 100%
Calm 90.6%
Confused 7.1%
Surprised 6.4%
Fear 5.9%
Sad 2.5%
Angry 0.5%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-33
Gender Female, 51.5%
Calm 97.4%
Surprised 6.3%
Fear 5.9%
Sad 2.7%
Happy 0.3%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%

Microsoft Cognitive Services

Age 29
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Adult 99%
Male 99%
Man 99%
Wheel 92.5%
Gun 76.3%
Sword 59%

Text analysis

Amazon

they