Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

July 1938-August 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.915

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Adult 99.1
Male 99.1
Man 99.1
Person 99.1
War 98.7
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Adult 98
Male 98
Man 98
Person 98
Male 97.9
Person 97.9
Boy 97.9
Child 97.9
Adult 97.5
Male 97.5
Man 97.5
Person 97.5
Weapon 95.9
Artillery 94.1
Face 81.3
Head 81.3
Clothing 66.3
Aircraft 61.8
Airplane 61.8
Transportation 61.8
Vehicle 61.8
Hat 57.4
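
The label and confidence pairs above are the kind of output returned by the AWS Rekognition label-detection API. The sketch below shows one way such tags might be produced with boto3; the image file name and the MinConfidence threshold are illustrative assumptions, not details taken from this record.

# Minimal sketch: generating label/confidence tags with AWS Rekognition.
# The image path and confidence threshold are illustrative assumptions.
import boto3

def detect_labels(image_path: str, min_confidence: float = 55.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=min_confidence,
    )
    # Each label carries a name and a confidence score in percent,
    # matching the "Tag NN.N" rows listed above.
    return [(label["Name"], round(label["Confidence"], 1))
            for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("P1970_915.jpg"):  # hypothetical file name
        print(name, confidence)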

Clarifai
created on 2018-05-11

people 99.9
group together 99.4
military 99.1
group 98.9
adult 98.9
soldier 98.7
war 98.3
skirmish 96.6
vehicle 96.3
many 95.8
weapon 95.6
gun 94.1
man 93.8
five 93.7
several 93.5
uniform 93.2
three 89.3
wear 89.2
aircraft 88.9
two 88.2

Imagga
created on 2023-10-07

cannon 100
gun 87.2
artillery 77.8
weapon 68.2
field artillery 53.2
armament 51.2
weaponry 42.3
military 37.7
high-angle gun 33.6
war 33.3
tank 26.7
army 26.3
soldier 24.4
sky 23.6
battle 23.5
camouflage 21.6
history 20.6
danger 19.1
industry 17.1
power 16.8
male 16.3
vehicle 15.8
man 15.5
machine 14.8
building 14.3
monument 14
city 13.3
rifle 13.1
smoke 13
protection 12.7
industrial 12.7
conflict 12.7
defense 12.7
gunnery 12.5
equipment 12.2
construction 12
historic 11.9
old 11.8
barrel 11.8
architecture 11.7
protect 11.5
mask 11.5
forces 10.8
world 10.6
heavy 10.5
outdoors 10.4
warfare 9.9
armor 9.8
combat 9.8
fighting 9.8
destruction 9.8
steel 9.7
steam 9.7
metal 9.7
statue 9.5
clothing 9.3
tourism 9.1
adult 9.1
environment 9
landmark 9
armed 8.8
smog 8.8
disaster 8.8
museum 8.7
nuclear 8.7
victory 8.7
fight 8.7
gas 8.7
track 8.7
arms 8.6
wheel 8.5
travel 8.4
safety 8.3
person 8.3
tower 8.1
turret 7.9
mission 7.9
accident 7.8
protective 7.8
target 7.8
fog 7.7
chemical 7.7
historical 7.5
military vehicle 7.2
dirty 7.2
suit 7.2
transportation 7.2
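
Imagga tags of this kind are typically retrieved from its v2 REST tagging endpoint. A minimal sketch under that assumption follows; the credentials and image URL are placeholders, and the exact response layout is an assumption based on the public v2 API rather than anything in this record.

# Minimal sketch: requesting tags from the Imagga v2 tagging endpoint.
# Credentials, image URL, and response layout are assumptions.
import requests

API_KEY = "YOUR_API_KEY"        # placeholder
API_SECRET = "YOUR_API_SECRET"  # placeholder

def imagga_tags(image_url: str):
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()
    # Each entry pairs an English tag with a confidence score,
    # matching the "tag NN.N" rows listed above.
    return [(t["tag"]["en"], round(t["confidence"], 1))
            for t in response.json()["result"]["tags"]]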

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 95.2
person 87.9
gun 74.8
old 69.4
white 61
weapon 53.8
vintage 35.9

Color Analysis

Face analysis

AWS Rekognition

Age 6-12
Gender Male, 99.8%
Disgusted 47.4%
Angry 32.9%
Sad 13.8%
Surprised 6.5%
Fear 6.3%
Confused 2.4%
Calm 1.6%
Happy 0.4%

AWS Rekognition

Age 23-31
Gender Male, 67.8%
Calm 85.2%
Surprised 10.6%
Fear 6%
Disgusted 3.7%
Sad 2.8%
Angry 1.3%
Confused 0.5%
Happy 0.5%
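
The age range, gender, and emotion percentages above are the kind of per-face attributes returned by the AWS Rekognition face-detection API. A minimal sketch with boto3 follows; the image file name is an illustrative assumption.

# Minimal sketch: per-face age range, gender, and emotion scores
# with AWS Rekognition DetectFaces. The image path is an assumption.
import boto3

def analyze_faces(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )
    results = []
    for face in response["FaceDetails"]:
        results.append({
            "age": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
            "gender": (face["Gender"]["Value"], face["Gender"]["Confidence"]),
            "emotions": {e["Type"]: round(e["Confidence"], 1)
                         for e in face["Emotions"]},
        })
    return results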

Microsoft Cognitive Services

Age 36
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
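
Google Vision reports face attributes as likelihood levels (Very unlikely, Unlikely, and so on) rather than percentages. A minimal sketch using the Google Cloud Vision Python client follows; the file path is an illustrative assumption.

# Minimal sketch: face attribute likelihoods with the Google Cloud Vision
# client library. The image path is an illustrative assumption.
from google.cloud import vision

def face_likelihoods(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    faces = []
    for face in response.face_annotations:
        # Likelihood enum names such as VERY_UNLIKELY correspond to the
        # Surprise/Anger/Sorrow/Joy/Headwear/Blurred rows listed above.
        faces.append({
            "surprise": face.surprise_likelihood.name,
            "anger": face.anger_likelihood.name,
            "sorrow": face.sorrow_likelihood.name,
            "joy": face.joy_likelihood.name,
            "headwear": face.headwear_likelihood.name,
            "blurred": face.blurred_likelihood.name,
        })
    return faces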

Feature analysis

Amazon

Adult 99.1%
Male 99.1%
Man 99.1%
Person 99.1%
Boy 97.9%
Child 97.9%
Airplane 61.8%
Hat 57.4%

Captions

Microsoft
created on 2018-05-11

a vintage photo of a man 91.3%
a black and white photo of a man 85.5%
an old photo of a man 85.4%

Text analysis

Amazon

7
ALB
by
so by
so
AMRED
AMRED RIVER SPECIALITY
A.M.NOB
RIVER SPECIALITY
CHI