Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.924

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

People 100
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 99
Adult 99
Male 99
Man 99
Person 98.9
Adult 98.9
Male 98.9
Man 98.9
Person 98.4
Person 98.3
Adult 98.3
Male 98.3
Man 98.3
Clothing 98.2
Pants 98.2
Person 98.1
Adult 98.1
Male 98.1
Man 98.1
Firearm 98.1
Gun 98.1
Rifle 98.1
Weapon 98.1
Photography 98
Backyard 97.7
Nature 97.7
Outdoors 97.7
Yard 97.7
Person 97.7
Male 97.7
Boy 97.7
Child 97.7
Person 97.3
Face 93.2
Head 93.2
Portrait 93.2
Person 82.1
Grass 67
Plant 67
Hat 65.6
Footwear 61.8
Shoe 61.8
Officer 56.9
Military 56.5
Architecture 56.3
Building 56.3
Shelter 56.3
Military Uniform 55.8
Baseball 55.6
Baseball Glove 55.6
Glove 55.6
Sport 55.6
Photographer 55.3
Countryside 55.2

Clarifai
created on 2018-05-11

people 100
group 99.3
group together 99.2
adult 97.9
child 97.6
many 97.2
several 96.4
administration 96.4
military 94.3
man 93.3
war 92.8
leader 92.7
offspring 91.3
five 89.7
boy 87.2
soldier 86.3
woman 86.2
outfit 85.5
sibling 81.4
home 81

Imagga
created on 2023-10-05

kin 55.9
world 41.5
statue 26.9
sculpture 22.9
people 18.4
man 18.1
old 17.4
person 17.4
architecture 16.4
child 16.3
male 14.9
religion 14.3
portrait 14.2
adult 13.8
military 13.5
monument 13.1
danger 12.7
art 12.7
history 12.5
city 12.5
ancient 12.1
love 11.8
protection 11.8
soldier 11.7
mask 11.5
outdoor 11.5
travel 11.3
religious 11.2
culture 11.1
face 10.7
building 10.5
stone 10.3
landmark 9.9
park 9.9
clothing 9.8
outdoors 9.7
war 9.6
antique 9.5
uniform 9.5
historic 9.2
industrial 9.1
camouflage 8.8
toxic 8.8
nuclear 8.7
marble 8.7
god 8.6
two 8.5
mother 8.4
girls 8.2
dirty 8.1
symbol 8.1
detail 8
stalker 7.9
gun 7.9
radioactive 7.9
couple 7.8
happiness 7.8
radiation 7.8
boy 7.8
destruction 7.8
accident 7.8
protective 7.8
sepia 7.8
structure 7.7
chemical 7.7
gas 7.7
palace 7.7
human 7.5
famous 7.4
holding 7.4
street 7.4
lifestyle 7.2
black 7.2
recreation 7.2
fountain 7.2
life 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 98.7
outdoor 97.3
old 41.3

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 97%
Calm 35.5%
Happy 31.8%
Surprised 29.7%
Fear 6.5%
Confused 4.4%
Sad 2.7%
Disgusted 2.4%
Angry 1%

AWS Rekognition

Age 47-53
Gender Male, 99.5%
Sad 89.3%
Disgusted 28.7%
Confused 10.1%
Surprised 7.8%
Fear 7%
Angry 6.7%
Calm 5.6%
Happy 1.7%

AWS Rekognition

Age 61-71
Gender Male, 99.9%
Sad 90.3%
Confused 29.5%
Calm 21.3%
Surprised 6.6%
Fear 6.1%
Disgusted 3.5%
Angry 1.6%
Happy 0.6%

AWS Rekognition

Age 34-42
Gender Female, 87.8%
Fear 95.2%
Surprised 8.3%
Confused 3.7%
Happy 3.4%
Sad 2.3%
Calm 1.7%
Angry 0.7%
Disgusted 0.6%

AWS Rekognition

Age 34-42
Gender Male, 99.9%
Calm 95.2%
Surprised 6.5%
Fear 6.4%
Sad 2.7%
Happy 0.7%
Angry 0.4%
Disgusted 0.3%
Confused 0.1%

AWS Rekognition

Age 23-31
Gender Male, 99.7%
Calm 40.8%
Confused 26.3%
Sad 15.5%
Surprised 9.6%
Fear 6.7%
Disgusted 5.6%
Happy 2.9%
Angry 2%

AWS Rekognition

Age 31-41
Gender Male, 83.9%
Disgusted 54.6%
Sad 14.8%
Happy 13.8%
Surprised 8.7%
Fear 7.3%
Confused 5.3%
Calm 2.3%
Angry 2%

AWS Rekognition

Age 18-26
Gender Female, 99.7%
Calm 82.7%
Sad 7.8%
Surprised 6.4%
Fear 6.3%
Happy 3.6%
Angry 1.2%
Disgusted 0.9%
Confused 0.7%

Microsoft Cognitive Services

Age 68
Gender Male

Microsoft Cognitive Services

Age 66
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Adult 99.3%
Male 99.3%
Man 99.3%
Boy 97.7%
Child 97.7%
Shoe 61.8%