Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2141

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Pants 100
People 99.9
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Neighborhood 99.3
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 99
Adult 99
Male 99
Man 99
Person 98.5
Animal 97.7
Canine 97.7
Dog 97.7
Mammal 97.7
Pet 97.7
Person 97.6
Face 95.4
Head 95.4
Photography 95.4
Portrait 95.4
Person 93.6
Vest 91.7
Outdoors 89.8
Person 85.9
Nature 84
Person 83.8
Walking 76.3
Yard 67.7
Person 67.2
Architecture 62.8
Building 62.8
Shelter 62.8
Hat 61.7
Jeans 60.3
Footwear 58.4
Shoe 58.4
Hound 57.9
Coat 57.8
Housing 57.1
Soil 56.8
Plant 56.3
Tree 56.3
Lifejacket 55.8
Brick 55.7
Hunting 55.7
Golden Retriever 55.4
Jacket 55.4
Firearm 55.4
Gun 55.4
Rifle 55.4
Weapon 55.4
Grass 55.4
Shorts 55.1
Backyard 55

Clarifai
created on 2018-05-10

people 99.9
group 99.2
group together 99
many 98
adult 96.3
man 95.9
administration 95.7
child 91.6
military 89.4
woman 89.3
several 87.5
war 84.8
outfit 83.1
wear 82.5
five 81.8
police 80.9
offense 80.5
law 80
canine 79.4
crowd 78.6

Imagga
created on 2023-10-06

weapon 30.4
man 28.9
people 27.9
sword 26.3
world 18
person 17.9
kin 17.8
male 17.7
adult 16.6
walking 16.1
city 15.8
street 15.6
outdoor 14.5
military 13.5
old 13.2
couple 13.1
pedestrian 11.5
men 11.2
two 11
sport 10.9
soldier 10.8
walk 10.5
child 10.1
protection 10
danger 10
travel 9.9
family 9.8
together 9.6
war 9.6
urban 9.6
boy 9.6
standing 9.6
uniform 9.5
outdoors 9.2
tourism 9.1
portrait 9.1
fun 9
mask 8.7
love 8.7
day 8.6
clothing 8.6
girls 8.2
dirty 8.1
history 8
mother 8
women 7.9
protective 7.8
statue 7.7
tourist 7.7
culture 7.7
human 7.5
life 7.5
holding 7.4
black 7.3
active 7.3
industrial 7.3
gun 7.3
group 7.3
lifestyle 7.2
religion 7.2
to 7.1
architecture 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 99.7
building 99.3
person 97.4
standing 77.7
group 72.7
people 67.6
clothes 18.8

Color Analysis

Face analysis
AWS Rekognition

Age 31-41
Gender Male, 100%
Calm 62.2%
Angry 34.7%
Surprised 6.4%
Fear 5.9%
Sad 2.6%
Confused 0.8%
Happy 0.3%
Disgusted 0.2%

AWS Rekognition

Age 40-48
Gender Male, 100%
Happy 95.3%
Surprised 6.5%
Fear 6.1%
Calm 2.3%
Sad 2.3%
Angry 0.4%
Confused 0.2%
Disgusted 0.2%

AWS Rekognition

Age 50-58
Gender Male, 99.8%
Sad 97.4%
Disgusted 33.3%
Confused 11.1%
Surprised 6.6%
Fear 6.2%
Calm 1%
Angry 0.9%
Happy 0.3%

AWS Rekognition

Age 64-74
Gender Male, 99.9%
Confused 58.1%
Calm 27.7%
Sad 9.8%
Surprised 6.6%
Fear 5.9%
Disgusted 1.5%
Angry 0.7%
Happy 0.1%

AWS Rekognition

Age 7-17
Gender Female, 97.7%
Fear 92.6%
Disgusted 8.2%
Surprised 7.2%
Calm 5.8%
Angry 4.3%
Sad 2.2%
Confused 0.4%
Happy 0.3%

AWS Rekognition

Age 22-30
Gender Male, 98.4%
Sad 100%
Surprised 6.3%
Fear 5.9%
Disgusted 1.5%
Angry 0.4%
Calm 0.3%
Confused 0.2%
Happy 0%

Microsoft Cognitive Services

Age 36
Gender Male

Microsoft Cognitive Services

Age 65
Gender Male

Microsoft Cognitive Services

Age 52
Gender Male

Microsoft Cognitive Services

Age 31
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Adult 99.3%
Male 99.3%
Man 99.3%
Dog 97.7%
Shoe 58.4%