Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.925

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Pants 100
People 99.8
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 99
Adult 99
Male 99
Man 99
Neighborhood 98.8
Person 98.6
Adult 98.6
Male 98.6
Man 98.6
Animal 97.8
Canine 97.8
Dog 97.8
Mammal 97.8
Pet 97.8
Architecture 97.8
Building 97.8
Outdoors 97.8
Shelter 97.8
Person 97.5
Face 95.4
Head 95.4
Photography 95.4
Portrait 95.4
Nature 91.9
Person 90.9
Person 90.2
Adult 90.2
Bride 90.2
Female 90.2
Wedding 90.2
Woman 90.2
Person 81.9
Vest 80.1
Yard 68.5
Housing 65.4
Coat 64.7
Person 62.9
Snow 60.4
Footwear 57.8
Shoe 57.8
Hat 57.1
Officer 56.8
Hound 56.5
Firearm 56.4
Gun 56.4
Rifle 56.4
Weapon 56.4
Jeans 56.4
Door 56.3
Walking 56.3
House 56.2
Jacket 55.8
Backyard 55.5
Brick 55.5
Lifejacket 55.1
Puppy 55
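The Amazon list above has the flat "label confidence" shape of an Amazon Rekognition `detect_labels` response, in which each detected instance also carries its parent categories (which is why runs like Person/Adult/Male/Man repeat at identical scores). A minimal sketch of that flattening, with sample data abridged from the list above (this is an illustration, not the museum's actual pipeline; a live call would be `rekognition.detect_labels(Image={'Bytes': jpeg}, MinConfidence=55)` and requires AWS credentials):

```python
# Sample of a Rekognition detect_labels response, abridged from the tags above.
sample_response = {
    "Labels": [
        {"Name": "Clothing", "Confidence": 99.99, "Parents": []},
        {"Name": "Person", "Confidence": 99.3, "Parents": []},
        {"Name": "Dog", "Confidence": 97.8,
         "Parents": [{"Name": "Animal"}, {"Name": "Canine"},
                     {"Name": "Mammal"}, {"Name": "Pet"}]},
    ]
}

def flatten_labels(response):
    """Return (name, confidence) pairs sorted by descending confidence.

    Each label is emitted with its own score, followed by its parent
    categories at the same score -- producing repeated groups like
    Dog/Animal/Canine/Mammal/Pet 97.8 in the flat tag list.
    """
    rows = []
    for label in response["Labels"]:
        rows.append((label["Name"], round(label["Confidence"], 1)))
        for parent in label.get("Parents", []):
            rows.append((parent["Name"], round(label["Confidence"], 1)))
    rows.sort(key=lambda r: -r[1])  # stable sort keeps parents after their label
    return rows

for name, conf in flatten_labels(sample_response):
    print(name, conf)
```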

Clarifai
created on 2018-05-11

people 100
group together 99.4
group 99
many 98.7
administration 97.3
child 96.1
adult 95.7
man 93.5
several 93
military 92.6
war 89.2
outfit 86.7
canine 86.5
wear 84.7
leader 84.3
police 83.8
woman 83.4
five 82.8
offense 81.6
recreation 80.4

Imagga
created on 2023-10-06

weapon 46.4
sword 41.9
man 28.9
people 26.2
adult 19.2
military 17.4
person 17.4
male 15.6
danger 15.5
clothing 15.3
walking 15.1
uniform 14.8
men 14.6
outdoor 14.5
soldier 13.7
protection 13.6
war 13.5
street 12.9
wind instrument 12.8
mask 12.7
sport 12.7
musical instrument 12.6
city 12.5
world 12.1
outdoors 12
brass 11.1
two 11
destruction 10.7
protective 10.7
walk 10.5
gun 10.3
black 10.3
day 10.2
industrial 10
travel 9.9
radioactive 9.8
radiation 9.8
accident 9.8
toxic 9.8
old 9.7
portrait 9.7
nuclear 9.7
chemical 9.7
gas 9.6
urban 9.6
trombone 9.6
holding 9.1
dirty 9
activity 8.9
history 8.9
stalker 8.9
camouflage 8.8
army 8.8
standing 8.7
rifle 8.5
safety 8.3
suit 8.3
fun 8.2
girls 8.2
group 8.1
couple 7.8
disaster 7.8
defense 7.8
protect 7.7
attractive 7.7
industry 7.7
dark 7.5
leisure 7.5
silhouette 7.4
smoke 7.4
active 7.4
environment 7.4
business 7.3
women 7.1
together 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.7
building 99.4
person 96.3
standing 75.7
people 65.4
group 62.2

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 99.9%
Calm 87.5%
Surprised 6.5%
Fear 6%
Sad 4.1%
Confused 3.7%
Angry 2.5%
Disgusted 0.6%
Happy 0.4%

AWS Rekognition

Age 60-70
Gender Male, 99.8%
Calm 59.2%
Confused 31%
Sad 7.1%
Surprised 6.5%
Fear 5.9%
Disgusted 0.3%
Angry 0.2%
Happy 0.1%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Happy 94.2%
Surprised 6.7%
Fear 6.3%
Sad 2.4%
Calm 1.8%
Angry 0.6%
Disgusted 0.4%
Confused 0.2%

AWS Rekognition

Age 53-61
Gender Male, 99.9%
Sad 99.8%
Disgusted 17.4%
Confused 7.8%
Surprised 6.6%
Fear 6.1%
Calm 1.5%
Angry 0.9%
Happy 0.3%

AWS Rekognition

Age 6-14
Gender Female, 99.9%
Fear 36%
Calm 35.6%
Surprised 18.9%
Happy 7.8%
Disgusted 3.6%
Sad 2.7%
Angry 2.6%
Confused 1.4%

AWS Rekognition

Age 23-33
Gender Male, 99.8%
Confused 48%
Sad 30.1%
Calm 13%
Surprised 12.8%
Fear 6.1%
Angry 4.9%
Disgusted 3.2%
Happy 0.3%
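Each AWS Rekognition block above corresponds to one `FaceDetail` entry from a `detect_faces` call with `Attributes=['ALL']`: an estimated age range, a gender value with confidence, and a confidence for every emotion category, sorted descending. A hedged sketch of that formatting, with sample data taken from the first block above (illustrative only; the live call `rekognition.detect_faces(Image={'Bytes': jpeg}, Attributes=['ALL'])` needs AWS credentials):

```python
# One FaceDetail entry, abridged from the first face-analysis block above.
sample_face = {
    "AgeRange": {"Low": 28, "High": 38},
    "Gender": {"Value": "Male", "Confidence": 99.9},
    "Emotions": [
        {"Type": "CALM", "Confidence": 87.5},
        {"Type": "SURPRISED", "Confidence": 6.5},
        {"Type": "FEAR", "Confidence": 6.0},
        {"Type": "SAD", "Confidence": 4.1},
    ],
}

def summarize_face(face):
    """Format one FaceDetail like the catalog's face-analysis blocks."""
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']}%",
    ]
    # Emotions cover all categories; list them by descending confidence.
    for emo in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        lines.append(f"{emo['Type'].capitalize()} {emo['Confidence']}%")
    return "\n".join(lines)

print(summarize_face(sample_face))
```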

Microsoft Cognitive Services

Age 39
Gender Male

Microsoft Cognitive Services

Age 63
Gender Male

Microsoft Cognitive Services

Age 67
Gender Male

Microsoft Cognitive Services

Age 29
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Adult 99.3%
Male 99.3%
Man 99.3%
Dog 97.8%
Bride 90.2%
Female 90.2%
Woman 90.2%
Shoe 57.8%