Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2203

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

People 99.6
Person 99.2
Adult 99.2
Male 99.2
Man 99.2
Person 99
Adult 99
Bride 99
Female 99
Wedding 99
Woman 99
Person 98.9
Adult 98.9
Male 98.9
Man 98.9
Person 98.8
Adult 98.8
Male 98.8
Man 98.8
Person 98.8
Adult 98.8
Male 98.8
Man 98.8
Person 98.8
Adult 98.8
Male 98.8
Man 98.8
Person 98.7
Adult 98.7
Male 98.7
Man 98.7
Photography 97.4
Person 96.1
Person 93.7
Outdoors 90.9
Nature 83.7
Face 81.4
Head 81.4
Back 78.7
Body Part 78.7
Clothing 74.4
Jeans 74.4
Pants 74.4
Person 72.7
Jeans 70.2
Hat 57.6
Plant 57.6
Tree 57.6
Architecture 56.8
Building 56.8
Shelter 56.8
Dress 56.5
Formal Wear 56.5
Baseball 56.3
Baseball Glove 56.3
Glove 56.3
Sport 56.3
Countryside 55.9
Portrait 55.6
Shorts 55.5
Firearm 55.4
Gun 55.4
Rifle 55.4
Weapon 55.4
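
The Amazon tags above are confidence-scored labels returned by an image-recognition API. As a rough illustrative sketch only (the filename and threshold below are hypothetical, not part of this record, and AWS credentials are assumed to be configured), label/confidence pairs in the same "Name Confidence" form could be produced with Amazon Rekognition's DetectLabels call via boto3:

import boto3

def tag_image(path, min_confidence=50.0):
    """Return (label, confidence) pairs for a local image file."""
    client = boto3.client("rekognition")        # assumes AWS credentials are configured
    with open(path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},           # small images can be sent inline as bytes
        MinConfidence=min_confidence,           # drop labels below this confidence
    )
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

# Print tags in the same "Name Confidence" layout used above (hypothetical filename).
for name, conf in tag_image("shahn_wheat_harvest.jpg"):
    print(f"{name} {conf:.1f}")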

Clarifai
created on 2018-05-10

people 99.9
group 99
group together 98.8
adult 98
man 97.9
woman 95.8
child 95
many 94.4
several 93.3
family 88.4
war 87.8
five 86.8
wear 86.5
military 85.9
boy 85.8
leader 85.6
administration 85.5
soldier 83
home 80.7
four 76.8

Imagga
created on 2023-10-07

people 22.9
old 18.8
person 18.5
world 18.1
man 18.1
winter 14.5
male 12.8
snow 12.8
adult 12.7
religion 12.5
travel 12
city 11.6
kin 11.5
statue 10.8
history 10.7
tourism 10.7
musical instrument 10.6
military 10.6
war 10.6
child 10.5
weapon 10.4
uniform 10.4
religious 10.3
men 10.3
clothing 10.2
marimba 10
group 9.7
percussion instrument 9.6
couple 9.6
two 9.3
tradition 9.2
outdoor 9.2
outdoors 9.1
mother 8.7
cold 8.6
architecture 8.6
walking 8.5
building 8.5
fan 8.3
tourist 8.3
historic 8.3
family 8
together 7.9
portrait 7.8
walk 7.6
life 7.5
traditional 7.5
street 7.4
holiday 7.2
women 7.1
love 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 98.9
outdoor 95.5
standing 91
people 91
group 86.4
black 75.2
posing 70.7
old 55.8
family 30.6
crowd 0.8

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 99.8%
Happy 97.2%
Surprised 6.5%
Fear 5.9%
Sad 2.3%
Disgusted 0.9%
Angry 0.5%
Calm 0.2%
Confused 0.1%

AWS Rekognition

Age 18-26
Gender Male, 99.7%
Calm 90.7%
Surprised 6.5%
Fear 6%
Sad 3.8%
Happy 1.6%
Confused 1.4%
Angry 0.7%
Disgusted 0.7%

AWS Rekognition

Age 59-67
Gender Male, 99.9%
Surprised 74.5%
Confused 49.9%
Fear 6.2%
Sad 2.7%
Calm 1.6%
Disgusted 1%
Angry 0.7%
Happy 0.2%

AWS Rekognition

Age 22-30
Gender Male, 100%
Sad 100%
Surprised 6.3%
Fear 5.9%
Confused 0.1%
Calm 0.1%
Disgusted 0%
Angry 0%
Happy 0%

AWS Rekognition

Age 18-26
Gender Male, 97.1%
Sad 99.7%
Confused 15%
Fear 9.2%
Surprised 8.7%
Calm 2.6%
Angry 1.9%
Disgusted 0.7%
Happy 0.5%

AWS Rekognition

Age 19-27
Gender Female, 94.7%
Fear 57.6%
Calm 28.4%
Sad 10.1%
Happy 9.3%
Surprised 6.7%
Angry 2.1%
Confused 1%
Disgusted 0.9%
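
The age, gender, and emotion blocks above follow the shape of Amazon Rekognition's DetectFaces response. A minimal sketch, assuming AWS credentials are configured and using a hypothetical filename, could print the same fields per detected face:

import boto3

def describe_faces(path):
    client = boto3.client("rekognition")         # assumes AWS credentials are configured
    with open(path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],                      # request age range, gender, emotions, etc.
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]                   # {'Low': ..., 'High': ...}
        gender = face["Gender"]                  # {'Value': ..., 'Confidence': ...}
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

describe_faces("shahn_wheat_harvest.jpg")        # hypothetical filename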

Microsoft Cognitive Services

Age 23
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
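
The Google Vision entries above report per-face likelihood ratings ("Very unlikely" through "Very likely") rather than percentages. A minimal sketch using the google-cloud-vision client (hypothetical filename; assumes application credentials are set) that prints the same attributes:

from google.cloud import vision

def face_likelihoods(path):
    client = vision.ImageAnnotatorClient()       # assumes GOOGLE_APPLICATION_CREDENTIALS is set
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY or VERY_LIKELY.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)

face_likelihoods("shahn_wheat_harvest.jpg")      # hypothetical filename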

Feature analysis

Amazon

Person 99.2%
Adult 99.2%
Male 99.2%
Man 99.2%
Bride 99%
Female 99%
Woman 99%
Jeans 74.4%