Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.961

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

People 99.7
Person 99.2
Adult 99.2
Male 99.2
Man 99.2
Person 99
Adult 99
Male 99
Man 99
Person 99
Adult 99
Male 99
Man 99
Person 98.8
Adult 98.8
Male 98.8
Man 98.8
Person 98
Adult 98
Male 98
Man 98
Washing 95.1
Bathing 92.8
Outdoors 85.5
Nature 68.2
Head 64.9
Bucket 60.9
Face 57.7
Clothing 56.6
Shorts 56.6
Hat 56.1
Garden 55.7
Soil 55.3

Clarifai
created on 2018-05-11

people 100
group together 99.5
group 99.3
adult 99.2
man 98.7
many 98.1
several 97.8
woman 96.6
four 96.4
five 94.2
recreation 92.9
child 91.8
three 91.8
military 91
administration 90
sit 85.7
wear 84.6
leader 84.5
war 84.3
boy 83.4

Imagga
created on 2023-10-05

potter's wheel 30.2
wheel 27.7
washboard 23.2
outdoors 19.7
people 18.9
device 18.4
man 18.1
machine 17.4
person 17.2
sculpture 15.3
statue 14.9
male 14.2
sitting 13.7
container 13.6
child 13.5
adult 13.1
old 12.5
vessel 12.3
mechanical device 12.1
happy 11.9
smiling 11.6
family 11.6
boy 11.3
lifestyle 10.8
art 10.5
seller 10.5
portrait 10.3
children 10
mother 9.9
religion 9.9
cheerful 9.7
together 9.6
love 9.5
men 9.4
architecture 9.4
religious 9.4
park 9.1
kid 8.9
military 8.7
kin 8.6
stone 8.5
two 8.5
bench 8.4
clothing 8.3
fun 8.2
dirty 8.1
work 8.1
home 8
world 7.9
couple 7.8
smile 7.8
mask 7.8
soldier 7.8
outdoor 7.6
garden 7.5
monument 7.5
holding 7.4
landmark 7.2
black 7.2
little 7.1
day 7.1
bucket 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

tree 99.8
outdoor 99.6
person 97.9
white 64.6
old 56.2
people 56.1
group 56

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Female, 79.3%
Sad 99.8%
Calm 14.5%
Disgusted 9.2%
Surprised 6.5%
Fear 6.3%
Angry 1.7%
Confused 1%
Happy 0.4%

AWS Rekognition

Age 21-29
Gender Male, 99.9%
Calm 62.6%
Disgusted 20.5%
Confused 10.9%
Surprised 7.5%
Fear 6.1%
Sad 2.6%
Happy 0.9%
Angry 0.9%

Feature analysis

Amazon

Person 99.2%
Adult 99.2%
Male 99.2%
Man 99.2%

Categories

Imagga

paintings art 92.8%
people portraits 5.4%