Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2140

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

People 99.9
Person 99.2
Adult 99.2
Male 99.2
Man 99.2
Person 99
Adult 99
Male 99
Man 99
Person 99
Adult 99
Male 99
Man 99
Person 98.8
Adult 98.8
Male 98.8
Man 98.8
Person 97.8
Adult 97.8
Male 97.8
Man 97.8
Bathing 91.9
Washing 90.2
Outdoors 88.4
Nature 72.6
Face 61.7
Head 61.7
Garden 61.2
Bucket 57.8
Clothing 57.8
Hat 57.8
Gardener 56.5
Gardening 56.5
Soil 56.5
Tub 55.3
Pottery 55.2
Cleaning 55

Clarifai
created on 2018-05-10

people 99.9
group together 99.3
group 99.2
adult 98.8
man 98.5
many 97.4
woman 96.5
several 96.2
four 94.7
five 92.6
recreation 91.8
child 90.6
three 90.5
military 87.5
war 86.9
bucket 86.2
sit 83.8
boy 81.6
wear 79.6
family 78.4

Imagga
created on 2023-10-07

potter's wheel 23.8
statue 23.8
sculpture 22.1
wheel 21.9
container 17.6
architecture 15.6
outdoors 15.2
man 14.9
travel 14.1
vessel 14
people 13.9
old 13.9
kin 13.7
machine 13.7
monument 12.1
milk can 12
art 11.9
world 11.8
history 11.6
tourism 11.5
ancient 11.2
famous 11.2
culture 11.1
religion 10.8
building 10.6
sitting 10.3
stone 10.3
male 10
outdoor 9.9
landmark 9.9
soldier 9.8
portrait 9.7
military 9.7
mechanical device 9.5
religious 9.4
can 9.3
city 9.1
park 9.1
seller 8.8
bucket 8.6
child 8.6
historical 8.5
traditional 8.3
fountain 8.2
lifestyle 7.9
adult 7.9
person 7.8
palace 7.7
two 7.6
sport 7.6
leisure 7.5
historic 7.3
black 7.2
bench 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 99.3
tree 98.9
person 97.6
people 67.4
group 62.5
white 62.4
old 46.6

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 65.9%
Sad 83.6%
Disgusted 56.3%
Surprised 6.4%
Fear 6.1%
Calm 2.2%
Angry 1.5%
Confused 0.7%
Happy 0.2%

AWS Rekognition

Age 33-41
Gender Male, 99.9%
Calm 42.1%
Angry 29%
Disgusted 19.8%
Surprised 7.4%
Fear 6.4%
Sad 3.3%
Confused 2%
Happy 0.6%

Microsoft Cognitive Services

Age 32
Gender Male

Feature analysis

Amazon

Person 99.2%
Adult 99.2%
Male 99.2%
Man 99.2%
