Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.959

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 99.6
Male 99.6
Man 99.6
Person 99.6
Clothing 96
Hat 96
Animal 94.1
Horse 94.1
Mammal 94.1
Outdoors 90.2
Nature 82.1
Person 81.9
Horse 77.5
Face 72.4
Head 72.4
Countryside 72
Transportation 66.2
Vehicle 66.2
Machine 61.6
Wheel 61.6
Rural 57.3
Soil 56.9
Colt Horse 56.8
Wagon 56.7
Garden 56.2
Gardener 56.2
Gardening 56.2
Andalusian Horse 56.1
Sun Hat 55.1
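
Each service above pairs a label with a confidence score (0-100). A minimal sketch of filtering such tags by a confidence threshold, using a hypothetical `filter_tags` helper over a subset of the Amazon tags listed:

```python
# Subset of the Amazon Rekognition tags above as (label, confidence) pairs.
amazon_tags = [
    ("Adult", 99.6), ("Male", 99.6), ("Man", 99.6), ("Person", 99.6),
    ("Clothing", 96.0), ("Hat", 96.0), ("Animal", 94.1), ("Horse", 94.1),
    ("Mammal", 94.1), ("Outdoors", 90.2), ("Nature", 82.1),
    ("Wheel", 61.6), ("Wagon", 56.7),
]

def filter_tags(tags, threshold):
    """Keep only labels whose confidence meets the threshold."""
    return [label for label, conf in tags if conf >= threshold]

# Labels at or above 90% confidence.
high_confidence = filter_tags(amazon_tags, 90.0)
```

With a 90.0 threshold, only the first ten labels (through "Outdoors") survive; lowering the threshold admits progressively less certain labels such as "Wagon".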

Clarifai
created on 2018-05-11

people 100
adult 99.6
group together 99.2
group 98.8
two 98
man 97.8
vehicle 97.6
one 95.8
three 95.6
watercraft 95.1
war 91.7
wear 91.7
transportation system 90.9
military 90.6
several 90.5
four 90.4
lid 88.1
many 86.7
woman 86.3
child 86.3

Imagga
created on 2023-10-06

tool 28.7
vessel 25.2
shovel 22.3
outdoor 20.6
rake 20.6
swing 19.9
park 16.5
man 15.4
people 15
outdoors 15
bucket 14.5
mechanical device 14.2
container 13.5
old 13.2
cleaner 13.2
plaything 12.9
adult 12.3
person 12.2
mechanism 10.8
water 10.7
male 10.6
work 10.3
hand tool 10.3
grass 10.3
tree 10.2
happy 10
resort area 9.9
child 9.5
boat 9.3
black 9
trees 8.9
working 8.8
play 8.6
holiday 8.6
outside 8.5
industry 8.5
portrait 8.4
garden 8.4
fun 8.2
dirty 8.1
active 8.1
worker 8
travel 7.7
attractive 7.7
area 7.7
barrow 7.7
walk 7.6
wood 7.5
vacation 7.4
cheerful 7.3
playing 7.3
snow 7.3
activity 7.2
happiness 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 95.8
person 92.4
man 91.4
standing 77.6
black 68.9
old 67.5
white 62.2

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 99.2%
Calm 99.5%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.2%
Confused 0.1%
Disgusted 0.1%
Happy 0%
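
The Rekognition emotion confidences above are scored independently per emotion, so they need not sum to 100%. A minimal sketch of selecting the dominant emotion from these scores:

```python
# AWS Rekognition emotion confidences from the face analysis above.
emotions = {
    "Calm": 99.5, "Surprised": 6.3, "Fear": 5.9, "Sad": 2.2,
    "Angry": 0.2, "Confused": 0.1, "Disgusted": 0.1, "Happy": 0.0,
}

# The emotion with the highest confidence score.
dominant = max(emotions, key=emotions.get)
```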

Microsoft Cognitive Services

Age 36
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.6%
Male 99.6%
Man 99.6%
Person 99.6%
Horse 94.1%
Wheel 61.6%
