Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.974

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Garden 100
Nature 100
Outdoors 100
Gardener 99.9
Gardening 99.9
Person 99.4
Adult 99.4
Male 99.4
Man 99.4
Person 99.2
Adult 99.2
Male 99.2
Man 99.2
Soil 98.9
Machine 91.3
Wheel 91.3
Clothing 85.2
Coat 85.2
Face 80.8
Head 80.8
Wheel 76.8
Bicycle 73.4
Transportation 73.4
Vehicle 73.4
Grass 57.3
Plant 57.3
Countryside 57.2
Tree 57.2
Agriculture 56.9
Field 56.9
Bathing 56.9
Bucket 56.3
Rural 55.5
Washing 55.2

Clarifai
created on 2018-05-11

people 100
adult 98.9
group 97.8
man 97.2
group together 96.5
two 95.5
three 93.5
administration 93.2
military 91.7
war 90.6
leader 88.7
soldier 86.7
one 86.2
four 86
wear 85.4
golfer 83
many 81.3
woman 80.9
five 79.7
recreation 79.2

Imagga
created on 2023-10-06

barrow 100
handcart 80.7
wheeled vehicle 61.3
vehicle 39.5
man 26.9
conveyance 25.7
people 21.8
outdoors 21.7
male 21.3
outdoor 19.1
child 17.6
person 17.3
tool 16.8
boy 16.5
outside 16.3
grass 15.8
walking 14.2
bench 14.1
old 13.9
sunset 13.5
autumn 13.2
park 13.2
lifestyle 13
farmer 12.9
countryside 11.9
country 11.4
couple 11.3
landscape 11.2
summer 10.9
sky 10.8
happy 10.7
adult 10.4
love 10.3
beach 10.1
field 10
fall 10
rural 9.7
happiness 9.4
garden 9.2
leisure 9.1
girls 9.1
fun 9
family 8.9
kid 8.9
to 8.9
tree 8.8
forest 8.7
stretcher 8.7
day 8.6
elderly 8.6
sitting 8.6
men 8.6
smile 8.6
two 8.5
senior 8.4
silhouette 8.3
holding 8.3
environment 8.2
rake 8.2
children 8.2
lady 8.1
active 8.1
together 7.9
portrait 7.8
play 7.8
hiking 7.7
attractive 7.7
walk 7.6
joy 7.5
vacation 7.4
natural 7.4
meadow 7.2
recreation 7.2
farm 7.1
mountain 7.1
working 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.9
tree 99.5
person 95.4
man 93.7
old 50.2

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 99.8%
Happy 87.1%
Surprised 7.5%
Fear 6.4%
Sad 2.8%
Disgusted 2.6%
Confused 2.5%
Calm 1.3%
Angry 0.9%

Microsoft Cognitive Services

Age 56
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Adult 99.4%
Male 99.4%
Man 99.4%
Wheel 91.3%
Bicycle 73.4%

Captions

Microsoft
created on 2018-05-11

a man sitting on a bench 71.9%
a man standing next to a bench 71.8%
an old photo of a man 71.7%