Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3066

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Worker 99.1
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Face 90.2
Head 90.2
Outdoors 76.8
Machine 72.3
Wheel 72.3
People 64.8
Clothing 57.9
Hat 57.9
Agriculture 57.3
Countryside 57.3
Field 57.3
Nature 57.3
Animal 57
Mammal 57
Construction 55.8
Cap 55.3
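
The labels above follow the shape of Amazon Rekognition's label-detection output (label name plus a confidence percentage). Purely as an illustrative sketch, not the pipeline actually used for this record, the same kind of list could be produced with boto3; the filename and thresholds below are assumptions:

import boto3

# Hypothetical local filename; the actual image source is not given in this record.
IMAGE_PATH = "shahn_wheat_harvest_1938.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},   # send the image inline rather than via S3
        MaxLabels=25,
        MinConfidence=55,            # roughly the lowest score shown in the list above
    )

# Print "Label confidence" pairs, mirroring the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")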

Clarifai
created on 2018-05-10

people 99.9
adult 99.6
one 97.9
man 97.8
group together 97.8
group 97
two 96.3
wear 95.5
vehicle 94.2
military 93.6
woman 91.9
war 91.9
three 91.3
portrait 90.1
administration 88.7
uniform 87.5
gun 86.9
several 85.9
soldier 85.7
weapon 85.6

Imagga
created on 2023-10-06

person 25
statue 23.2
man 22.8
people 21.2
religion 18.8
sculpture 18.5
male 15.7
adult 15.6
art 15.3
face 14.2
old 13.9
religious 13.1
culture 12.8
city 12.5
hat 12.3
men 12
faith 11.5
portrait 11
dress 10.8
human 10.5
detail 10.5
clothing 10.4
guy 10.2
architecture 10.2
traditional 10
history 9.8
black 9.7
urban 9.6
god 9.6
building 9.5
ancient 9.5
travel 9.1
hair 8.7
couple 8.7
love 8.7
horse 8.5
two 8.5
stone 8.4
monument 8.4
emotion 8.3
fashion 8.3
tourism 8.2
peace 8.2
one 8.2
industrial 8.2
jinrikisha 8
machinist 8
white 7.9
temple 7.9
catholic 7.8
attractive 7.7
hand 7.6
happy 7.5
dark 7.5
church 7.4
street 7.4
smiling 7.2
body 7.2
worker 7.1
smile 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.6
outdoor 99
man 90.9

Color Analysis

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 100%
Angry 99.3%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Calm 0.4%
Confused 0.1%
Disgusted 0%
Happy 0%
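
The age range, gender, and emotion rows above match the structure of Amazon Rekognition's face-detection response. A minimal sketch along those lines, again with a hypothetical filename and not the museum's actual workflow:

import boto3

IMAGE_PATH = "shahn_wheat_harvest_1938.jpg"  # hypothetical filename
rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    faces = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],          # request age range, gender, and emotions
    )

for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    # Emotion scores are independent confidences and need not sum to 100%,
    # which is why high and low scores can appear together as in the list above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")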

Microsoft Cognitive Services

Age 50
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
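
The Google Vision rows report likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch using the google-cloud-vision Python client (v2+ interface), with an assumed filename and credentials configured in the environment:

from google.cloud import vision

IMAGE_PATH = "shahn_wheat_harvest_1938.jpg"  # hypothetical filename

client = vision.ImageAnnotatorClient()
with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum value, e.g. VERY_UNLIKELY or POSSIBLE.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)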

Feature analysis

Amazon

Adult 99.1%
Male 99.1%
Man 99.1%
Person 99.1%
Wheel 72.3%

Captions

Microsoft
created on 2018-05-10

a man holding a gun 74.4%
a man holding a baseball bat 28.8%
a man wearing a hat 28.7%
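
These ranked captions with confidences are the kind of output returned by Microsoft's Computer Vision describe endpoint. A sketch using the current Azure Python SDK, which is not necessarily the interface used when these 2018 captions were generated; the endpoint, key, and filename are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical endpoint, key, and filename; none of these appear in this record.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "<your-key>"
IMAGE_PATH = "shahn_wheat_harvest_1938.jpg"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open(IMAGE_PATH, "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

# Each candidate caption carries its own confidence, as in the list above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")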