Human Generated Data

Title

Untitled ("Hooverville," Circleville, Ohio)

Date

July 1938-August 1938

People

Artist: Ben Shahn, American (1898 - 1969)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2636

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.2
Human 99.2
Person 99.2
Person 98.1
Clothing 94
Apparel 94
Face 92.2
Car 90.8
Transportation 90.8
Vehicle 90.8
Automobile 90.8
Wood 74.1
Worker 70
People 66.4
Outdoors 63.2
Building 61.8
Hardhat 61.5
Helmet 61.5
Hat 60
Urban 59.9
Carpenter 56.1
Countryside 55.9
Nature 55.9
Beard 55.4
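
The label-and-confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels API. As a minimal sketch (not the museum's actual pipeline; the file name, region, and thresholds below are illustrative assumptions), tags like these could be produced with boto3 roughly as follows:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

# Hypothetical local copy of the photograph
with open("shahn_hooverville_circleville.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=30,        # cap on the number of labels returned
    MinConfidence=55.0,  # roughly matches the lowest scores in the list above
)

# Print "Label confidence" pairs in the same style as the tag list
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")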

Clarifai
created on 2023-10-15

people 99.9
portrait 99.5
adult 98.5
man 98.3
elderly 98.1
group 97.7
monochrome 96.2
family 96.1
three 96
two 95.2
woman 93.8
child 93.7
son 92.7
old 91.9
group together 90.2
sadness 89.9
facial expression 88.9
four 88.4
documentary 86.2
offspring 83.9

Imagga
created on 2021-12-15

statue 71.3
sculpture 57.1
kin 38
art 28.6
fountain 27.2
architecture 26.6
ancient 25.1
stone 24.5
monument 23.4
history 23.3
old 23
marble 22.5
culture 20.5
world 19.9
religion 19.7
structure 19.3
travel 19
carving 18.5
historic 17.4
tourism 17.3
antique 17.3
historical 17
building 16.8
religious 14.1
landmark 13.6
god 13.4
city 13.3
famous 13
figure 13
detail 12.9
column 12.4
face 12.1
decoration 11.6
man 10.8
vintage 10.8
catholic 10.7
portrait 10.4
head 10.1
people 9.5
cadaver 9.3
church 9.3
temple 9.2
traditional 9.2
statues 8.9
sibling 8.8
carved 8.8
roman 8.8
memorial 8.1
mask 8
home 8
worship 7.7
heritage 7.7
holy 7.7
sky 7.7
outdoor 7.7
disguise 7.6
decorative 7.5
closeup 7.4
tourist 7.3
plastic art 7.2
love 7.1
male 7.1
grave 7
mother 7

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 98.2
person 95.3
clothing 93.7
human face 90.9
man 90.6
window 86.7
old 85.4
smile 65.8
black and white 50.4

Color Analysis

Face analysis

AWS Rekognition

Age 6-16
Gender Female, 69.6%
Fear 30.5%
Sad 29.5%
Happy 12.1%
Angry 9.8%
Calm 8.8%
Confused 6.8%
Surprised 1.4%
Disgusted 1.2%

AWS Rekognition

Age 47-65
Gender Male, 97.9%
Sad 74.4%
Calm 24.5%
Angry 0.4%
Confused 0.4%
Fear 0.1%
Surprised 0.1%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 3-11
Gender Female, 55.4%
Happy 91.2%
Sad 3.1%
Calm 2.1%
Fear 1.1%
Confused 0.9%
Angry 0.8%
Surprised 0.5%
Disgusted 0.4%
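
Breakdowns like the three above (estimated age range, gender, and a per-emotion confidence distribution) match the shape of Amazon Rekognition's DetectFaces output when all attributes are requested. A minimal, self-contained sketch, with an assumed file name and region:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("shahn_hooverville_circleville.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotion scores
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; sort descending to mirror the listing above
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")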

Microsoft Cognitive Services

Age 15
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
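
The ratings above (Very unlikely, Unlikely, Very likely, and so on) are the bucketed likelihood values Google Cloud Vision reports per detected face rather than numeric scores. A minimal sketch of how they could be retrieved, assuming a local copy of the image and application default credentials:

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("shahn_hooverville_circleville.jpg", "rb") as f:  # hypothetical local copy
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Each field is a Likelihood enum, e.g. VERY_UNLIKELY or VERY_LIKELY
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)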

Feature analysis

Amazon

Person 99.2%
Car 90.8%

Categories

Captions

Microsoft
created on 2021-12-15

an old photo of a man 89.4%
old photo of a man 87.5%
a man sitting in front of a window 60.1%

Text analysis

Amazon

N41763

Google

NA1 763
NA1
763
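
Readings such as "N41763" and "NA1 763" above are the kind of short strings returned by OCR endpoints, for example Amazon Rekognition's DetectText. A minimal sketch, assuming a local copy of the image (file name and region are illustrative):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("shahn_hooverville_circleville.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# "LINE" entries are whole lines of detected text; "WORD" entries are the
# individual tokens, so a single marking can appear more than once.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))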