Human Generated Data

Title

Untitled (Hammond, Louisiana)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1729

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Architecture 100
Building 100
Countryside 100
Hut 100
Nature 100
Outdoors 100
Rural 100
Shack 100
Shelter 100
Adult 98.8
Male 98.8
Man 98.8
Person 98.8
Person 96.8
Housing 87.4
Face 75.6
Head 75.6
House 74
Photography 68.9
Portrait 68.9
Cabin 62.4
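
The Amazon tags above are label names with confidence scores (0-100) from Amazon Rekognition. A minimal sketch of how such a list can be generated with the boto3 client follows; the local file name, region, and thresholds are illustrative assumptions, not part of this record.

# Sketch: produce an Amazon Rekognition label list like the one above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("hammond_louisiana.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,       # illustrative cap
    MinConfidence=60,   # illustrative threshold
)

# Print "Label confidence" pairs in the same format as the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")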

Clarifai
created on 2018-05-11

people 99.9
adult 98.3
one 97.4
group 96.4
man 95.9
two 95.8
log cabin 94
wear 93.6
child 93.2
shed 90.3
three 89.4
vehicle 89.4
family 88.1
home 88.1
transportation system 86
barn 85.9
woman 84.7
group together 84.3
vintage 83.9
war 83

Imagga
created on 2023-10-07

old 37.6
pay-phone 32.4
building 29
telephone 25
architecture 24.3
house 23.4
wall 22.5
door 20.4
electronic equipment 19.6
structure 18.8
city 16.6
equipment 15.4
wood 15
ancient 13.8
abandoned 13.7
history 13.4
window 13.4
vintage 13.2
stone 12.6
weathered 12.3
wooden 12.3
street 12
grunge 11.9
dirty 11.7
empty 11.2
exterior 11.1
travel 10.6
antique 10.4
home 10.4
brick 10.1
aged 9.9
newspaper 9.7
texture 9.7
rural 9.7
metal 9.6
urban 9.6
scene 9.5
dark 9.2
garage 9
hovel 9
outdoors 9
detail 8.8
ruins 8.8
decay 8.7
rust 8.7
black 8.6
construction 8.6
outdoor 8.4
sky 8.3
tourism 8.2
village 8.1
ruin 7.8
broken 7.7
industry 7.7
call 7.4
town 7.4
barn 7.3
historic 7.3
tourist 7.2
people 7.2
landmark 7.2
product 7.2
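
The Imagga tags above can be retrieved in a similar way; a minimal sketch follows, assuming Imagga's v2 /tags REST endpoint with HTTP Basic authentication. The API key, secret, and image URL are placeholders, not values from this record.

# Sketch: fetch Imagga-style tags for a hosted copy of the photograph.
import requests

IMAGGA_API_KEY = "your_api_key"          # placeholder credential
IMAGGA_API_SECRET = "your_api_secret"    # placeholder credential
IMAGE_URL = "https://example.org/hammond_louisiana.jpg"  # placeholder image location

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_API_KEY, IMAGGA_API_SECRET),
    timeout=30,
)
response.raise_for_status()

# Print "tag confidence" pairs in the same format as the list above.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")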

Google
created on 2018-05-11

Color Analysis

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 97.2%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 0.6%
Confused 0.4%
Angry 0.1%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 2-8
Gender Female, 96.6%
Calm 37.7%
Fear 22.4%
Angry 11.5%
Sad 10.5%
Surprised 8.9%
Disgusted 7%
Confused 3.9%
Happy 1.1%

AWS Rekognition

Age 29-39
Gender Male, 96.7%
Sad 60.3%
Angry 39.2%
Fear 15.1%
Calm 8.3%
Surprised 6.9%
Confused 3.2%
Disgusted 1.9%
Happy 0.9%

Microsoft Cognitive Services

Age 31
Gender Male
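
The AWS Rekognition face records above (estimated age range, gender, and ranked emotion confidences) correspond to the FaceDetails structure returned by Rekognition's face detection. A minimal sketch with boto3 follows; the image source and region are illustrative assumptions.

# Sketch: reproduce per-face age/gender/emotion estimates like those above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("hammond_louisiana.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unordered; sort by confidence to match the layout above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")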

Feature analysis

Amazon

Adult 98.8%
Male 98.8%
Man 98.8%
Person 98.8%