Human Generated Data

Title

Untitled (Ozarks, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1146

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 99.7%
Female 99.7%
Person 99.7%
Woman 99.7%
Face 98.8%
Head 98.8%
Photography 98.8%
Portrait 98.8%
Person 96.2%
Person 94.8%
Text 87.5%
Person 83%
Person 77%
Newspaper 75.9%
Person 71.1%
Person 71%
Newsstand 67.3%
Shop 67.3%
Person 60.1%
Blouse 57%
Clothing 57%
Art 55.9%
Collage 55.9%
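
These Amazon tags follow the label/confidence output of the AWS Rekognition DetectLabels API. As a minimal sketch, and assuming a local copy of the image (the filename and MinConfidence threshold below are illustrative, not part of this record), tags in this form could be generated with boto3:

import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph; not part of this record.
with open("ozarks_arkansas.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed cutoff; the lowest tag above is 55.9%
    )

# Print tags in the same "Name Confidence%" form used above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}%")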

Clarifai
created on 2018-05-11

people 100%
one 99.4%
adult 99.1%
group 98.9%
two 98.4%
man 95.8%
group together 95.5%
woman 94.5%
wear 92.4%
administration 92.3%
three 91.8%
child 90.9%
war 90.8%
room 90.4%
commerce 89.7%
street 89.4%
vehicle 89.1%
portrait 88.9%
monochrome 88.6%
several 88.5%

Imagga
created on 2023-10-06

newspaper 37.6%
product 28.2%
creation 22%
person 21.7%
man 21.6%
male 18.4%
negative 18.3%
people 17.8%
portrait 16.2%
old 16%
film 13.1%
adult 13%
black 12.7%
city 12.5%
urban 11.4%
dress 10.8%
men 10.3%
women 10.3%
world 10%
vintage 9.9%
fashion 9.8%
art 9.8%
musical instrument 9.6%
comedian 9.6%
building 9.6%
ancient 9.5%
wall 9.4%
photographic paper 9.4%
performer 9.3%
dirty 9%
one 9%
businessman 8.8%
mask 8.8%
cold 8.6%
sculpture 8.6%
room 8.6%
face 8.5%
grunge 8.5%
business 8.5%
street 8.3%
silhouette 8.3%
human 8.2%
religion 8.1%
history 8%
decoration 8%
working 8%
work 7.8%
stone 7.8%
statue 7.7%
violin 7.7%
culture 7.7%
house 7.5%
happy 7.5%
style 7.4%
alone 7.3%
looking 7.2%
stringed instrument 7.1%
architecture 7%

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 88.4%
old 68.5%

Face analysis

AWS Rekognition

Age 21-29
Gender Female, 87.5%
Calm 88.8%
Fear 9.4%
Surprised 6.8%
Sad 2.5%
Happy 0.6%
Angry 0.4%
Disgusted 0.3%
Confused 0.2%

AWS Rekognition

Age 23-31
Gender Female, 97.2%
Sad 100%
Calm 7.6%
Surprised 6.3%
Fear 5.9%
Happy 0.4%
Confused 0.3%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 12-20
Gender Male, 94.8%
Calm 90.2%
Surprised 6.4%
Fear 6.1%
Sad 3.7%
Disgusted 3.3%
Happy 0.9%
Angry 0.4%
Confused 0.4%

Microsoft Cognitive Services

Age 46
Gender Male
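
The age, gender, and emotion estimates above match the per-face output of the AWS Rekognition DetectFaces API. A minimal sketch of reproducing such estimates with boto3 (the filename is a hypothetical placeholder):

import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph; not part of this record.
with open("ozarks_arkansas.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotion types come back upper-case (e.g. "CALM"); sort by confidence.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")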

Feature analysis

Amazon

Adult 99.7%
Female 99.7%
Person 99.7%
Woman 99.7%

Categories

Imagga

paintings art 99.6%