Human Generated Data

Title

Untitled (Maynardville, Tennessee)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3481

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 99.7
Hat 99.7
Hardhat 99.5
Helmet 99.5
Person 99.4
Person 99.2
Adult 99.2
Male 99.2
Man 99.2
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 98.8
Adult 98.8
Male 98.8
Man 98.8
Brick 98.5
Person 97.1
Pants 96.6
People 91.5
Architecture 88.1
Building 88.1
Housing 88.1
House 87.6
Porch 87.6
Face 77.2
Head 77.2
Photography 77.2
Portrait 77.2
Cap 76.3
Sitting 73.3
Footwear 73.1
Shoe 73.1
Coat 70.7
Shoe 64.4
Bench 56.6
Furniture 56.6
Door 56.2
Urban 55.4

Clarifai
created on 2018-05-10

people 100
group together 99.1
group 98.8
adult 98.6
man 96.7
four 93.7
street 93.6
two 93.1
administration 92.9
several 92.6
woman 92.5
child 91.1
three 90.4
war 90.2
home 89.4
five 89.2
police 88.2
military 86.4
many 85.8
monochrome 84

Imagga
created on 2023-10-06

kin 29.7
man 21.5
people 21.2
city 17.5
building 16.7
male 16.6
black 14.6
architecture 14.2
person 12.6
old 12.5
tourism 11.5
world 11.4
walking 11.4
outdoors 11.3
couple 11.3
men 11.2
street 11
adult 11
business 10.9
travel 10.6
portrait 10.4
child 10.3
love 10.3
silhouette 9.9
statue 9.7
businessman 9.7
newspaper 9.6
women 9.5
landmark 9
human 9
support 9
religion 9
urban 8.7
water 8.7
lifestyle 8.7
walk 8.6
monument 8.4
life 8.1
history 8
sidewalk 8
wall 8
culture 7.7
sill 7.5
vacation 7.4
product 7.3
together 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 95.2
way 46.6

Color Analysis

Face analysis

Amazon

Microsoft

AWS Rekognition

Age 58-66
Gender Male, 99.9%
Sad 100%
Calm 6.9%
Angry 6.8%
Fear 6.5%
Surprised 6.4%
Disgusted 0.8%
Confused 0.5%
Happy 0.2%

AWS Rekognition

Age 49-57
Gender Male, 99.7%
Angry 75%
Calm 24.2%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Disgusted 0.4%
Confused 0.1%
Happy 0.1%

AWS Rekognition

Age 36-44
Gender Female, 51.7%
Confused 56%
Sad 21.4%
Calm 19.8%
Surprised 7.6%
Fear 6.1%
Happy 1.6%
Angry 0.9%
Disgusted 0.8%

Microsoft Cognitive Services

Age 44
Gender Male

Feature analysis

Amazon

Person 99.4%
Adult 99.2%
Male 99.2%
Man 99.2%
Shoe 73.1%
Coat 70.7%