Human Generated Data

Title

Untitled (Branchville, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1905

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

People 100
Walking 99.9
Road 99.1
Tarmac 99.1
Person 98.2
Person 98
Adult 98
Male 98
Man 98
Person 97.3
Adult 97.3
Male 97.3
Man 97.3
Person 95.4
Person 94.7
Person 93.2
Person 91.5
Person 91.1
Person 90.8
Person 88.6
Person 88.6
Person 88.5
Person 88.2
Outdoors 85.8
Person 85
Person 84.3
Person 84
Person 80.9
Person 80.3
Person 77.4
Person 76.1
Person 75
Person 74.1
Person 73.1
Nature 71.7
Person 70
Person 68.6
Person 68.5
Person 67.1
Person 65
Person 63.1
City 62.6
Street 62.6
Urban 62.6
Pedestrian 60.5
Person 60.3
Head 60.1
Light 59.4
Traffic Light 59.4
Person 57.4
Cross 57.4
Symbol 57.4
Clothing 56.9
Coat 56.9
Architecture 56.6
Building 56.6
Shelter 56.6
Utility Pole 56.6
Neighborhood 56.4
Countryside 56.1
Hat 55.4
Zebra Crossing 55.3
Postal Office 55
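
The labels above have the shape of an AWS Rekognition label-detection response: a label name plus a confidence score. A minimal sketch of how such tags could be reproduced with boto3, assuming configured AWS credentials and a hypothetical local file name for the digitized photograph:

    # Minimal sketch: label detection with AWS Rekognition via boto3.
    # "branchville_1936.jpg" is a hypothetical file name for the scanned print.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("branchville_1936.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,  # roughly the lowest score in the list above
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")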

Clarifai
created on 2018-05-11

people 99.9
many 99.3
group 99
group together 98.5
crowd 96.8
administration 96.8
adult 96.1
war 94.5
military 94.2
man 92.9
soldier 92.6
leader 88.2
woman 87.9
wear 86
child 85
ceremony 84.3
law 83.5
police 81.9
home 81.8
uniform 81.5
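
The Clarifai tags look like output from the public general prediction model. A rough sketch against Clarifai's v2 REST predict endpoint; the API key and image URL are placeholders, the model ID follows Clarifai's published documentation, and the endpoint details may have changed since these tags were generated in 2018:

    # Rough sketch: image tagging via Clarifai's v2 REST predict endpoint.
    # API key and image URL are placeholders; verify endpoint and model ID
    # against current Clarifai documentation.
    import requests

    CLARIFAI_API_KEY = "YOUR_API_KEY"  # placeholder
    GENERAL_MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"  # public general model

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{GENERAL_MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
    )

    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")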

Imagga
created on 2023-10-06

city 29.9
architecture 25.1
street 24.8
flagpole 24.7
urban 22.7
building 20.1
staff 19.8
travel 16.9
people 16.7
pedestrian 15.8
stick 15.1
clothing 14.3
road 12.6
sky 12.1
wall 12
winter 11.9
house 11.8
mortarboard 11.7
academic gown 11.7
flag 11.6
town 11.1
patriot 11.1
lamp 10.6
crowd 10.6
england 10.5
construction 10.3
man 10.1
snow 10
business 9.7
station 9.6
walking 9.5
buildings 9.4
window 9.4
gown 9.2
outdoor 9.2
silhouette 9.1
tourism 9.1
old 9
cap 9
world 8.7
scene 8.6
brick 8.5
square 8.1
black 8.1
transportation 8.1
hall 7.8
outdoors 7.6
cityscape 7.6
headdress 7.4
consumer goods 7.4
covering 7.4
symbol 7.4
sidewalk 7.3
group 7.2
adult 7.1
day 7.1
modern 7
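
The Imagga tags and scores match the shape of Imagga's v2 auto-tagging endpoint, which is queried over REST with basic authentication. A hedged sketch with placeholder credentials and image URL:

    # Sketch: auto-tagging with the Imagga v2 /tags endpoint (REST, basic auth).
    # API key/secret and the image URL are placeholders.
    import requests

    IMAGGA_KEY = "YOUR_API_KEY"
    IMAGGA_SECRET = "YOUR_API_SECRET"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )

    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")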

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.9
sky 99.2
person 93.6
group 84.6
people 82.1
old 71.3
crowd 0.7
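
The Microsoft tags resemble output from the Azure Computer Vision tagging operation. A sketch against the REST /tag endpoint; the resource endpoint, subscription key, and image URL are placeholders, and the 2018 tags above were likely produced by an earlier API version:

    # Sketch: image tagging with the Azure Computer Vision REST API.
    # Endpoint, subscription key, and image URL are placeholders.
    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    SUBSCRIPTION_KEY = "YOUR_SUBSCRIPTION_KEY"

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
        json={"url": "https://example.org/photo.jpg"},
    )

    for tag in response.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")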

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-28
Gender Male, 99.8%
Fear 57.9%
Angry 18.6%
Surprised 14.4%
Sad 6.4%
Calm 6.3%
Confused 3.8%
Disgusted 3.3%
Happy 3.2%

AWS Rekognition

Age 27-37
Gender Male, 96.1%
Happy 61.7%
Calm 26.7%
Surprised 8.4%
Fear 6.1%
Sad 3.2%
Angry 2.9%
Disgusted 0.7%
Confused 0.6%

AWS Rekognition

Age 21-29
Gender Male, 52%
Calm 55.5%
Fear 30.9%
Surprised 6.6%
Happy 5.6%
Sad 4%
Angry 2.9%
Disgusted 1%
Confused 0.4%

AWS Rekognition

Age 45-53
Gender Male, 98.6%
Happy 49.1%
Calm 25.1%
Disgusted 11.8%
Surprised 7.3%
Fear 6.7%
Angry 4.4%
Sad 4.2%
Confused 0.7%

AWS Rekognition

Age 42-50
Gender Male, 99.8%
Happy 90.6%
Surprised 6.6%
Fear 6.4%
Angry 3.1%
Sad 2.3%
Disgusted 2.2%
Confused 1.3%
Calm 0.5%

AWS Rekognition

Age 14-22
Gender Male, 87.9%
Calm 38.9%
Happy 24%
Angry 14.1%
Confused 9.9%
Surprised 8%
Fear 6.6%
Sad 4.3%
Disgusted 3.1%

AWS Rekognition

Age 34-42
Gender Male, 61%
Sad 77.8%
Fear 36.6%
Calm 15.4%
Happy 8.1%
Surprised 6.5%
Angry 4.8%
Disgusted 1.7%
Confused 0.9%

AWS Rekognition

Age 20-28
Gender Male, 86.2%
Calm 82.4%
Surprised 6.6%
Fear 6.5%
Angry 5.3%
Sad 3.8%
Confused 2.5%
Happy 2%
Disgusted 1.4%

AWS Rekognition

Age 23-31
Gender Male, 98.8%
Calm 82.4%
Fear 11.4%
Surprised 6.5%
Sad 3.5%
Disgusted 1.3%
Happy 0.7%
Confused 0.4%
Angry 0.3%
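
Each face entry above carries the fields returned by AWS Rekognition face detection: an estimated age range, a gender guess with confidence, and per-emotion scores. A minimal boto3 sketch that prints the same fields, again using a hypothetical local file name:

    # Minimal sketch: face analysis with AWS Rekognition via boto3.
    # Attributes=["ALL"] requests age range, gender, and emotion estimates.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("branchville_1936.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")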

Feature analysis

Amazon

Person 98.2%
Adult 98%
Male 98%
Man 98%
Hat 55.4%
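
The feature analysis appears to list only labels for which Rekognition also returns located instances (bounding boxes). A small sketch that filters a label-detection response down to such labels, reusing the response from the label-detection sketch earlier in this record:

    # Sketch: keep only labels that come back with bounding-box instances,
    # roughly what the "Feature analysis" list above reflects.
    def labels_with_instances(detect_labels_response):
        """Yield (name, confidence, instance_count) for labels with box instances."""
        for label in detect_labels_response["Labels"]:
            if label.get("Instances"):
                yield label["Name"], label["Confidence"], len(label["Instances"])

    # Example, assuming `response` from the detect_labels sketch above:
    # for name, conf, count in labels_with_instances(response):
    #     print(f"{name} {conf:.1f}% ({count} instance(s))")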