Human Generated Data

Title

Untitled (Branchville, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1858

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2023-10-06

Clothing 100
Hat 100
Sun Hat 99.8
Adult 98.3
Male 98.3
Man 98.3
Person 98.3
Adult 98
Male 98
Man 98
Person 98
Adult 98
Male 98
Man 98
Person 98
Adult 97.1
Male 97.1
Man 97.1
Person 97.1
Coat 96.8
Adult 94.7
Male 94.7
Man 94.7
Person 94.7
Overcoat 93.8
Person 89.6
Outdoors 87.3
Photography 82.1
Face 79.8
Head 79.8
Person 73.7
Handrail 73.7
Nature 69.7
Garden 57.9
Gardener 57.9
Gardening 57.9
Cap 56.9
Jacket 56
Agriculture 55.9
Countryside 55.9
Field 55.9
Worker 55.9
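
The Amazon labels above are the kind of per-image output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such labels could be requested, assuming boto3 is installed and AWS credentials are configured; the file name and the confidence floor are illustrative placeholders, not part of the museum's actual pipeline:

    # Hedged sketch: request image labels from Amazon Rekognition DetectLabels.
    # Assumes boto3 and configured AWS credentials; the file name and the
    # MinConfidence threshold are placeholders chosen for illustration.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("branchville_1936.jpg", "rb") as f:  # hypothetical local copy of the photograph
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,  # roughly the lowest score listed above
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")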

Clarifai
created on 2018-05-11

people 99.9
group together 98.8
group 98.5
adult 98.4
man 97.4
military 92.6
three 92.3
several 91.9
war 91.9
two 90.8
leader 90.6
many 89.8
four 89.6
administration 88.6
five 88.2
wear 87.2
woman 86.9
soldier 85.3
uniform 85.2
veil 85

Imagga
created on 2023-10-06

statue 48.3
engineer 31.9
sculpture 28.8
uniform 27.7
military uniform 25
clothing 24.4
monument 24.3
gun 23.1
history 22.4
soldier 21.5
weapon 21.4
military 20.3
architecture 19.6
man 19.5
male 19.2
sky 19.1
city 17.5
equipment 17
rifle 16.9
protection 16.4
culture 16.2
mask 15.7
historic 14.7
danger 14.6
landmark 14.4
old 13.9
war 13.7
building 13.5
tripod 13.3
cannon 12.7
religion 12.5
travel 12
person 11.9
bronze 11.9
camouflage 11.8
gas 11.6
art 11.1
industry 11.1
memorial 11
toxic 10.7
tourism 10.7
protective 10.7
covering 10.3
horizontal 10
consumer goods 10
radioactive 9.8
radiation 9.8
television camera 9.8
disaster 9.8
chemical 9.7
ancient 9.5
construction 9.4
town 9.3
safety 9.2
silhouette 9.1
industrial 9.1
adult 9.1
destruction 8.8
catholic 8.8
urban 8.7
nuclear 8.7
horse 8.5
portrait 8.4
power 8.4
famous 8.4
rack 8.2
outdoors 8.2
high-angle gun 8.1
park 8.1
symbol 8.1
firearm 7.9
television equipment 7.8
people 7.8
outdoor 7.6
artillery 7.6
machine 7.5
support 7.5
environment 7.4
dirty 7.2

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 98.5
old 84.9
posing 47.7

Color Analysis

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 100%
Calm 97.8%
Surprised 6.3%
Fear 5.9%
Sad 2.7%
Confused 0.3%
Disgusted 0.1%
Happy 0%
Angry 0%

AWS Rekognition

Age 39-47
Gender Male, 100%
Calm 99.2%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Confused 0.2%
Angry 0.1%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 42-50
Gender Male, 99.8%
Calm 99.3%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%
Angry 0.1%

AWS Rekognition

Age 45-51
Gender Male, 99.7%
Calm 98.6%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Angry 0.4%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%
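
The four AWS Rekognition entries above (estimated age range, gender, and emotion scores for each detected face) match the per-face attributes returned by Rekognition's DetectFaces operation. A minimal sketch, assuming boto3 and configured AWS credentials; the file name is a placeholder:

    # Hedged sketch: per-face age, gender, and emotion estimates from
    # Amazon Rekognition DetectFaces. Assumes boto3 and AWS credentials;
    # the image file name is a placeholder.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("branchville_1936.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")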

Microsoft Cognitive Services

Age 38
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely
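
The Google Vision rows above report likelihood buckets (Very unlikely through Very likely) from the Cloud Vision face detection feature. A minimal sketch, assuming the google-cloud-vision client library and application credentials are set up; the file name is a placeholder:

    # Hedged sketch: face likelihoods (surprise, anger, sorrow, joy, headwear,
    # blur) from the Google Cloud Vision API. Assumes the google-cloud-vision
    # package and application-default credentials; file name is a placeholder.
    from google.cloud import vision

    likelihoods = ("UNKNOWN", "VERY_UNLIKELY", "UNLIKELY", "POSSIBLE",
                   "LIKELY", "VERY_LIKELY")

    client = vision.ImageAnnotatorClient()

    with open("branchville_1936.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print("Surprise", likelihoods[face.surprise_likelihood])
        print("Anger", likelihoods[face.anger_likelihood])
        print("Sorrow", likelihoods[face.sorrow_likelihood])
        print("Joy", likelihoods[face.joy_likelihood])
        print("Headwear", likelihoods[face.headwear_likelihood])
        print("Blurred", likelihoods[face.blurred_likelihood])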

Feature analysis

Amazon

Adult 98.3%
Male 98.3%
Man 98.3%
Person 98.3%
Coat 96.8%

Categories