Human Generated Data

Title

Untitled (Branchville, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4369.6

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Clothing 100
Hat 100
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Adult 98.3
Male 98.3
Man 98.3
Person 98.3
Adult 98.1
Male 98.1
Man 98.1
Person 98.1
Adult 98
Male 98
Man 98
Person 98
Coat 97.6
Adult 97.2
Male 97.2
Man 97.2
Person 97.2
Person 96.8
Person 87.2
Face 83.2
Head 83.2
Person 74.9
Person 73.5
Outdoors 72.2
Person 71.9
Worker 57.6
Cap 57.3
People 56.7
Architecture 56.6
Building 56.6
Factory 56.6
Hospital 56.4
Hardhat 55.8
Helmet 55.8
Weapon 55.8
Sun Hat 55.8
Captain 55.7
Officer 55.7
Carpenter 55.4
Manufacturing 55.2
Gun 55.2
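
The label confidences above have the shape of output from Amazon Rekognition's DetectLabels operation. As a rough illustration only (not necessarily the pipeline used here), a minimal boto3 sketch, assuming a hypothetical local copy of the photograph and configured AWS credentials, might look like:

import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("P1970_4369_6.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the lowest scores listed above fall in the mid-50s
)

for label in response["Labels"]:
    # Each label carries a name and a confidence score in percent,
    # e.g. "Clothing 100.0" or "Gun 55.2".
    print(f'{label["Name"]} {label["Confidence"]:.1f}')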

Clarifai
created on 2018-05-09

people 99.9
group together 99.1
group 98.7
adult 97.8
several 97
administration 97
man 94.9
military 94.8
war 94.8
wear 94.8
leader 94.4
many 93.8
four 90.6
outfit 90.5
law 87.4
five 86.7
uniform 85.1
three 83.5
woman 82.8
soldier 82.3

Imagga
created on 2023-10-05

lab coat 39.9
coat 36.1
man 32.2
person 24.6
male 24.1
worker 20.2
work 18.9
people 17.8
garment 16
clothing 15.8
nurse 15.2
adult 14
men 13.7
working 13.2
equipment 12.9
industry 11.9
mask 11.8
health 11.8
job 11.5
medical 11.5
medicine 11.4
building 11.1
business 10.9
uniform 10.7
hospital 10.6
old 10.4
portrait 10.3
industrial 10
builder 10
businessman 9.7
black 9.6
doctor 9.4
surgeon 9.1
patient 9.1
protection 9.1
looking 8.8
lifestyle 8.7
construction 8.5
professional 8.4
hand 8.3
safety 8.3
holding 8.2
human 8.2
care 8.2
steel 7.9
seller 7.6
happy 7.5
clothes 7.5
one 7.5
room 7.3
science 7.1
day 7.1
happiness 7

Google
created on 2018-05-09

Microsoft
created on 2018-05-09

person 96.1
man 93.7
posing 68.4
old 68

Color analysis

Face analysis

Amazon

AWS Rekognition

Age 30-40
Gender Male, 99.7%
Calm 96.8%
Surprised 6.6%
Fear 5.9%
Sad 2.3%
Confused 1.6%
Happy 0.2%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 34-42
Gender Male, 99.9%
Calm 93.4%
Surprised 6.7%
Fear 6%
Sad 2.5%
Confused 2.3%
Disgusted 1%
Happy 0.7%
Angry 0.4%

AWS Rekognition

Age 31-41
Gender Male, 84.9%
Happy 70.8%
Fear 8%
Surprised 8%
Disgusted 6.5%
Calm 5.7%
Sad 3.9%
Confused 3.4%
Angry 1.3%

AWS Rekognition

Age 18-26
Gender Female, 98.3%
Calm 79%
Surprised 10.6%
Fear 6.9%
Sad 3.9%
Angry 3.6%
Confused 1.7%
Happy 1.3%
Disgusted 0.9%

AWS Rekognition

Age 28-38
Gender Female, 94.7%
Calm 65.9%
Happy 9.8%
Fear 9%
Sad 8.6%
Surprised 7.2%
Confused 2.6%
Disgusted 1.8%
Angry 1%
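
Age ranges, gender estimates, and per-emotion confidences like those above match the output of Amazon Rekognition's DetectFaces operation when all facial attributes are requested. A minimal boto3 sketch, under the same assumptions (hypothetical local image file, configured credentials) and not necessarily how these figures were produced, might look like:

import boto3

rekognition = boto3.client("rekognition")

with open("P1970_4369_6.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]     # e.g. {"Low": 30, "High": 40}
    gender = face["Gender"]    # e.g. {"Value": "Male", "Confidence": 99.7}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')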

Feature analysis

Amazon

Adult 98.6%
Male 98.6%
Man 98.6%
Person 98.6%

Categories

Text analysis

Amazon

College
and
Art
University
of
(Harvard
Fellows
President and Fellows of Harvard College (Harvard University Art Museums)
Harvard
Museums)
President
P1970.4369.0006
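
The mix of a full credit line and individual words above is characteristic of Amazon Rekognition's DetectText operation, which reports both line- and word-level detections. A minimal boto3 sketch under the same assumptions (hypothetical local file, configured credentials) might look like:

import boto3

rekognition = boto3.client("rekognition")

with open("P1970_4369_6.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # Type is "LINE" for the full credit line or "WORD" for single tokens
    # such as "Harvard" or "P1970.4369.0006".
    print(detection["Type"], detection["DetectedText"])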

Google

@ President and Fellows of Harvard College (Harvard University Art Museums) P1970.4369.0006
@
President
and
Fellows
of
Harvard
College
(Harvard
University
Art
Museums)
P1970.4369.0006