Human Generated Data

Title

Untitled (miners, Calumet, Pennsylvania)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1292

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Brick 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Photography 98.9
Head 97.7
Face 97.5
Clothing 96.1
Footwear 96.1
Shoe 96.1
Portrait 94.7
People 82.7
Body Part 79.4
Finger 79.4
Hand 79.4
Smoke 77.1
Coat 57.5
Sneaker 56.9
Jacket 56.7
Boot 56.1
Sitting 56
Baseball 56
Baseball Glove 56
Glove 56
Sport 56

Clarifai
created on 2018-05-11

people 100
one 99.8
adult 99.7
man 99
two 98.6
wear 98.1
group 97.5
military 95.6
recreation 94.7
group together 93
war 92.3
sit 92.1
three 91.6
portrait 91.3
outfit 90.8
administration 90.6
furniture 89.9
four 89.7
music 88.2
actor 88

Imagga
created on 2023-10-05

newspaper 100
product 100
creation 78.5
sculpture 35.6
statue 35.5
art 22.9
architecture 22.7
culture 21.4
ancient 19.9
history 17.9
old 17.4
man 17.2
stone 16
monument 14.9
religion 14.3
male 14.2
travel 14.1
person 13.5
god 13.4
city 13.3
historic 12.8
building 12.7
marble 12.6
tourism 12.4
landmark 11.7
antique 11.2
face 10.7
people 10.6
temple 10.6
human 10.5
adult 10.3
religious 10.3
famous 10.2
black 9.6
scholar 9.5
historical 9.4
fountain 9.3
traditional 9.1
sax 9.1
one 9
soldier 8.8
bronze 8.8
decoration 8.8
palace 8.7
spiritual 8.6
portrait 8.4
head 8.4
vintage 8.3
body 8
mythology 7.9
museum 7.8
military 7.7
spirituality 7.7
intellectual 7.6
detail 7.2

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.1
man 98.3
sitting 94.8

Color Analysis

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 100%
Calm 46.6%
Angry 42.8%
Surprised 7.9%
Fear 6.5%
Sad 3.6%
Confused 1%
Happy 0.7%
Disgusted 0.7%

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 4.1%
Confused 0.5%
Disgusted 0.2%
Angry 0.1%
Happy 0.1%

Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 46
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Shoe 96.1%

Categories

Imagga

paintings art 90.9%
people portraits 6.1%

Captions

Microsoft
created on 2018-05-11

a man sitting on a bench 59.2%
a man sitting on a bed 43.3%
a man sitting on the ground 43.2%