Human Generated Data

Title

Untitled (Marysville, Ohio)

Date

July 1938-August 1938

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.166

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Sun Hat 100
Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Hat 97.9
Adult 97.8
Male 97.8
Man 97.8
Person 97.8
Adult 97.4
Male 97.4
Man 97.4
Person 97.4
Person 96.3
Face 96.1
Head 96.1
Hat 93.5
Hat 81.4
Cowboy Hat 57.4
Coat 56.2
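
The numbers beside each tag are 0-100 confidence scores reported by the respective service. As a minimal, illustrative sketch only (assuming the AWS Rekognition label-detection API via boto3; the museum's actual tagging pipeline is not documented here, and the image path below is a placeholder), labels of this form could be retrieved like so:

# Minimal sketch: image labels with AWS Rekognition via boto3.
# Assumptions: AWS credentials are configured, and "photo.jpg" is a
# placeholder path, not the actual museum image file.
import boto3

def detect_labels(image_path, min_confidence=50):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=min_confidence,
    )
    # Each label carries a name and a 0-100 confidence score,
    # matching the label/score pairs listed above.
    return [(label["Name"], round(label["Confidence"], 1))
            for label in response["Labels"]]

if __name__ == "__main__":
    for name, score in detect_labels("photo.jpg"):
        print(name, score)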

Clarifai
created on 2018-05-11

people 99.9
adult 98.2
monochrome 97.8
man 97.3
lid 96.7
group together 96.4
group 95.7
wear 95.6
military 95.1
administration 93.7
two 92.8
portrait 92.8
war 91.7
three 91.7
woman 91.5
four 91.2
uniform 91.2
veil 91.1
several 91.1
street 89.1

Imagga
created on 2023-10-06

cowboy hat 100
hat 100
headdress 85.9
clothing 56.4
consumer goods 29.4
covering 29.1
man 22.2
statue 20
old 19.5
sculpture 19.1
male 18.4
person 17.3
portrait 16.2
religion 16.1
art 13.6
face 13.5
stone 13.5
seller 13.3
ancient 13
people 12.8
two 12.7
cowboy 12
travel 12
guy 11.9
city 11.6
western 11.6
vintage 11.6
black 11.4
hand 11.4
religious 11.2
one 11.2
culture 11.1
adult 11
architecture 10.9
building 10.7
mysterious 10.7
happy 10.6
tourism 9.9
history 9.8
looking 9.6
historical 9.4
shirt 9.3
outdoors 8.9
style 8.9
posing 8.9
interior 8.8
look 8.7
hair 8.7
artistic 8.7
love 8.7
god 8.6
model 8.5
fashion 8.3
uniform 7.8
antique 7.8
men 7.7
military 7.7
war 7.7
mystery 7.7
temple 7.6
head 7.5
dark 7.5
senior 7.5
monument 7.5
famous 7.4
historic 7.3
equipment 7.3
lady 7.3
pose 7.2
handsome 7.1
smile 7.1
together 7

Microsoft
created on 2018-05-11

person 97.5
outdoor 89.7
old 86.1
older 23.1

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 99.4%
Happy 84.2%
Calm 7.9%
Surprised 7.6%
Fear 6.1%
Sad 3.4%
Confused 0.7%
Angry 0.6%
Disgusted 0.5%

AWS Rekognition

Age 22-30
Gender Male, 99.8%
Calm 87.2%
Happy 9.2%
Surprised 6.5%
Fear 6%
Sad 2.3%
Confused 1%
Disgusted 0.7%
Angry 0.5%

AWS Rekognition

Age 35-43
Gender Male, 99.7%
Happy 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0%
Confused 0%
Disgusted 0%
Calm 0%

Microsoft Cognitive Services

Age 27
Gender Male

Microsoft Cognitive Services

Age 46
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
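
The blocks above report estimated age ranges, gender, and per-emotion confidence scores from each provider's face-detection service. A minimal sketch of the AWS Rekognition portion only (assuming boto3; the image path is a placeholder, and this is not the museum's documented workflow):

# Minimal sketch: face attributes with AWS Rekognition via boto3.
# Assumptions: AWS credentials are configured, and "photo.jpg" is a
# placeholder path, not the actual museum image file.
import boto3

def analyze_faces(image_path):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back with per-emotion confidence scores, as in
        # the Happy/Calm/Surprised/... blocks above.
        for emotion in sorted(face["Emotions"],
                              key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")

if __name__ == "__main__":
    analyze_faces("photo.jpg")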

Feature analysis

Amazon

Adult 99.2%
Male 99.2%
Man 99.2%
Person 99.2%
Hat 97.9%

Categories

Imagga

food drinks 65.8%
paintings art 31.6%

Text analysis

Amazon

EL
EL VERS
CIGARS
VERS
Chester

Google

ELVERS GAR
ELVERS
GAR
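
These strings are text detections read from signage in the photograph (e.g. "EL VERS", "CIGARS"). A minimal sketch of how such detections could be obtained (assuming the AWS Rekognition text-detection API via boto3; placeholder image path, not the museum's documented process):

# Minimal sketch: text-in-image detection with AWS Rekognition via boto3.
# Assumptions: AWS credentials are configured, and "photo.jpg" is a
# placeholder path, not the actual museum image file.
import boto3

def detect_text_lines(image_path):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_text(Image={"Bytes": image_bytes})
    # LINE detections are whole strings (e.g. signage text); WORD
    # detections are the individual tokens within them.
    return [d["DetectedText"] for d in response["TextDetections"]
            if d["Type"] == "LINE"]

if __name__ == "__main__":
    print(detect_text_lines("photo.jpg"))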