Human Generated Data

Title

Untitled (Marysville, Ohio)

Date

July 1938–August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.146

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Sun Hat 100
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Person 97.3
Hat 95.3
Hat 95.1
Person 92.3
Face 91.9
Head 91.9
Adult 89.6
Male 89.6
Man 89.6
Person 89.6
Hat 88.9
Coat 83.1
Person 70.9
Photography 70.6
Hat 69.9
Portrait 63.9
Cap 58
Shirt 56.6
Cowboy Hat 55.2

Clarifai
created on 2018-05-11

people 99.8
lid 98.7
group together 98
man 96.3
adult 96.1
group 95.7
three 95.2
four 94.5
uniform 94.4
administration 93.6
military 92.7
several 92.6
wear 92.1
veil 90.9
five 90.8
two 90.4
woman 84.7
police 83.5
leader 82.4
war 79.9

Imagga
created on 2023-10-06

hat 54.6
man 49
uniform 41.7
male 39.7
clothing 38.5
person 34.8
military uniform 33.3
cowboy hat 29.6
work 28.2
people 27.9
men 25.8
worker 24.9
helmet 24.7
portrait 21.4
covering 20.5
job 20.3
occupation 20.2
consumer goods 19.6
headdress 18.7
adult 18.3
safety 17.5
equipment 16.6
smile 15.7
industry 15.4
happy 15
engineer 15
construction 14.5
hand 14.4
face 14.2
guy 14
professional 13.9
industrial 13.6
cowboy 13.4
shirt 13.1
gun 12.7
two 12.7
handsome 12.5
nurse 12.2
couple 12.2
protection 11.8
profession 11.5
site 11.3
senior 11.2
builder 11.2
foreman 10.7
western 10.7
medical 10.6
building 10.5
smiling 10.1
hardhat 9.8
outdoors 9.7
style 9.6
black 9.6
looking 9.6
collar 9.6
boy 9.6
standing 9.6
weapon 9.4
model 9.3
business 9.1
care 9.1
team 9
working 8.8
together 8.8
protective 8.8
look 8.8
repair 8.6
workplace 8.6
patient 8.5
doctor 8.5
manager 8.4
tool 8.4
old 8.4
leisure 8.3
family 8
skill 7.7
attractive 7.7
hard 7.6
horse 7.6
pair 7.6
fashion 7.5
mature 7.4
20s 7.3
friendly 7.3
surgeon 7.2
hair 7.1
love 7.1
mask 7.1
happiness 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.9
man 91.5
people 72.6
white 64.4
old 58.1
work-clothing 11.6

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 53-61
Gender Male, 100%
Calm 97.3%
Surprised 6.3%
Fear 5.9%
Sad 2.5%
Confused 0.5%
Angry 0.5%
Happy 0.2%
Disgusted 0.1%

AWS Rekognition

Age 27-37
Gender Male, 99.1%
Calm 96.8%
Surprised 6.8%
Fear 5.9%
Sad 2.6%
Confused 0.5%
Angry 0.3%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 56-64
Gender Male, 99.7%
Calm 87%
Angry 9.2%
Surprised 6.6%
Fear 5.9%
Sad 2.7%
Happy 0.7%
Disgusted 0.4%
Confused 0.4%

AWS Rekognition

Age 54-62
Gender Male, 99.3%
Happy 99.4%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Calm 0.2%
Confused 0.1%
Angry 0%
Disgusted 0%

AWS Rekognition

Age 25-35
Gender Male, 100%
Calm 71.3%
Sad 24.1%
Surprised 7.5%
Fear 6.4%
Happy 2.2%
Confused 1.6%
Angry 1.3%
Disgusted 1.1%

AWS Rekognition

Age 40-48
Gender Male, 94.8%
Calm 60%
Happy 20.5%
Surprised 8.1%
Fear 7.5%
Confused 6.2%
Sad 3.9%
Disgusted 1.1%
Angry 0.8%

Microsoft Cognitive Services

Age 62
Gender Male

Microsoft Cognitive Services

Age 73
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Hat 95.3%

Categories

Imagga

people portraits 65.2%
paintings art 29.9%
pets animals 3.9%

Text analysis

Amazon

EL
EL VER
NG
VER
E
VAI
CI
NG T
Che.
T
the

Google

ELVER VAI NG T Che
ELVER
VAI
NG
T
Che