Human Generated Data

Title

Untitled (Marysville, Ohio)

Date

July 1938-August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2635

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Sun Hat 100
Adult 99.4
Male 99.4
Man 99.4
Person 99.4
Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Adult 98.8
Male 98.8
Man 98.8
Person 98.8
Person 97.3
Hat 96.2
Hat 96.2
Person 92.5
Face 91.9
Head 91.9
Adult 84.5
Person 84.5
Female 84.5
Woman 84.5
Coat 82.5
Hat 80
Hat 73.3
Cap 70.7
Person 69.2
Photography 56.4
Shirt 55.7
Portrait 55.4
Cowboy Hat 55.2

Clarifai
created on 2018-05-10

people 99.8
lid 98.9
group together 95.8
man 95.2
adult 92.6
group 92.2
uniform 91.4
veil 89.5
three 89.4
military 87.9
administration 87.6
child 87.2
four 86.6
five 83.7
several 83.6
woman 83.4
wear 82.2
fedora 79.9
war 79.9
boy 76.4

Imagga
created on 2023-10-06

hat 78.7
cowboy hat 56.7
man 49
male 42.5
headdress 35.6
clothing 35.5
person 33.7
people 29
work 25.9
portrait 25.9
worker 25.8
men 24
uniform 20.5
job 20.3
occupation 20.2
helmet 19.9
adult 19.5
industry 18.8
face 17.7
covering 17.2
safety 16.6
construction 16.2
consumer goods 16.1
happy 15.7
black 15
industrial 14.5
hand 14.4
handsome 14.3
smile 14.2
equipment 14.1
guy 13.9
engineer 13.3
senior 13.1
cowboy 13
foreman 12.5
old 11.8
builder 11.7
smiling 11.6
shirt 11.2
looking 11.2
professional 11.2
two 11
model 10.9
western 10.7
military uniform 10.6
profession 10.5
workplace 10.5
nurse 10.5
couple 10.5
style 10.4
business 10.3
site 10.3
protection 10
building 9.8
hardhat 9.8
attractive 9.8
medical 9.7
look 9.6
manager 9.3
one 9
outdoors 9
standing 8.7
architect 8.7
skill 8.7
collar 8.6
serious 8.6
engineering 8.6
holding 8.2
care 8.2
suit 8.1
team 8.1
working 7.9
hair 7.9
together 7.9
factory 7.7
repair 7.7
horse 7.6
doctor 7.5
mature 7.4
20s 7.3
surgeon 7.3
confident 7.3
pose 7.2
mask 7.2
family 7.1
posing 7.1
love 7.1
happiness 7
architecture 7

Microsoft
created on 2018-05-10

person 99.9
man 90.3
old 82.1
people 62.6
group 55.9

Color Analysis

Face analysis

AWS Rekognition

Age 54-62
Gender Male, 100%
Calm 98.1%
Surprised 6.4%
Fear 5.9%
Sad 2.3%
Angry 0.5%
Confused 0.4%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 26-36
Gender Male, 93.6%
Calm 98.7%
Surprised 6.6%
Fear 5.9%
Sad 2.2%
Confused 0.3%
Angry 0.1%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 57-65
Gender Male, 99.5%
Calm 92.8%
Surprised 6.4%
Fear 5.9%
Angry 3.8%
Sad 2.9%
Happy 0.4%
Confused 0.3%
Disgusted 0.3%

AWS Rekognition

Age 63-73
Gender Male, 99%
Happy 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Calm 0.1%
Confused 0.1%
Disgusted 0%
Angry 0%

AWS Rekognition

Age 19-27
Gender Male, 99.8%
Calm 64.4%
Sad 40.2%
Surprised 7.6%
Fear 6.3%
Confused 4%
Happy 1.4%
Disgusted 1.4%
Angry 1.1%

AWS Rekognition

Age 30-40
Gender Male, 86.9%
Calm 87.7%
Fear 7%
Surprised 6.5%
Sad 4.5%
Angry 1.5%
Happy 1.1%
Confused 0.7%
Disgusted 0.5%

Microsoft Cognitive Services

Age 58
Gender Male

Microsoft Cognitive Services

Age 69
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.4%
Male 99.4%
Man 99.4%
Person 99.4%
Hat 96.2%
Female 84.5%
Woman 84.5%

Categories

Imagga

people portraits 66.8%
paintings art 28.3%
pets animals 3.5%

Text analysis

Amazon

EL
EL VER
VER
E
VA
7
N.G 7
N.G
CI
Che