Human Generated Data

Title

Untitled (Marysville, Ohio)

Date

July 1938-August 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.147

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

City 100
Road 100
Street 100
Urban 100
Alley 100
Face 100
Head 100
Photography 100
Portrait 100
Clothing 99.9
Shorts 99.9
Brick 99.9
Person 99.6
Adult 99.6
Male 99.6
Man 99.6
Person 99.5
Adult 99.5
Male 99.5
Man 99.5
Walking 99.2
Person 98.9
Child 98.9
Female 98.9
Girl 98.9
Path 98.6
Sidewalk 98.6
Shirt 97.8
Person 97.6
Adult 97.6
Male 97.6
Man 97.6
Person 97.3
Pants 96.2
Person 95
People 93.4
Person 88.5
Baseball Cap 85.8
Cap 85.8
Formal Wear 70.6
Outdoors 68.2
Dress 65.8
Hat 65.3
Coat 61
Hat 59.3
Neighborhood 56.8
Jeans 56.5
Accessories 56.3
Bag 56.3
Handbag 56.3
Baseball 56.1
Baseball Glove 56.1
Glove 56.1
Sport 56.1
Suit 55.7
Architecture 55.6
Building 55.6
Wall 55.6
Shelter 55.5
Body Part 55.3
Finger 55.3
Hand 55.3
Blouse 55.3

Clarifai
created on 2018-05-11

people 99.9
adult 98.6
man 97.5
group 97.5
two 97.2
group together 96.9
three 95.8
woman 93.7
four 93
administration 90.5
wear 90.4
portrait 89.1
street 88.5
leader 87.9
several 87.3
five 85
child 82.1
veil 79.3
outfit 75.7
home 75.6

Imagga
created on 2023-10-06

man 46.4
male 43.5
people 31.2
hat 30.7
person 29.7
adult 25.4
worker 23.4
business 23.1
engineer 22
portrait 22
happy 21.9
clothing 19.8
industry 19.6
helmet 19.3
men 18.9
architect 18.3
work 18.2
professional 18.1
handsome 17.8
occupation 17.4
smiling 17.4
construction 17.1
smile 16.4
job 15.9
builder 15.6
suit 15.5
guy 14.9
standing 14.8
grandfather 14.5
foreman 14.2
businessman 14.1
looking 13.6
building 13.6
casual 13.5
face 13.5
engineering 13.3
uniform 12.8
hardhat 12.7
senior 12.2
manager 12.1
attractive 11.9
architecture 11.7
designer 11.6
executive 11.6
hand 11.4
couple 11.3
outdoors 11.2
corporate 11.2
profession 10.5
one 10.4
boy 10.4
black 10.3
model 10.1
contractor 9.7
success 9.7
lifestyle 9.4
safety 9.2
confident 9.1
holding 9.1
industrial 9.1
kin 9
team 9
cheerful 8.9
style 8.9
family 8.9
home 8.8
look 8.8
happiness 8.6
workplace 8.6
two 8.5
employee 8.4
mature 8.4
old 8.4
fashion 8.3
successful 8.2
office 8.2
group 8.1
working 8
child 7.8
employment 7.7
father 7.7
skill 7.7
outside 7.7
30s 7.7
youth 7.7
equipment 7.6
businesspeople 7.6
meeting 7.5
site 7.5
human 7.5
mother 7.4
cowboy hat 7.3
20s 7.3
businesswoman 7.3
life 7.2
hair 7.1
indoors 7
modern 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.7
person 96.3
standing 84

Color Analysis

Face analysis

AWS Rekognition

Age 2-8
Gender Male, 85.2%
Sad 90.2%
Angry 55.2%
Surprised 6.3%
Fear 6%
Calm 0.9%
Confused 0.5%
Disgusted 0.3%
Happy 0.2%

AWS Rekognition

Age 50-58
Gender Female, 99.9%
Calm 97.6%
Surprised 6.3%
Fear 5.9%
Sad 2.5%
Confused 0.5%
Angry 0.4%
Happy 0.2%
Disgusted 0.1%

AWS Rekognition

Age 48-56
Gender Male, 100%
Calm 98.8%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Happy 0.4%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 16-22
Gender Male, 71.8%
Calm 92.1%
Surprised 6.4%
Fear 6%
Sad 2.9%
Angry 2.3%
Confused 1.3%
Happy 1.3%
Disgusted 0.3%

AWS Rekognition

Age 12-20
Gender Male, 99.7%
Calm 91%
Surprised 6.4%
Fear 5.9%
Sad 4.3%
Confused 1.5%
Angry 1%
Happy 0.8%
Disgusted 0.2%

Microsoft Cognitive Services

Age 70
Gender Male

Microsoft Cognitive Services

Age 60
Gender Male

Microsoft Cognitive Services

Age 6
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Possible
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Adult 99.6%
Male 99.6%
Man 99.6%
Child 98.9%
Female 98.9%
Girl 98.9%
Hat 65.3%
Jeans 56.5%

Text analysis

Amazon

GENE
GENE NEER
NEER
JEWELER
AND
REPAIRING
DRY
OPTOM
CLEANING
NITRICK
OPER
OPER تحت
PIWO
تحت

Google

GENE NEER JEWELER ADRY CLEAN AND REPAIR
GENE
NEER
JEWELER
ADRY
CLEAN
AND
REPAIR