Human Generated Data

Title

Untitled (Omar, Scotts Run, West Virginia)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1653

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Clothing 99.7
Coat 99.7
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Railway 97
Terminal 97
Train 97
Train Station 97
Transportation 97
Vehicle 97
Person 94.3
Person 93.9
Person 93.3
Person 91.5
Face 90
Head 90
Person 87.7
Person 82.9
Person 81.4
Person 80.8
Person 77.6
Hat 75.7
Person 73.8
Jeans 70.1
Pants 70.1
Person 69.8
Person 67.5
Person 67.4
Person 67.2
Person 64.1
Outdoors 61.5
Person 60.1
Person 59.9
Person 58.3
Overcoat 57.7
Handrail 57.5
Boardwalk 56.4
Bridge 56.4
Person 56.2
Water 55.3
Waterfront 55.3

Clarifai
created on 2018-05-11

people 99.9
vehicle 99.4
adult 99.1
group together 98.8
group 98.3
railway 98.3
transportation system 98.2
train 97.7
war 97.5
two 97.4
one 97.2
monochrome 95.5
man 95.5
print 93
three 92.7
military 92.7
administration 92.7
four 91
watercraft 89.2
street 88.7

Imagga
created on 2023-10-05

track 30.5
gas pump 28.2
mechanical device 26.6
travel 26.1
pump 22.3
train 20.2
city 20
transportation 19.7
architecture 18.1
winter 17.9
mechanism 17.7
street 17.5
building 17.5
old 17.4
railway 15.7
inclined plane 15.6
landscape 15.6
snow 15.1
sky 14.7
boat 13.9
scene 13
machine 12.8
transport 12.8
railroad 12.8
rail 12.8
sidewalk 12.7
structure 12.6
device 12.5
water 12
road 11.7
trees 11.6
barrier 11.6
tourism 11.6
urban 11.4
station 11.2
town 11.1
lake 11
sea 11
house 10.9
light 10
rural 9.7
empty 9.4
journey 9.4
outdoor 9.2
mountain 8.9
weather 8.8
tourist 8.8
scenic 8.8
wall 8.8
line 8.6
day 8.6
vehicle 8.6
cold 8.6
tree 8.5
ocean 8.3
outdoors 8.2
new 8.1
black 7.9
wooden 7.9
parking meter 7.9
tracks 7.9
bridge 7.8
harbor 7.7
industry 7.7
perspective 7.5
wood 7.5
hill 7.5
destination 7.5
park 7.4
vacation 7.4
island 7.3
industrial 7.3
history 7.2
night 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.1

Face analysis

Amazon

AWS Rekognition

Age 39-47
Gender Male, 99.9%
Calm 63.1%
Angry 15.7%
Disgusted 9.3%
Surprised 7.2%
Fear 6.4%
Confused 3.4%
Sad 3.3%
Happy 2.2%

AWS Rekognition

Age 48-54
Gender Male, 99.7%
Calm 82.3%
Fear 9.8%
Surprised 6.8%
Confused 3%
Happy 2.5%
Sad 2.4%
Disgusted 1.3%
Angry 0.5%

AWS Rekognition

Age 13-21
Gender Female, 61.7%
Calm 92.5%
Surprised 6.6%
Fear 6%
Happy 3%
Sad 2.6%
Disgusted 1.2%
Confused 0.5%
Angry 0.5%

AWS Rekognition

Age 31-41
Gender Male, 99.1%
Calm 69%
Surprised 9.6%
Fear 8.6%
Disgusted 8.1%
Confused 5%
Sad 3%
Angry 2.6%
Happy 1.3%

AWS Rekognition

Age 18-26
Gender Male, 77.6%
Calm 71%
Sad 8.7%
Surprised 8.3%
Fear 7.7%
Disgusted 3.9%
Angry 2.7%
Happy 2.2%
Confused 2.1%

AWS Rekognition

Age 19-27
Gender Female, 59.2%
Calm 92.7%
Surprised 6.9%
Fear 6.7%
Sad 2.9%
Confused 0.5%
Angry 0.4%
Happy 0.4%
Disgusted 0.3%

AWS Rekognition

Age 24-34
Gender Male, 88.1%
Calm 52.8%
Confused 23.9%
Happy 8.5%
Fear 8.2%
Surprised 7.4%
Sad 4.5%
Disgusted 1.1%
Angry 0.9%

AWS Rekognition

Age 13-21
Gender Male, 67.5%
Calm 74.3%
Happy 19.9%
Surprised 7.1%
Fear 6.2%
Sad 2.7%
Angry 0.7%
Confused 0.6%
Disgusted 0.5%

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Hat 75.7%
Jeans 70.1%

Text analysis

Amazon

BRIEN
MINES
TUESDAYS
ORGE
STABLES
HARRIGAN
THE
HARDROCK
HERVET
IRENE HERVET
STABLES MINES 785
IRENE
ADM10
785
KEERDEE
WARNE

Google

TUESDAYS STABLES MINES RIEN RD ROCK
TUESDAYS
STABLES
MINES
RIEN
RD
ROCK