Human Generated Data

Title

Untitled (Omar, Scotts Run, West Virginia)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1660

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Machine 99.5
Spoke 99.5
Person 99.1
Person 98.8
Person 98.7
Person 98.7
Person 98.5
City 98.3
Road 98.3
Street 98.3
Urban 98.3
Motorcycle 97.9
Transportation 97.9
Vehicle 97.9
Person 97.9
Person 97.8
Adult 97.8
Male 97.8
Man 97.8
Wheel 96.4
Clothing 96.1
Coat 96.1
Person 95.8
Person 89.2
Person 88.7
Wheel 87.5
Motor 86.4
Footwear 85.4
Shoe 85.4
Hat 84
Hat 82
Hat 76.4
Railway 66.8
Train 63
Person 60.2
Alloy Wheel 57.8
Car 57.8
Car Wheel 57.8
Tire 57.8
Hat 57.4
Bicycle 57.3
Cycling 57.3
Sport 57.3
Terminal 57.2
Train Station 57.2
Hat 57
Outdoors 56.9
Path 55.9
Sidewalk 55.9
Architecture 55.4
Building 55.4
Shelter 55.4
Engine 55.2
Overcoat 55.1

Clarifai
created on 2018-05-11

people 99.9
group together 99.4
group 98.3
adult 97.9
vehicle 97.1
administration 96.7
many 96.6
police 96.1
military 95.9
war 94.8
soldier 92.9
man 92.8
street 92.8
leader 90
two 87
one 86
road 85.2
several 84.7
crowd 84.7
transportation system 84.2

Imagga
created on 2023-10-07

jinrikisha 81.7
cart 67.7
wagon 50.1
seller 43.8
wheeled vehicle 37.6
vehicle 34.2
street 30.4
snow 24.7
transportation 23.3
road 20.8
wheelchair 19.1
city 18.3
old 15.3
wheel 15.2
outdoors 14.9
building 14.3
travel 14.1
chair 14
winter 13.6
car 13.4
architecture 13.3
urban 13.1
drive 12.3
cold 12.1
sidewalk 11.9
transport 11.9
carriage 11.5
machine 11.1
traffic 10.4
landscape 10.4
people 10
brick 9.4
outside 9.4
construction 9.4
man 9.4
industry 9.4
town 9.3
industrial 9.1
truck 8.9
cars 8.8
wheels 8.8
support 8.7
scene 8.7
weather 8.5
outdoor 8.4
help 8.4
care 8.2
equipment 8.2
light 8
mobility 7.8
tree 7.7
sky 7.7
lamp 7.6
seat 7.6
park 7.4
ice 7.4
day 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.7
black 69.1
white 69

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 18-26
Gender Male, 97.3%
Happy 56.3%
Sad 34.3%
Fear 10.8%
Surprised 7.4%
Angry 4.4%
Calm 2.6%
Disgusted 1.8%
Confused 0.9%

AWS Rekognition

Age 4-10
Gender Male, 57.6%
Angry 93.2%
Surprised 6.3%
Fear 6%
Sad 3.7%
Happy 1.1%
Calm 0.8%
Disgusted 0.2%
Confused 0.1%

AWS Rekognition

Age 25-35
Gender Male, 95.1%
Calm 71%
Happy 14.1%
Surprised 7.7%
Fear 6.4%
Disgusted 5.7%
Sad 3.5%
Angry 1.3%
Confused 0.6%

AWS Rekognition

Age 16-24
Gender Female, 100%
Calm 28.9%
Fear 26%
Confused 13.5%
Happy 9.6%
Sad 9.4%
Surprised 7.9%
Disgusted 6.6%
Angry 2%

Feature analysis

Amazon

Person 99.1%
Motorcycle 97.9%
Adult 97.8%
Male 97.8%
Man 97.8%
Wheel 96.4%
Shoe 85.4%
Hat 84%

Categories

Text analysis

Amazon

HARRIGAN
OMAR
OMAR THEATRE
THEATRE
ROCK
SUN
OBRIEN
NOV
SUN MON
LOOK OUT
ROBERTAN
HARD
MON
fors
LOOK OUT FOR LOODMOTIVE
Tursday HARD ROCK
NOV y
Ocorgo
FOR LOODMOTIVE
y
МОР
Tursday

Google

HARD ROCKCo HARRIGAN OBRE
HARD
ROCKCo
HARRIGAN
OBRE