Human Generated Data

Title

Untitled (Omar, Scotts Run, West Virginia)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1650

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2023-10-05

Clothing 100
Adult 99.7
Male 99.7
Man 99.7
Person 99.7
Coat 99.3
Walking 99.2
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Person 98.6
City 98.5
Road 98.5
Street 98.5
Urban 98.5
Stick 98.4
Person 96.3
Person 94.8
Machine 94
Wheel 94
Coat 87.9
Person 83.9
Footwear 74.7
Shoe 74.7
Motorcycle 69.3
Transportation 69.3
Vehicle 69.3
Overcoat 64.8
Spoke 58
Cane 57.2
Jeans 55.6
Pants 55.6
Back 55.6
Body Part 55.6
Bicycle 55.5
Cycling 55.5
Sport 55.5
Hat 55.1
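
The Amazon list above is typical output from AWS Rekognition's label-detection endpoint, with confidences reported on a 0-100 scale. A minimal sketch of such a call with boto3 follows; the filename and both thresholds are illustrative assumptions, not values recorded with this object.

    # Sketch: label detection with AWS Rekognition via boto3.
    # "shahn_omar.jpg" and both thresholds are illustrative assumptions.
    import boto3

    client = boto3.client("rekognition")  # credentials/region from the environment

    with open("shahn_omar.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,        # cap on returned labels
        MinConfidence=55.0,  # the list above bottoms out near 55
    )

    # Print "Label confidence" pairs, mirroring the tag list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")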

Clarifai
created on 2018-05-11

people 99.9
group together 99.2
adult 97.5
man 96.8
street 96.6
group 94.9
vehicle 94.4
two 93.1
administration 92.5
military 92.2
wear 92.1
transportation system 91.7
one 91
uniform 89.8
four 89.5
police 88.9
war 88
road 87.4
leader 84.5
three 83.4
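
The Clarifai concepts above come from its general prediction model; the raw API reports values on a 0-1 scale, shown here as percentages. Below is a hedged sketch against the v2 predict REST endpoint, assuming an app-scoped API key and the general-model alias; newer API versions also expect user and app ids in the path.

    # Sketch: concept tagging with Clarifai's v2 predict endpoint.
    # The API key, image URL, and model id are illustrative assumptions.
    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"       # assumption: app-scoped key
    MODEL_ID = "general-image-recognition"  # alias of Clarifai's general model

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/omar.jpg"}}}]},
    )
    resp.raise_for_status()

    # Concepts carry a 0-1 "value"; scale to percent to match the list above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")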

Imagga
created on 2023-10-05

jinrikisha 100
cart 81.4
wagon 61.8
wheeled vehicle 41.5
seller 40.9
man 34.3
people 25.7
vehicle 24.3
male 22
person 21.3
street 20.2
adult 20
city 18.3
walking 16.1
lifestyle 13.7
suit 13.7
active 13.5
fashion 12.8
business 12.8
urban 12.2
road 11.7
walk 11.4
happy 11.3
outdoors 11.2
attractive 11.2
old 10.4
men 10.3
industry 10.2
musical instrument 10
outdoor 9.9
hat 9.8
businessman 9.7
one 9.7
couple 9.6
guy 9.5
building 9.5
women 9.5
work 9.4
senior 9.4
transportation 9
activity 9
job 8.8
standing 8.7
happiness 8.6
outside 8.6
smile 8.5
portrait 8.4
wheelchair 8.3
leisure 8.3
care 8.2
protection 8.2
industrial 8.2
life 8
accordion 7.9
corporate 7.7
summer 7.7
sky 7.7
two 7.6
car 7.6
help 7.4
safety 7.4
back 7.3
worker 7.3
looking 7.2
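
Imagga's top hits ("jinrikisha", "cart", "wagon") are likely its reading of the wheeled machine the other services tag as Machine/Wheel. Its v2 /tags endpoint reports confidence on a 0-100 scale; a sketch with assumed credentials and image URL:

    # Sketch: tagging with Imagga's v2 /tags endpoint (HTTP basic auth).
    # Credentials and the image URL are illustrative assumptions.
    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/omar.jpg"},
        auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),
    )
    resp.raise_for_status()

    # Imagga already reports confidence on a 0-100 scale, as listed above.
    for tag in resp.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")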

Google
created on 2018-05-11
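
No Google tags are listed for this record, although the text-analysis section below does include Google OCR output, so the Cloud Vision API was evidently run. For reference, a label request via the official client library looks roughly like this; the image URI is an illustrative assumption.

    # Sketch: label detection with the Google Cloud Vision client library.
    # The image URI is an illustrative assumption.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # credentials from the environment
    image = vision.Image()
    image.source.image_uri = "https://example.org/omar.jpg"

    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")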

Microsoft
created on 2018-05-11

outdoor 99.3
person 97.6
way 41.5
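
Microsoft's three tags match the shape of the Azure Computer Vision analyze endpoint with visualFeatures=Tags, which returns 0-1 confidences. A sketch with an assumed endpoint, key, and image URL:

    # Sketch: tagging with the Azure Computer Vision REST analyze endpoint.
    # Endpoint, key, API version, and image URL are illustrative assumptions.
    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"},
        json={"url": "https://example.org/omar.jpg"},
    )
    resp.raise_for_status()

    # Confidences come back 0-1; scale to percent to match the list above.
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")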

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 31-41
Gender Male, 77.2%
Calm 78.2%
Surprised 9.8%
Fear 6.3%
Sad 5.5%
Angry 2.9%
Disgusted 2.3%
Happy 1.9%
Confused 1.2%

AWS Rekognition

Age 28-38
Gender Female, 96.1%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 0.4%
Happy 0.1%
Confused 0.1%
Disgusted 0.1%
Angry 0%

AWS Rekognition

Age 11-19
Gender Male, 80.3%
Fear 52.5%
Calm 30.4%
Angry 12.3%
Surprised 6.7%
Confused 5.4%
Sad 4.4%
Happy 1.5%
Disgusted 1.5%

AWS Rekognition

Age 13-21
Gender Male, 94.5%
Calm 91.6%
Surprised 6.6%
Fear 6.2%
Sad 4%
Angry 0.8%
Confused 0.6%
Disgusted 0.5%
Happy 0.4%
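
Each block above corresponds to one entry in an AWS Rekognition detect_faces response requested with Attributes=["ALL"]: an estimated age range, a gender guess with its confidence, and a full set of emotion scores. A sketch of that call; the filename is again an illustrative assumption.

    # Sketch: face attribute estimation with AWS Rekognition detect_faces.
    # "shahn_omar.jpg" is an illustrative assumption.
    import boto3

    client = boto3.client("rekognition")
    with open("shahn_omar.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()},
                                       Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unordered; sort to mirror the listings above.
        for emo in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emo['Type'].title()} {emo['Confidence']:.1f}%")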

Feature analysis

Amazon

Adult 99.7%
Male 99.7%
Man 99.7%
Person 99.7%
Coat 99.3%
Wheel 94%
Shoe 74.7%
Motorcycle 69.3%
Jeans 55.6%
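
This shorter list repeats labels from the Amazon tags above, plausibly the subset whose detect_labels entries carry instance bounding boxes; that filter is an assumption about how the page is assembled, not something documented here. A sketch of it:

    # Sketch: keep only detect_labels results that localize object instances.
    # This filter is an assumption about how "Feature analysis" is derived.
    import boto3

    client = boto3.client("rekognition")
    with open("shahn_omar.jpg", "rb") as f:  # illustrative filename
        response = client.detect_labels(Image={"Bytes": f.read()},
                                        MinConfidence=55.0)

    for label in response["Labels"]:
        if label["Instances"]:  # labels tied to a bounding box in the image
            print(f"{label['Name']} {label['Confidence']:.1f}%")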

Text analysis

Amazon

HARRIGAN
ROCK
OBRIEN
THEATRE
MAR THEATRE
MAR
Georgo
'HARD ROCK
'HARD
9

Google

MAR THEATRE HAR ROCK HARRIGAN OBRIEN
MAR
THEATRE
HAR
ROCK
HARRIGAN
OBRIEN
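
The Amazon lines above are AWS Rekognition detect_text output; the Google lines would come from Cloud Vision text detection, whose first annotation is the full recovered string followed by individual words. Rekognition returns both LINE and WORD detections, which is why "MAR THEATRE" and "MAR" each appear. A sketch of the Rekognition call, with an assumed filename:

    # Sketch: scene-text detection with AWS Rekognition detect_text.
    # "shahn_omar.jpg" is an illustrative assumption.
    import boto3

    client = boto3.client("rekognition")
    with open("shahn_omar.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # LINE detections are whole phrases, WORD detections single tokens -
    # hence both "MAR THEATRE" and "MAR" in the list above.
    for det in response["TextDetections"]:
        print(f"{det['Type']}: {det['DetectedText']} ({det['Confidence']:.1f}%)")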