Human Generated Data

Title

Untitled (Newark, Ohio)

Date

July 1938-August 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2653

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Walking 100
City 100
Road 100
Street 100
Urban 100
Car 98.7
Transportation 98.7
Vehicle 98.7
Person 98.4
Person 98.2
Car 95.8
Person 95.8
Tarmac 95.4
Person 95.4
Clothing 94.5
Coat 94.5
Person 94.2
Person 93.6
Person 93.5
Machine 92.8
Wheel 92.8
Pedestrian 92.2
Wheel 92.2
Path 91.6
Sidewalk 91.6
Wheel 91.5
Person 91.5
Wheel 91.5
Neighborhood 91.1
Person 90.8
Wheel 87.7
Person 86.4
Formal Wear 85.9
Suit 85.9
Footwear 85.4
Shoe 85.4
Wheel 83.9
Shoe 83.8
Person 83.6
Alloy Wheel 81.6
Car Wheel 81.6
Spoke 81.6
Tire 81.6
Accessories 64.4
Bag 64.4
Handbag 64.4
Shoe 63.6
Person 62.9
Shoe 62.5
Person 59.1
Intersection 58
Suv 57.6
Sedan 57.2
Metropolis 57
Coupe 56.8
Sports Car 56.8
License Plate 55.5
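
The Amazon tags above are confidence scores (in percent) for labels of the kind returned by AWS Rekognition's label detection. A minimal sketch of how such tags could be generated with boto3 follows; the local filename and the confidence floor are illustrative assumptions, not part of the museum record.

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the photograph.
    with open("newark_ohio_1938.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55.0,  # roughly the lowest score in the list above
    )

    for label in response["Labels"]:
        # Prints lines like "Car 98.7", mirroring the tag list format above.
        print(f"{label['Name']} {label['Confidence']:.1f}")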

Clarifai
created on 2018-05-10

people 99.7
street 99.6
monochrome 98.1
group together 98
car 97.9
road 97.8
vehicle 97.4
adult 97.3
transportation system 97
pavement 95.1
man 94.5
police 93.5
group 92.8
many 89.9
horizontal plane 87.4
administration 87.3
city 84.6
woman 84.2
parking lot 83.9
driver 83

Imagga
created on 2023-10-07

sidewalk 100
street 46
city 45.7
road 35.2
car 32.4
urban 32.3
traffic 25.7
travel 24.6
transportation 24.2
architecture 22.7
people 20.1
building 20
transport 19.2
cars 17.6
vehicle 16.7
sport 16.2
buildings 16.1
town 15.8
speed 14.7
automobile 14.4
outdoors 14.2
intersection 13.8
auto 13.4
business 13.4
runner 13.1
men 12.9
motion 12.8
athlete 12.6
crowd 12.5
drive 12.3
sky 12.1
walking 11.4
day 11
tourism 10.7
person 10.7
center 10.5
life 10.3
tourist 10
rush 9.8
pavement 9.8
trees 9.8
highway 9.6
downtown 9.6
scene 9.5
line 9.4
outdoor 9.2
square 9
taxi 8.9
high 8.7
move 8.6
walk 8.6
tree 8.5
modern 8.4
cab 8.3
man 8.1
women 7.9
automobiles 7.9
parking 7.9
adult 7.8
driving 7.7
summer 7.7
cityscape 7.6
park 7.4
new 7.3
passenger 7.3
job 7.1
working 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

road 100
outdoor 99.9
building 99.6
street 94.7
city 83.3
scene 74.5
way 71.6
sidewalk 59

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 42-50
Gender Male, 99.6%
Calm 70.9%
Happy 12.1%
Sad 8.1%
Surprised 7.7%
Fear 6.4%
Disgusted 1.4%
Angry 1.2%
Confused 0.7%

AWS Rekognition

Age 22-30
Gender Female, 51.7%
Sad 94.8%
Happy 26.7%
Calm 13.6%
Fear 7.5%
Surprised 7.4%
Disgusted 3%
Angry 1.9%
Confused 1.9%

AWS Rekognition

Age 21-29
Gender Male, 100%
Calm 82.8%
Happy 15.6%
Surprised 6.4%
Fear 5.9%
Sad 2.3%
Angry 0.4%
Confused 0.2%
Disgusted 0.1%

AWS Rekognition

Age 11-19
Gender Male, 97.9%
Calm 85.8%
Fear 7%
Surprised 6.4%
Sad 4.5%
Angry 2.6%
Disgusted 1.3%
Confused 1.1%
Happy 0.8%
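
The age, gender, and emotion estimates above match the shape of AWS Rekognition face analysis output. A minimal sketch, again assuming a hypothetical local copy of the image, of retrieving such attributes with boto3:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("newark_ohio_1938.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, and emotion scores
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")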

Feature analysis

Amazon

Car 98.7%
Person 98.4%
Wheel 92.8%
Shoe 85.4%
Handbag 64.4%

Categories

Text analysis

Amazon

TRUST
OF
COMPANY
THE
HUB
NEWARK
N
THE UN N TRUST COMPANY OF NEWARK
ARCAD
OCKED
OCKED 50
50
UN
BILLIARDS
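
The Amazon text fragments above (e.g. "OCKED", "ARCAD") are typical of OCR on partially occluded signage. A minimal sketch, under the same assumptions as the earlier sketches, of extracting such text with AWS Rekognition:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("newark_ohio_1938.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        # LINE entries are whole phrases ("THE UN N TRUST COMPANY OF NEWARK");
        # WORD entries are individual tokens ("HUB", "BILLIARDS", "50").
        print(detection["Type"], detection["DetectedText"])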

Google

THE UNI N TRUST COMPANY OF NEWARK HUB OCKED 50
THE
UNI
N
TRUST
COMPANY
OF
NEWARK
HUB
OCKED
50