Human Generated Data

Title

Untitled (Nashville, Tennessee)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3524

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.4
Human 99.4
Person 99.3
Person 99.2
Person 99.2
Person 98.9
Person 98.9
Person 98.3
Pedestrian 98.2
Person 97.1
Person 96.9
Wheel 92.9
Machine 92.9
Tarmac 87.9
Asphalt 87.9
Person 85
Transportation 84
Vehicle 83.7
Car 79.6
Automobile 79.6
Person 75.9
Person 72.1
Person 71.7
Road 71.4
Truck 66.1
Clothing 64.6
Apparel 64.6
People 61.2
Person 60.8
Building 58.6
Urban 57.4
Overcoat 57
Coat 57
City 56.5
Town 56.5
Tire 56.4
Person 46.5
Person 46.3

Clarifai
created on 2023-10-25

people 99.9
street 98.4
group together 98.2
monochrome 96.6
group 96.4
adult 95.3
man 93.2
vehicle 92.2
many 89.9
woman 89.1
transportation system 83
child 81.6
war 77.4
black and white 74.9
administration 74.4
police 74.2
two 73.4
military 72.6
road 72.3
soldier 72.1
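
The Clarifai concepts above (scores in percent) have the shape of output from Clarifai's public general-image-recognition model. A minimal sketch against the v2 REST API, assuming a placeholder API key and image URL:

import requests

# Both the API key and the image URL are placeholders.
headers = {"Authorization": "Key YOUR_CLARIFAI_API_KEY"}
payload = {"inputs": [{"data": {"image": {"url": "https://example.org/shahn.jpg"}}}]}

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers=headers,
    json=payload,
    timeout=30,
)
resp.raise_for_status()

# Concept values come back in 0-1; scale to percent to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")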

Imagga
created on 2022-01-08

street 29.4
city 25.8
sidewalk 24
road 22.6
people 18.4
transportation 17.9
urban 17.5
car 16.8
travel 15.5
industrial 15.4
world 15.1
vehicle 13.9
building 12.8
transport 11.9
machine 11.8
adult 11.6
old 11.1
industry 11.1
truck 10.6
walking 10.4
track 10.4
business 10.3
men 10.3
passenger 10.3
black 10.2
person 10.2
architecture 10.2
man 10.1
danger 10
gas 9.6
crowd 9.6
work 9.4
construction 9.4
power 9.2
train 9.1
scene 8.7
heavy 8.6
motor vehicle 8.5
male 8.5
stone 8.4
sign 8.3
factory 8.1
wheeled vehicle 8.1
job 8
destruction 7.8
accident 7.8
dairy 7.8
military 7.7
sky 7.7
traffic 7.6
hat 7.5
tourism 7.4
safety 7.4
speed 7.3
dirty 7.2
women 7.1
worker 7.1
working 7.1
day 7.1
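
Imagga's tagging endpoint returns the same tag/confidence shape. A sketch against its /v2/tags REST endpoint, with placeholder credentials and image URL:

import requests

# Placeholder credentials; Imagga's API uses HTTP Basic auth.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/shahn.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
    timeout=30,
)
resp.raise_for_status()

# Imagga reports confidence on a 0-100 scale, as in the list above.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")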

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.8
vehicle 96.6
land vehicle 94.1
outdoor 92.8
person 86.3
clothing 83.6
man 78.8
people 60.5
black and white 58.6
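
The Microsoft tags match the shape of Azure Computer Vision's analyze output. A sketch against the v3.2 REST endpoint, with a placeholder resource endpoint and key:

import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
headers = {
    "Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY",  # placeholder
    "Content-Type": "application/octet-stream",
}

with open("shahn_nashville_1935.jpg", "rb") as f:  # hypothetical local copy
    resp = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers=headers,
        data=f.read(),
        timeout=30,
    )
resp.raise_for_status()

# Azure returns confidence in 0-1; scale to percent to match the list above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")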

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 99.9%
Fear 77%
Sad 16.7%
Confused 1.7%
Angry 1.6%
Surprised 1.1%
Calm 0.9%
Disgusted 0.6%
Happy 0.4%

AWS Rekognition

Age 18-26
Gender Male, 76.9%
Fear 97.2%
Disgusted 0.6%
Calm 0.6%
Surprised 0.5%
Confused 0.3%
Sad 0.3%
Angry 0.3%
Happy 0.2%

AWS Rekognition

Age 6-16
Gender Male, 67.8%
Calm 77.9%
Disgusted 5.1%
Happy 4.4%
Sad 3.7%
Fear 2.8%
Confused 2.6%
Angry 2.1%
Surprised 1.3%
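
Each AWS Rekognition block above (age range, gender with confidence, emotions ranked by score) mirrors one FaceDetails entry from a detect_faces call with full attributes. A minimal boto3 sketch, again with a hypothetical filename:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_nashville_1935.jpg", "rb") as f:  # hypothetical local copy
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; rank by confidence as in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")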

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
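
The Google Vision blocks report likelihood buckets (Very unlikely through Very likely) rather than percentages. A sketch with the google-cloud-vision client, assuming credentials are already configured and using the same hypothetical filename:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_nashville_1935.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation carries per-attribute likelihood enums.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)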

Feature analysis

Amazon

Person 99.4%
Wheel 92.9%
Car 79.6%
Truck 66.1%
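
The feature-analysis entries are the subset of Rekognition labels that carry per-object instances with bounding boxes (people, wheels, cars, trucks). A sketch that filters a detect_labels response down to those object-level hits, with the same hypothetical filename as above:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_nashville_1935.jpg", "rb") as f:  # hypothetical local copy
    response = rekognition.detect_labels(Image={"Bytes": f.read()})

# Only labels with "Instances" localize individual objects in the frame.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # ratios of image width/height
        print(f"{label['Name']} {instance['Confidence']:.1f}% "
              f"at left={box['Left']:.2f}, top={box['Top']:.2f}")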

Text analysis

Amazon

SERVICE
GAS
Gas
HEATERS
SPACE HEATERS
SPACE
17
NASHVILLE
QUICK
G-STORA
-
21
QUICK GEAT
CO
NASHVILLE Gast NERTONS CO
Gast
NASHVILLE BRIDG
BRIDG
CINES
SNOCK
GEAT
LLCANZING
NERTONS
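
The word list above, including the garbled readings of painted signage ("QUICK GEAT", "NERTONS"), is typical raw OCR output. A boto3 sketch of the corresponding detect_text call, hypothetical filename as before:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_nashville_1935.jpg", "rb") as f:  # hypothetical local copy
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE detections group words; WORD detections match the token list above.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])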

Google

aMES SERVICE GAS Gas SPACE NEATERS TSLLE eSHELLE Oasnns c
aMES
SERVICE
SPACE
TSLLE
eSHELLE
Oasnns
c
GAS
Gas
NEATERS
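
Google's block starts with one full-text line followed by individual tokens, which is how Vision API text detection structures its annotations: the first entry is the whole detected string, the rest are words. A sketch with the google-cloud-vision client, same assumptions as the face-detection sketch:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_nashville_1935.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# Entry 0 is the full concatenated string; subsequent entries are tokens.
for annotation in response.text_annotations:
    print(annotation.description)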