Human Generated Data

Title

Untitled (crashed car, people, with mile marker signs)

Date

c. 1970, from earlier negative

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18482

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.8
Human 99.8
Car 99.3
Automobile 99.3
Transportation 99.3
Vehicle 99.3
Person 97.2
Wheel 95.6
Machine 95.6
Nature 93.2
Person 88.4
Outdoors 85.5
Road 74.9
Countryside 68.2
Person 65.4
Car 64.7
Photography 62.9
Photo 62.9
Tire 62.8
Portrait 61
Face 61
Gravel 59.7
Dirt Road 59.7
Rural 59.4
Tractor 56.1

Clarifai
created on 2023-10-22

vehicle 99.6
people 96.7
car 96.6
transportation system 96.4
machine 95.2
monochrome 94.4
truck 94.3
one 90.5
industry 89.7
no person 87.9
driver 87.3
storm 85
street 84
war 82.7
tank 81.8
wheel 81.3
accident 81.2
dust 80.2
bulldozer 80.1
tractor 79.1

Imagga
created on 2022-03-04

tank 100
vehicle 100
military vehicle 89.3
tracked vehicle 85.6
armored vehicle 73.8
wheeled vehicle 48.5
conveyance 44.8
war 21.1
artillery 20.9
gun 20.6
military 20.3
car 18.2
cannon 17.8
transport 17.3
machine 17.2
transportation 17
power 16.8
army 16.6
heavy 15.3
half track 14.8
road 14.4
weapon 13.1
sky 12.8
camouflage 11.8
wheels 11.7
truck 11.6
water 11.3
equipment 11.3
armament 11.3
old 11.1
boat 11.1
industry 11.1
field artillery 10.9
city 10.8
river 10.7
automobile 10.5
high-angle gun 10.3
work 10.2
armored 9.9
travel 9.9
barrel 9.8
weaponry 9.7
motor 9.7
engine 9.6
track 9.6
motor vehicle 9.6
sea 9.5
construction 9.4
land 9.3
speed 9.2
ocean 9.1
danger 9.1
industrial 9.1
technology 8.9
armor 8.9
armed 8.8
battle 8.8
building 8.7
wheel 8.6
traffic 8.5
amphibian 8.3
street 8.3
metal 8
night 8
forces 7.9
ship 7.8
soldier 7.8
conflict 7.8
rock 7.8
dangerous 7.6
drive 7.6
vintage 7.4
protection 7.3

Microsoft
created on 2022-03-04

outdoor 99.1
text 98.4
vehicle 97.1
land vehicle 91
car 90.7
black and white 90.2
wheel 79.1
white 63.1
tire 52
curb 7.5

Face analysis

Amazon

AWS Rekognition

Age 27-37
Gender Male, 99.8%
Calm 90.1%
Sad 2.4%
Confused 2.1%
Angry 1.4%
Happy 1.2%
Surprised 1.2%
Fear 1%
Disgusted 0.6%

AWS Rekognition

Age 24-34
Gender Male, 99.9%
Calm 66%
Sad 26%
Confused 2.7%
Fear 1.6%
Happy 1.3%
Surprised 1.1%
Disgusted 0.9%
Angry 0.5%

AWS Rekognition

Age 29-39
Gender Female, 63.6%
Happy 60%
Sad 36.7%
Fear 1.6%
Surprised 0.6%
Angry 0.4%
Confused 0.2%
Disgusted 0.2%
Calm 0.2%

Feature analysis

Amazon

Person
Car
Wheel
Person 99.8%
Person 97.2%
Person 88.4%
Person 65.4%
Car 99.3%
Car 64.7%
Wheel 95.6%

Categories

Imagga

cars vehicles 99.3%

Text analysis

Amazon

HUDSON
LONDONDERRY
LOWELL
NASHUA
PELHAM
MANCHESTER
LONDONDERRY 5
15
5
11
7
9
M
RODOR
RODOR COVEETA REET
COVEETA
REET

Google

t HUDSON + NASHUA LONDONDERRY 5 MANCHESTER 11 < PELHAM 9 < LOWELL 15
t
HUDSON
+
NASHUA
LONDONDERRY
5
MANCHESTER
11
<
PELHAM
9
LOWELL
15