Human Generated Data

Title

Untitled (crashed car, people, with mile marker signs)

Date

1954

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18489

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.8
Human 99.8
Automobile 99.1
Car 99.1
Transportation 99.1
Vehicle 99.1
Machine 97.1
Wheel 97.1
Person 96.5
Nature 94.6
Person 86.9
Road 82.7
Outdoors 81.9
Person 79
Car 70.4
Dirt Road 64.3
Gravel 64.3
Tarmac 60.2
Asphalt 60.2
Car 59.6
Offroad 57.8
Tire 57.4
People 57.2
Bumper 56.6

Imagga
created on 2022-03-04

vehicle 53.5
car 50.4
tank 28.9
military vehicle 28.1
motor vehicle 25.6
wheeled vehicle 24.1
tracked vehicle 23.5
transportation 22.4
armored vehicle 20.3
auto 20.1
racer 18.8
speed 17.4
automobile 17.2
road 17.2
wheel 16.2
travel 16.2
power 15.9
wreckage 15.8
transport 15.5
water 15.3
part 14.7
engine 14.4
landscape 14.1
industrial 13.6
conveyance 13.2
sea 12.6
drive 12.3
industry 12
danger 11.8
ocean 11.6
sky 11.5
old 11.1
dirty 10.8
motor 10.8
river 10.7
military 10.6
truck 10.6
street 10.1
vintage 9.9
sand 9.9
metal 9.7
scene 9.5
machine 9.2
city 9.1
destruction 8.8
urban 8.7
device 8.7
broken 8.7
war 8.7
track 8.6
work 8.6
outdoor 8.4
smoke 8.4
gun 8.3
tourism 8.2
coast 8.1
man 8.1
cannon 8
light 8
building 7.9
day 7.8
desert 7.8
rock 7.8
black 7.8
driving 7.7
motion 7.7
construction 7.7
heavy 7.6
clouds 7.6
traffic 7.6
beach 7.6
weapon 7.5
field 7.5
land 7.5
fast 7.5
boat 7.4
environment 7.4
cockpit 7.4
protection 7.3
tranquil 7.2
scenery 7.2
skeleton 7.2

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 99.2
outdoor 98.2
vehicle 95.3
black and white 89.1
car 85.1
land vehicle 80
white 69.4
old 58.7
ship 51.8

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Male, 96.1%
Happy 79.4%
Fear 14.7%
Sad 2.8%
Calm 1.4%
Surprised 0.9%
Disgusted 0.3%
Angry 0.3%
Confused 0.2%

AWS Rekognition

Age 31-41
Gender Male, 73%
Calm 30.6%
Sad 27.8%
Happy 26.2%
Fear 9.7%
Disgusted 1.9%
Confused 1.6%
Angry 1.2%
Surprised 1%

AWS Rekognition

Age 23-33
Gender Male, 99.6%
Calm 71.6%
Sad 25.6%
Confused 2%
Happy 0.3%
Disgusted 0.2%
Surprised 0.1%
Angry 0.1%
Fear 0%

Feature analysis

Amazon

Person 99.8%
Car 99.1%
Wheel 97.1%

Captions

Microsoft

a vintage photo of a person driving a car 62.9%
a vintage photo of a car 62.8%
a vintage photo of a parked car 62.7%

Text analysis

Amazon

HUDSON
LOWELL
NASHUA
LONDONDERRY
PELHAM
MANCHESTER
LONDONDERRY 5
15
5
11
7
9
و
UN

Google

+ HUDSON + NASHUA LONDONDERRY 5 MANCHESTER 11 < PELHAM + LOWELL 15
LONDONDERRY
<
5
PELHAM
+
LOWELL
15
HUDSON
NASHUA
MANCHESTER
11