Human Generated Data

Title

Untitled (close-up front view of crashed sedan)

Date

c. 1930-1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4241

Machine Generated Data

Tags (each tag is followed by its confidence score out of 100)

Amazon
created on 2019-06-01

Machine 98.5
Wheel 98.5
Wheel 97.8
Workshop 97.3
Transportation 96.8
Vehicle 96.8
Car 96.8
Automobile 96.8
Car 93.1
Building 85.8
Spoke 60
Nature 59.6
Car 58.7
Countryside 58.2
Shelter 58.2
Outdoors 58.2
Rural 58.2
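
The scores above are label-detection confidences reported by AWS Rekognition. As an illustration only, not the museum's documented pipeline, the Python sketch below shows how comparable labels could be requested with boto3; the file name crashed_sedan.jpg and the detect_labels helper are hypothetical.

# Hedged sketch: retrieving image labels with AWS Rekognition via boto3.
# The image path is a placeholder; AWS credentials are assumed to be configured.
import boto3

def detect_labels(image_path, min_confidence=50):
    """Return (label name, confidence) pairs for a local image file."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("crashed_sedan.jpg"):
        print(f"{name} {confidence:.1f}")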

Clarifai
created on 2019-06-01

vehicle 99.9
transportation system 99.4
car 99.3
people 99.2
street 96.2
old 95
engine 94.2
vintage 92.8
group 92.3
two 92.3
abandoned 92.3
adult 90.6
retro 90.4
wheel 90.3
broken 90.2
one 89.7
no person 88.7
man 87.3
classic 86
drive 85.3

Imagga
created on 2019-06-01

car 52.9
vehicle 52.2
transportation 39.4
automobile 29.7
transport 28.3
auto 27.7
wheel 24.1
device 21.4
engine 21.2
old 20.9
road 19.9
drive 19.8
truck 16.8
speed 16.5
metal 16.1
cockpit 15.6
power 15.1
vintage 14
travel 12.7
machine 12.6
driving 12.6
rusty 12.4
motor 12.4
traffic 12.3
street 12
abandoned 11.7
rust 11.6
electric 11.5
classic 11.1
tire 11.1
garage 11
iron lung 10.8
hood 10.8
wreck 10.8
crash 10.8
accident 10.7
highway 10.6
technology 10.4
steel 9.7
broken 9.6
motor vehicle 9.5
headlight 9
equipment 8.9
bumper 8.9
train 8.9
parking 8.8
wheeled vehicle 8.7
track 8.6
respirator 8.6
industry 8.5
black 8.4
danger 8.2
shiny 7.9
passenger 7.9
antique 7.9
wheels 7.8
luxury 7.7
sky 7.6
chrome 7.5
fast 7.5
sports 7.4
military vehicle 7.4
tank 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

car 91.5
vehicle 91.4
land vehicle 88.4
wheel 85.6
black and white 78.9
tire 54.1

Face analysis

Amazon

AWS Rekognition

Age 12-22
Gender Male, 59.2%
Surprised 7.4%
Sad 28.1%
Happy 13.4%
Angry 6%
Disgusted 6.4%
Confused 7.3%
Calm 31.4%
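
The age range, gender, and emotion estimates above are the kind of per-face attributes returned by AWS Rekognition's DetectFaces operation. The sketch below is likewise illustrative rather than the museum's actual workflow; the analyze_faces helper and the image path are assumptions.

# Hedged sketch: per-face age, gender, and emotion estimates via boto3.
import boto3

def analyze_faces(image_path):
    """Print age range, gender, and emotion confidences for each detected face."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

if __name__ == "__main__":
    analyze_faces("crashed_sedan.jpg")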

Feature analysis

Amazon

Wheel 98.5%
Car 96.8%
