Transportation’s always been a rather futuristic endeavor. When the first horseless carriages arrived on the scene, I’m sure a good number of people were astonished and amazed at the technology involved. “What will they come up with next?” was likely heard quite a bit.
But as you know, our inventions have taken us far further. We now have trains that travel hundreds of miles per hour, airplanes that take us around the world in a matter of hours, and cars that practically drive themselves.
Then again, that last one isn’t quite operational yet.
Yes, there are quite a few cars on the market that claim to have ‘self-driving’ capabilities. But as automakers like Tesla are finding out, some major problems need to be worked out before self-driving cars hit the mainstream and most of us feel comfortable riding in them.
Take what happened around Thanksgiving in San Francisco, for example.
According to the UK Daily Mail, a man was driving his Tesla Model S across the Bay Bridge. He was traveling around 55 miles per hour and moving with traffic when the vehicle began to change lanes and then suddenly slammed on the brakes, abruptly slowing the car to around 20 miles per hour.
The result was a chain-reaction accident: an eight-car pileup that shut down two lanes of Interstate 80 for about 90 minutes, sent two people (including one child) to the hospital, and left 16 others injured at the scene.
According to the driver, he had the vehicle in “full self-driving” mode.
This means that the car can change lanes, accelerate, brake, and for the most part, drive by itself without much driver intervention. However, according to the manufacturer's warning, it still requires an alert and attentive human driver at the wheel should something unforeseen occur.
But in this case, there's not much the driver could have done. There wasn't a car coming at him; there wasn't anything for him to see that the car couldn't. It simply braked hard all of a sudden.
It is important to note that this occurred just hours after Elon Musk had announced that the “full self-driving” software could be downloaded and used by all Tesla drivers. Before this, only drivers with high safety marks had been allowed to use the software.
But it's also crucial to point out that this driver isn't the only one who has experienced problems, or even this exact issue, when operating the car in "full self-driving" mode. In fact, according to CNN, Tesla itself warns drivers upon downloading the software that it "may do the wrong thing at the worst time," hence the need for an alert human driver.
And dozens of Tesla drivers have reported similar instances where the car has suddenly applied the brakes “without warning.” Some have nearly had accidents as a result.
Naturally, such instances, and a growing number of them, have led the National Highway Traffic Safety Administration to consider opening an investigation into both the accidents allegedly caused by the software and the software itself.
According to the agency, the problem is serious enough that, depending on the results of the investigation, a recall might be required.
Tesla and its software are also not on good terms with the California Department of Motor Vehicles, which claims that the automaker has been deceptive in its marketing of both its self-driving and autopilot software.
The department alleges that the company portrays the software as something that turns the car into an "autonomous vehicle" with no need for human intervention. Clearly, that is not the reality.
In 2018, an accident left two teens dead in what has since been ruled negligence on Tesla's part. In 2020, three deaths were reported by The Associated Press from separate autopilot accidents. In 2021, multiple accidents were reported, including one that left a 35-year-old man dead after his car sent him careening into an overturned semi.
This year, eleven people have died in just one four-month period due to EV crashes involving automated driving systems.
As I said, the technology might be making headway. But it's apparently nowhere near where it needs to be, and until it is, you won't catch me getting into a self-driving car.