Why Self-Driving Cars Are NOT The Future
By Zack Pack
It seems like self-driving cars have always been the hot item for disruptive innovators in the auto industry. The image of sitting back, Minority Report style, in a sleek new vehicle with an AI-powered robot that effortlessly guides your car across town has bewitched many starry-eyed investors. In fact, over $100 billion has been invested in self-driving technology since it was first seriously proposed. But what have these hopeful investors received for their money besides some flashy tech showcases? Why has it been so difficult for self-driving cars to go mainstream?
Critics point to the technology as being deficient. It’s nearly impossible for computers to understand all the real-world nuances that human drivers must interpret each time they take the wheel. Poorly paved roads, unpainted lines, a fallen tree… these are things human beings can quickly recognize and adapt to. For even the most sophisticated AI brains, though, there’s no comparison. There’s just too much that could go wrong that they can’t predict. Even Anthony Levandowski, the co-founder of Google’s self-driving program, is dubious. Fully autonomous self-driving cars are “an illusion,” he told Bloomberg. For every successful demo, there might be dozens of failed ones. Until AI can capture all the context a human brain absorbs while driving, it will remain a technology that’s less adaptable and more prone to accidents than its human counterpart.
So are self-driving cars just a big scam? Well… kinda. While Tom Cruise isn’t going to be jumping from one self-driving car to the next anytime soon, the auto industry has come a long way towards driver-assisted cars. The “self-driving” technology currently on the market amounts to cars that will “automatically brake for you if they anticipate a collision, cars that help keep you in your lane, and cars that can mostly handle highway driving.” In each of these cases, though, there’s still a person behind the wheel supervising everything and avoiding accidents.
Even if we could clear the technological hurdle and develop AI that makes self-driving cars as safe as human-driven ones, they’d still have quite a few other hurdles to overcome before going mainstream.
The biggest hurdle, perhaps, is the problem of liability.
Last week, a man in North Carolina was driving at night, following his GPS. The GPS led him to a bridge that he couldn’t see was unfinished. He drove off the bridge, crashed upside down in the river below, and died. His GPS didn’t show that a portion of the bridge had been washed away – instead it went on mindlessly recommending it as the fastest route. After the man’s death, questions came up about who should be held responsible. Was it all the man’s fault? What about the fault of the city for not repairing the bridge? The state? The bridge’s builder? What about the makers of the GPS technology that got it wrong? Should they pay out? It wasn’t clear where the fault lay, and for that reason all parties involved were vulnerable to lawsuits.
Another news story from last week: an attorney in Orange County filed a class-action suit against Kia and Hyundai for not installing immobilizers in their older-model vehicles. An immobilizer prevents the engine from starting without the associated key fob present. The lack of these devices, the attorney argued, has made Kias and Hyundais easier to steal and popular targets for car thieves, who then go on to use the stolen vehicles in other crimes. While the suit seems like an obvious case of blaming the victim instead of the criminal, it also highlights how the question of “who is to blame for automotive deficiencies” is far from settled.
The list of liabilities continues to expand as well. The National Highway Traffic Safety Administration (NHTSA) has only demanded more and more accountability from car manufacturers regarding auto safety regulations over the years. According to NHTSA (an arm of the Department of Transportation), all vehicles MUST include specific types of seatbelts, manufacturers MUST disclose where all their parts are assembled (via the Labeling Act), they MUST follow all cybersecurity restrictions, and if a new safety recall should arise, the manufacturer MUST fix the affected vehicles at its own expense. Today, about one in four vehicles on the road has an unresolved safety recall, a share that has grown every year since the recall program’s inception. While some may say that much oversight around safety is a good thing, it also does a lot to discourage manufacturers from sticking their necks out for potentially unsafe innovations.
The EPA is also squeezing vehicle manufacturers with new regulations – tightening its emission standards and adding restrictions that automakers find increasingly difficult to abide by. As David Shepardson of Reuters reported,
New rules [that] take effect in the 2023 model year… require a 28.3% reduction in vehicle emissions through 2026. The rules will be challenging for automakers to meet, especially Detroit’s Big Three automakers: General Motors (GM.N), Ford Motor Co (F.N) and Chrysler-parent Stellantis NV (STLA.MI).
With all this red tape, automotive manufacturers are already feeling the weight of Big Brother pressing on their shoulders, and they will be reluctant to go all in on self-driving vehicles until the safety concerns have been rigorously tested and approved to the point that they can be sufficiently indemnified against lawsuits.
Further, what’s going to happen when one of those fully autonomous self-driving cars accidentally runs someone over? Even if the manufacturers can’t be held accountable (unlikely), our lawsuit-heavy country will find someone to blame. And when it does, that someone is going to have to shell out big time. This opens the door for all sorts of unscrupulous activity. Folks will game the system, jumping in front of self-driving cars and pretending they were hit, hoping for a big payday. Eventually, the lawsuits will stack up until legislators attempt to settle the scope of responsibility for each participant, which will then be challenged, then appealed, until the question of who is responsible dissolves into the land of legalese, where few innovations ever come out alive.
Perhaps in another country, with a more authoritarian government, the liability issues could be overcome. Or maybe self-driving cars could be controlled remotely by gig workers using VR or something (so someone is still driving the car, they’re just not doing it from the inside), and they’d be the ones accountable. However, the dream of self-driving vehicles still faces a world of hurdles before it can become a reality in the USA, and that is something eager investors should consider before opening up their checkbooks.
There’s a difference between the technology that’s possible and the technology that’s feasible. Perhaps self-driving cars will someday become possible from a purely technological standpoint. However, it’s far more likely we’ll continue to see them through the lens of a Hollywood movie rather than through our own windshields.
14 Comments
Su Terry
Many good points here. I especially love this: “It’s nearly impossible for computers to understand all the real-world nuances that human drivers must interpret each time they take the wheel. Poorly paved roads, unpainted lines, a fallen tree… for human beings these are things we can quickly recognize and adapt to.”
The captcha screens where you have to choose all the tractors to gain access are exactly the kind of thing humans can do that A.I. can’t.
And I don’t think AI-assisted vehicles are going to be any help either. The first innovation of this sort was probably ABS. I’ve always hated ABS (the anti-lock braking system) and felt it is for people who can’t drive. If you know how to drive, ABS is just a hindrance.
Bravo. Tell it like it is.
Leopardeyes
I have flown Harriers for many years, and I find it far easier to navigate the skies in three dimensions than to drive in heavy freeway traffic at speed. The human eye can detect and anticipate illogical movements by other drivers, often from small changes in their speed or direction. The programming for AI control of a vehicle must be able to intuitively sense such actions while compensating for potholes, rough pavement, brake lights sequentially coming down the line of traffic toward your vehicle, etc. I was once told the method for mastering LA traffic was to watch the car about a quarter mile in front of you and do what that car does; it worked well in the 1980s. Will an AI chip know such things? Probably not.
tex2519
Self-driving cars or EVs are not for me.
If I can’t get fuel for my vehicles, I’ll ride one of my horses.
…and YES, I would revert back.
Julie Boatner
I live in the Phoenix metro area and we have Waymo cars that are driverless. I imagine this article is saying that individuals will not be owning driverless cars, but they are certainly in use here for hire.
Doyle
If you enter a driverless vehicle you are subjecting yourself to the whims of its program, a program that will make life-and-death decisions for you, and most likely not in your favor. That aside, the sensor package has to be in top operating condition at all times or that program will misinterpret the data, likely not in your favor. Ever drive a car with front and rear braking sensors that became inoperable when it was snowing and the roads got slushy enough to block the sensors? That happened to me the first time I drove such a car.
Steve
Autonomous vehicles are the equivalent of corporations. People can act through corporations and be free of personal accountability. Autonomous vehicles can kill people, and it’s only a civil lawsuit and money to make up for a murder.
It’s a disgusting aspect of corrupt corporations.
Martin
“automatically brake for you…”, not “break”.
There’s a pandemic of homophone confusion; I see it everywhere with increasing frequency. It needs to be prevented, probably in elementary school.
Red Wave Dave
“Your car will automatically break for you…”. Thanks but no thanks. My car automatically breaks and is in the shop for repair too often as is.
CMF
I heard the head of a state-run district Stop DWI office say in a public class that they could not wait for self-driving cars so they could go out and drink and get home safely.
THAT is the type of person running gov offices.
WuWu
You do realize that you just wrote an article not about self-cruising vehicles, but about A.I., thus completely demolishing the case for “AI”. AI is clearly not possible with any currently (publicly) known technology.
Brother John
If never having fully autonomous, self-driving cars is part of the price I have to pay for having a less authoritarian, less intrusive, less forceful government, I’m fine with that.
John
Great article, Corey. I just thought I’d weigh in on your comment about someone getting killed by a self-driving car. That actually happened during trials in west Phoenix three or so years ago. A pedestrian was crossing the road in the middle of the block, and because the computer hadn’t been programmed for an event like that, the person was hit and killed before the human driver even had time to react.
RedSheep
I would not purchase a new vehicle. The prices are ridiculous. Having been in one of them as a passenger, I am reminded of being back in parochial grade school, having my hands caned for a transgression (now you know my age group). Maybe people who are not paying attention whilst driving need one, but I can’t imagine having a car directing the driving. I will stick to my analog car, which is easy to drive, looks good, and does not annoy me with jangling sounds. It is good on gas and not difficult or expensive to repair. It will also start up and move when and if the grid goes down or some other control mechanism kicks in.
J Brown
I’m still waiting for the jet pack that I was promised decades ago, which would make all cars obsolete.