Motor Mouth: What will it take for us to regulate Tesla's autonomous car tech?
Even 'Consumer Reports' has demonstrated a Tesla can be made to drive itself with no one behind the wheel
Unless you’ve been hiding under a rock these last five days, you know there was yet another fatal crash involving a Tesla and the company’s (in)famous Autopilot system. What made this particular incident even more egregious than previously reported Tesla self-driving fatalities is that officials seem certain “no one was driving the vehicle at the time of the crash.”
If the initial police reports are to be believed, two older Texas men — sobriety not determined, but it was late at night on a Saturday — jumped into a 2019 Model S, ostensibly so the owner could show off how cool Autopilot was. What made this particular tragedy even more tragic is that, according to the police — and despite Elon Musk’s claim that Autopilot wasn’t engaged at the time — one man sat in the right front passenger seat and the other in the rear before the crash.
Yes, you’re reading that right: there was no human behind the steering wheel. So, it’s little surprise that, according to the accident report, said Tesla missed a curve and left the road at a “high rate of speed,” struck a tree, and burst into flames — a fire that took 32,000 gallons of water to extinguish! — killing both men onboard. What is different is that, for probably the first time in automotive history, one can legitimately say everyone in the car was killed without being able to distinguish driver from passenger(s).
Before we start analyzing the ramifications this tragedy might have for Tesla — the company is in full denial — for autonomous driving in general (already suffering from growing consumer antipathy) and, most importantly, for the government agencies and regulators supposedly monitoring such technologies, let’s just for the briefest of moments consider what happened here.
I realize that, in these days of hyper-progressive politics, it’s au courant to treat the concept of personal responsibility like some quaint, soon-to-be-dismissed anachronism from the days of Robert E. Lee. Nonetheless, the fact remains that two supposedly sentient human beings got into a high-powered vehicle — because, along with boasting about its self-driving technology, Tesla makes much of its products’ performance — to “test” a self-driving technology without someone behind the wheel. Hopefully, we can all at least agree this wasn’t the smartest thing to do.
Certainly, to posit they weren’t aware of the danger would strain credulity. To claim, for instance, they did not understand that Autopilot is not accredited with full Level 5 autonomy — being able to drive anywhere, in any conditions, without any need for human intervention — is almost statistically impossible. With reports of Tesla’s self-driving issues making front-page news, and with the company’s own admonition that Autopilot requires human supervision, claiming “I didn’t know” a human had to be behind the wheel is simply not credible. It’s akin to someone claiming they didn’t know that cigarettes cause cancer.
That said, this sort of thing is happening far too often with Teslas. Officially, as Tesla is wont to point out, Autopilot is to be continually supervised by a human being, preferably — I would say obviously, but I can already picture the inane safety warning that will result from this tragedy — one behind the wheel. Yet, as Wired points out, “There’s a small cottage industry of videos on platforms like YouTube and TikTok where people try to ‘fool’ Autopilot into driving without an attentive driver in the front seat.”
Some have tried lodging various fruits in the spokes of the steering wheel (they weren’t heavy enough). Others have had more success with exercise hand weights. And as Wired notes, still more have demonstrated that a Tesla can be made to run on Autopilot without anyone in the driver’s seat simply by buckling the driver’s side seatbelt. Even Consumer Reports got in on the act, demonstrating how its engineers got a Model Y to drive — thank God, they were using their closed-course test track — without anyone behind the wheel, using a variation of the hand-weight trick.
As to why they do it, it’s simply the result of years of being told that Teslas already have the ability to self-drive. Indeed, Musk has a history of making grandiose claims for Autopilot, most recently telling podcaster Joe Rogan that he thinks “Autopilot’s getting good enough that you won’t need to drive most of the time unless you really want to.”
Yet, as Russ Mitchell’s recent exposé in the Los Angeles Times, “Tesla touts self-driving to consumers. To the DMV, it tells a different tale,” notes, regulators don’t get quite the same pitch. “For years, Tesla Chief Executive Elon Musk has been telling the public that fully autonomous Teslas are just around the corner, no more than a year or two off,” all the while “telling regulators a very different story.”
In fact, according to Mitchell, “in official correspondence with California’s Department of Motor Vehicles, Tesla lawyers recently admitted the $10,000 option that Tesla sells as ‘Full Self-Driving Capability’ is not, in fact, capable of full self-driving.” Indeed, Tesla’s legal beagles officially told the California DMV they did not expect any “significant advancements” that would allow full self-driving, and that Autopilot would “continue to be an SAE Level 2 advanced driver-assist feature.” For those not having a Society of Automotive Engineers’ handbook handy, this means Autopilot cannot reliably function on any road under any conditions without human supervision.
Still, this tragedy brings a very important ethical question to the fore. No, I’m not talking about the famed “trolley problem,” in which the robot-driven car has to choose between killing two elderly grandparents and a fresh-as-a-daisy infant. Rather, we really do need to ask ourselves what an acceptable fatality rate for robot-driven cars might be.
So far, the public has been sold a bill of goods, with futurists and automakers alike — Volvo, for instance — implying that fully autonomous automobiles could completely eliminate automobile fatalities. Now, let’s be clear: no matter how powerful the supercomputer or how accurate the seeing-eye sensor, such claims are fantasies that do nothing other than put unrealistic expectations into the public domain. As Musk — and countless other autonomous-automobile advocates — so rightly notes, there’s a smell of hypocrisy in the air when two people dying in a Model S garners top-of-the-fold headlines while the almost 100 Americans who die in motor vehicle crashes every day do so in total anonymity. Automotive autonomy has been over-hyped so assiduously that consumers are convinced not only that it is just around the corner, but that it will be infallible. Little wonder, then, that any intrusion into that fantasy sends us reeling.
And finally, where are the regulators in all of this mess? Self-driving is perhaps — no, make that definitely — the most ground-breaking technology in the history of privately owned personal mobility. It is almost certainly the one with the most potential for calamity. And yet virtually every branch of government is sitting on its hands rather than making the hard decisions required to regulate autonomous driving. And make no mistake, autonomous automobiles need more regulation. I know I started this discussion with a rant about personal responsibility, but even an avowed Libertarian — which certainly describes Yours Truly — understands this is not a technology that can be self-policed by private industry. Lord knows Tesla’s doublespeak with regard to its Autopilot system puts paid to that notion.
If these tragedies are to be eliminated — and, as I said, we as a society need to discuss what an acceptable fatality rate is for self-driving cars — regulators need to take action. Whether said regulations involve mandating specific technologies — Musk insists Autopilot does not require lidar when pretty much the rest of the automotive world agrees it does — or some form of centralized testing before any advanced driver aid is allowed for sale, I do not know. But I do know that marrying America’s wild-wild-west enforcement of safety regulations with the reality of self-driving automobiles is a recipe for disaster.
Just as importantly, any existing regulations need to be enforced. In its recent “The future of work,” The Economist posited that American companies that pretend their workers are self-employed when they are more like employees “have not found loopholes in existing employment law, as is often believed, but, instead, they act with impunity mainly because enforcement is weak and punishment is feeble.” As Mitchell so diligently points out, California’s regulations already “bar companies from advertising the sale or lease of a vehicle as autonomous” if that advertising “will likely induce a prudent person to believe a vehicle is autonomous.”
An option called Full Self-Driving Capability, offered on select Teslas, would certainly seem to fit that description. Why are we still sitting on our hands?