The Wall Street Journal has published an interview with Jennifer Homendy, chair of the National Transportation Safety Board (NTSB), and she had a lot to say about Tesla’s Autopilot and Full Self-Driving. Although the initial article is paywalled, other news sites such as The Verge and Engadget shared bits and pieces of that interview.
The Verge noted that Homendy said that Tesla’s use of the term “full self-driving” is “misleading and irresponsible.” She added that Tesla “has clearly misled numerous people to misuse and abuse technology.”
Engadget noted that Homendy also emphasized that Tesla needs to address “basic safety issues” before expanding Autopilot and FSD to more parts of the road.
Engadget pointed out that because this is Homendy’s personal opinion, it may not lead to policies that would limit or even ban Tesla’s technology, which is something critics are rooting for. But it does set the tone of the NTSB’s approach to Tesla at this time.
“Basic Safety Issues”
What are these basic safety issues that Tesla needs to address? From what I can gather, they boil down to the names Tesla uses for these features (already noted above) and the fact that some people misuse Autopilot. Note that Elon Musk recently stated there have been no accidents involving the Full Self-Driving (FSD) Beta features that a couple thousand owners have been using for several months.
One of the key safety issues that does plague Tesla is one the company isn’t at fault for: distracted driving. The problem long predates Tesla (it has probably existed since the first automobile hit the road) and is a major problem across society.
Tesla implements constant reminders that the driver needs to be alert and aware at all times, and makes drivers routinely apply pressure to the wheel to prove that they are. Tesla also builds a variety of “nags” into its software to snap distracted drivers out of a daze if they fall into one, with escalating degrees of alerts for when drivers need to take over.
This is what Tesla states on its website regarding Autopilot:
“Current Autopilot features require active driver supervision and do not make the vehicle autonomous.”
And this is what Tesla states regarding FSD:
“With Full Self-Driving (FSD) capability, you will get access to a suite of more advanced driver assistance features, designed to provide more active guidance and assisted driving under your active supervision.”
Although Tesla’s goal is to develop FSD to become fully autonomous, to where it doesn’t need your active supervision, it is not there yet. This means that you still need to pay attention. I am not sure how much clearer Tesla can be on this.
And the reason the suite is called Full Self-Driving (FSD) is that the car contains hardware that is expected to enable fully autonomous driving once the software is capable of it.
Homendy has also strongly endorsed a podcast discussion about “autonowashing” in recent weeks. CleanTechnica‘s Jennifer Sensiba recently wrote about this topic when writing about that specific podcast. “Autonowashing was a term coined by Liza Dixon, and it refers to the overstatement of autonomous capabilities by a manufacturer or its fans. This is a problem because people who misunderstand a system’s capabilities gain too much trust in the system, and that sometimes leads to tragedy. Nobody wants that.” That is basically Homendy’s chief complaint about Tesla, and the question is whether she is right or whether Tesla adequately warns people about the limitations of its semi-autonomous driving systems and implores drivers to keep their attention on the task at hand: driving.
Critics want Tesla to change the name “Autopilot” while looking the other way as other automakers use similar names for systems that offer drivers that same sense of relief from the burden of physically driving the car, even though those systems are not as good as Tesla’s. It’s a double standard, and we’ll see how the NTSB handles it.
I also find it odd that the NTSB doesn’t acknowledge the fact that Tesla’s Autopilot has saved more lives than it has “taken.” I put that word in quotes because most headlines tying Tesla accidents to Autopilot have proven to be misleading or completely false clickbait. The highly covered (well, mis-covered) Houston crash earlier this year is a prime example. In more recent news, Tesla’s Autopilot saved the lives of a drunk driver and the police officers who stopped his car: Autopilot brought the vehicle to a stop when police units pulled in front of it after the driver had passed out drunk. You can read more about that here.
Punishing Tesla for the actions of some of its drivers, I think, is silly. And the criticism displayed here, in my opinion, often stems from anything ranging from dislike of to strong bias against Elon Musk.