Tesla, Mode Confusion, and the Lack of ADAS Regulation

Last week I wrote about how a recent crash in Texas illustrates a serious issue with Tesla’s Autopilot feature. As a refresher, Autopilot is an advanced driver-assistance system (ADAS), meaning it can take over a number of driving tasks and help protect drivers, but a human is supposed to remain focused on driving. Failures within the Autopilot system have contributed to several fatal accidents, and Tesla drivers have at times abused the system, using it to operate their vehicles while asleep or passed out from drinking. But these incidents shine a light on bigger problems with Autopilot and ADAS in general: problems of public perception of what these systems can actually do, and of the lack of regulation governing them.

Public Perception

The first consideration regarding Autopilot and the rest of ADAS/AV development is the public’s perception of Autopilot and what the technology can actually do. While some of these incidents are no doubt caused by drivers intentionally pushing the limits of the Autopilot system, others are likely cases of “mode confusion,” a phenomenon previously seen in fighter pilots. Fighter aircraft can have multiple modes of automation, and to prevent confusion pilots are highly trained to understand the capabilities and limits of each mode available in a given aircraft. The same is not true for drivers, especially when marketing materials and manuals are unclear or make a vehicle out to be more autonomous than it really is. Indeed, a German court recently found that Tesla’s claims about Autopilot were misleading, in a suit brought by a watchdog organization (of which several German automakers are members…). The name “Autopilot” itself seems tailor-made to confuse consumers, though Tesla founder Elon Musk (and soon-to-be Saturday Night Live host?) claims the name is apt, as aircraft autopilot systems always require supervision. That may be true, but one does wonder how many current or potential Tesla owners are up on aircraft operating procedures.

The issue is compounded by Tesla’s introduction of further automation technology and software, including what it calls “Full Self-Driving” (FSD). While Tesla’s website and manuals do state that a driver must be “fully attentive,” the danger of confusion is clear (enough that the California DMV, which regulates AVs in that state, had to press Tesla to explain just what FSD does and why Tesla believed the DMV shouldn’t regulate it like other AV systems). It also doesn’t help that the systems in place to monitor drivers using Autopilot can rather easily be tricked, meaning it is very likely that some drivers are intentionally circumventing the vehicle’s safeguards.

Problems for AV and ADAS development and deployment arise if Tesla’s incidents become the public face of these systems. Tesla has a well-earned image as an innovative company, having pushed the electric vehicle market into a new era, and Autopilot is almost certainly preventing accidents when used properly, just as other ADAS systems do. But if all that cuts through to the public consciousness is high-profile abuses and deadly accidents, that could set back public trust in automation and harm the industry overall.

Lack of Regulation

There is a very important piece of the puzzle left to discuss: the government’s role in regulating Autopilot and vehicle automation overall. Much has been said in this blog about the lack of overall AV regulation, and ADAS has fallen into the same situation: the technology is out on the road while safety regulations remain in the draft folder. As I mentioned last week, the National Transportation Safety Board (NTSB), in its 2019 investigation of an Autopilot-related crash, took the government’s primary vehicle safety regulator, the National Highway Traffic Safety Administration (NHTSA), to task for failing to produce clear ADAS safety rules. During the Trump administration, NHTSA remained extremely hands-off about new rules or regulations for ADAS. In 2016, Obama-era NHTSA regulators had indicated that “predictable abuse” could be considered a potential defect in an automated system, which could have flagged Autopilot misuse, but that guidance was never followed up on. It remains to be seen whether the Biden administration will pivot back toward regulating automation, but given that ADAS accidents continue to occur and garner a great deal of attention, it may be unable to ignore the issue for long. For now, state-level regulators can attempt to fill the gap, as seen in the California DMV emails linked above, but their power is limited compared to that of the federal government.

It’s hard to tell how all of this will end. Tesla seems uninterested in pulling back and testing its automation systems in more controlled environments (as other automakers do), and instead continues to push updates and new technology out to the public. Perhaps that level of bravado is to be expected from a company that made its name by challenging the existing paradigm, but it doesn’t excuse the fact that the federal government has yet to step in and lay down clear rules for ADAS. For the sake of the promising safety benefits of ADAS and automation overall, those rules are needed soon, not only to protect the public but to ensure confidence in these emerging technologies, something that would benefit Tesla and other automakers alike.

P.S. – I’ll leave you with this: a perfect illustration of the absurdity of how vehicle features are named.
