Divergence in Capabilities, Distracted Driving, and Derelict Governance

The California DMV recently released several 2019 reports from companies piloting self-driving vehicles in California. Under state law, all companies actively testing autonomous vehicles on California public roads must disclose the number of miles driven and how often human drivers were required to retake control from the autonomous vehicle. Retaking control is known as “disengagement.” The DMV defines disengagements as:

“[D]eactivation of the autonomous mode when a failure of the autonomous technology is detected or when the safe operation of the vehicle requires that the autonomous vehicle test driver disengage the autonomous mode and take immediate manual control of the vehicle.”

Because of the proprietary nature of autonomous vehicle testing, data is rarely made public; these reports are one of the few places where progress data is available. The 60 companies actively testing in California cumulatively traveled 2.88 million miles in 2019. The table below reports figures for some of the major testers in the state.

| Company | Vehicles Active in CA | Miles Driven in 2019 | Disengagements | Disengagements per 1,000 Miles | Average Miles Between Disengagements |
|---|---|---|---|---|---|
| Waymo | 153 | 1.45 million | 110 | 0.076 | 13,219 |
| GM Cruise | 233 | 831,040 | 68 | 0.082 | 12,221 |
| Apple | 66 | 7,544 | 64 | 8.48 | 118 |
| Lyft | 20 | 42,930 | 1,667 | 38.83 | 26 |
| Aurora | ? | 13,429 | 142 | 10.57 | 95 |
| Nuro | 33 | 68,762 | 34 | 0.494 | 2,024 |
| Pony.ai | 22 | 174,845 | 27 | 0.154 | 6,493 |
| Baidu | 4 | 108,300 | 6 | 0.055 | 18,181 |
| Tesla | 0 | 0 | 0 | 0 | 0 |

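For readers who want to check the derived columns, the last two figures are simple arithmetic on the two numbers companies actually report to the DMV: autonomous miles driven and the disengagement count. Below is a minimal Python sketch; the helper names and the rounded mileage totals are my own, and small discrepancies with the published averages come from that rounding.

```python
# Minimal sketch of how the table's derived columns are computed from the two
# reported figures: autonomous miles driven and number of disengagements.
# Figures are taken from the table above; helper names are my own.

def disengagements_per_1000_miles(miles: float, disengagements: int) -> float:
    """Disengagement rate normalized per 1,000 autonomous miles."""
    return disengagements / miles * 1000


def miles_between_disengagements(miles: float, disengagements: int) -> float:
    """Average autonomous miles driven between successive disengagements."""
    return miles / disengagements


reports_2019 = {
    # company: (miles driven, disengagements)
    "Waymo": (1_450_000, 110),
    "GM Cruise": (831_040, 68),
    "Apple": (7_544, 64),
    "Lyft": (42_930, 1_667),
}

for company, (miles, disengagements) in reports_2019.items():
    rate = disengagements_per_1000_miles(miles, disengagements)
    gap = miles_between_disengagements(miles, disengagements)
    print(f"{company}: {rate:.3f} per 1,000 miles, ~{gap:,.0f} miles between disengagements")
```
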
What these numbers make clear is that several contenders have made significant progress in the autonomous vehicle space, while others are not yet so competitive. Companies like Waymo, GM Cruise, and Baidu (which also tests extensively in China) have made incredible progress in reducing how often a safety driver must take control of the automated vehicle. Others, like Apple, Lyft, and Aurora, while making progress, are nowhere near as sophisticated at avoiding disengagements. Notably, Tesla, the manufacturer frequently in the news for its “Autopilot” feature, does not test on public roads in California. The company says it conducts tests via simulation, on private test tracks, on public roads around the world, and by “shadow-testing,” collecting anonymized data from its customers during normal driving operations.

What these numbers seem to illustrate is that the autonomous vehicle industry is not all on par, as many often believe. It is often said that Henry Ford did not conceive the idea of the automobile; he perfected it. Similarly, companies like Waymo or GM may be the first to perfect autonomous vehicles and gain an incredible market advantage once they do. They are striving to be the Fords of this space, while others look like they are still manufacturing carriages. Yet despite these impressive numbers from a select few, the companies themselves think these metrics “do[] not provide relevant insights” (per Waymo) and that the idea that they give any “meaningful insight . . . is a myth” (per GM Cruise).

Why are the head-and-shoulders leaders on these metrics saying that they provide very little indication of progress on the technology? Disengagement reports may not be the best way for these companies to build trust and credibility in their products. They are only nominally transparent: they provide some data, but with little detail or context.

I was having a conversation about these disengagement numbers with a colleague* this week, and the topic of driver distraction came up. In the California tests, the driver is constantly alert. Once these vehicles are in the hands of the general public, a notification to take over may not be effective if the driver is distracted. One reason these numbers do not provide particularly useful information is that, for the metrics to be meaningful, at least two things must be true (the sketch after the list makes the gap concrete):

  (1) If the vehicle does not indicate it needs to disengage, no technical error has been made; and
  (2) The driver is paying attention and can quickly take over when necessary.

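Here is a toy illustration of why the metric says little when either condition fails. It is entirely my own construction (the event model and function names are not from the DMV or NTSB): an unsafe event only surfaces in a disengagement count if the system alerts the driver or an attentive driver notices the error and takes over.

```python
# Toy model (my own, not the DMV methodology): an unsafe event is captured in
# a disengagement count only when the system alerts the driver or an attentive
# driver notices the error and takes over.

from dataclasses import dataclass


@dataclass
class Event:
    vehicle_errs: bool      # the autonomous system does something unsafe
    system_alerts: bool     # the system detects the problem and requests takeover
    driver_attentive: bool  # a human is watching and ready to intervene


def counted_as_disengagement(e: Event) -> bool:
    """True if the event would show up in a disengagement report."""
    return e.system_alerts or (e.vehicle_errs and e.driver_attentive)


def unsafe_but_uncounted(e: Event) -> bool:
    """The dangerous gap: an error that never appears in the metric."""
    return e.vehicle_errs and not counted_as_disengagement(e)


# A silent failure plus a distracted driver -- the pattern in the Tesla crash
# discussed below -- produces an unsafe event the metric would never record.
crash_like = Event(vehicle_errs=True, system_alerts=False, driver_attentive=False)
print(counted_as_disengagement(crash_like))  # False
print(unsafe_but_uncounted(crash_like))      # True
```
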
In California testing, the drivers behind the wheel are always alert and ready to take over. They may take over when the vehicle indicates they must, because of a malfunction or poor conditions. The driver can also take over when the vehicle has done something incorrectly yet has not indicated that intervention is needed, such as drifting out of its lane or failing to recognize a pedestrian.

One of the allures of autonomous vehicles is that a driver may not need to be 100 percent engaged for the vehicle to function correctly. However, current technology has not yet achieved this result, as the National Transportation Safety Board (NTSB) reiterated this past week. The NTSB is an independent federal agency; it lacks enforcement power, but its recommendations are considered thorough and are taken seriously by policymakers.

The NTSB put forward many findings on Tuesday, February 25th regarding a Tesla crash that killed a California driver in March 2018. (A synopsis of the NTSB report and findings can be found here.) The crash involved the driver of a Tesla in Autopilot mode that struck a barrier between the highway and a left exit lane. The NTSB found that the Tesla briefly lost sight of the lines marking the highway lane and, because those lines had faded, began to follow the right-most lane marker of the exit lane, causing the vehicle to enter the “gore area.” The same behavior had apparently occurred several times in this exact vehicle, but on previous trips the driver was paying attention and was able to correct the vehicle. This time, the driver was playing a mobile game and did not correct the vehicle, and the crash followed. Here is how the NTSB presented three of its findings:

The Tesla’s Autopilot lane-keeping assist system steered the sport utility vehicle to the left into the neutral area of the gore, without providing an alert to the driver, due to limitations of the Tesla Autopilot vision system’s processing software to accurately maintain the appropriate lane of travel. (emphasis added)

The driver did not take corrective action when the Tesla’s Autopilot lane-keeping assist system steered the vehicle into the gore area, nor did he take evasive action to avoid the collision with the crash attenuator, most likely due to distraction by a cell phone game application. (emphasis added)

The Tesla Autopilot system did not provide an effective means of monitoring the driver’s level of engagement with the driving task.

Here we see a failure of both (1) and (2) above, compounded by an inability to adequately monitor the driver’s engagement. The vehicle took an action it assumed to be correct, and thus did not notify the driver to take over. Combined with a driver who was not paying attention and failed to notice the need to intervene, the result was the crash. This tragic accident highlights that the AV industry still has many areas to improve before higher SAE level vehicles are ready for mass adoption. (The ADAS on the Tesla was SAE Level 2.)

As I discussed last week, the federal Department of Transportation has taken a rather hands-off approach to regulating automated vehicles, preferring to issue guidance rather than mandatory regulations. The NTSB criticized this approach in its Tesla crash findings, writing that there has been “Insufficient Federal Oversight of Partial Driving Automation Systems”:

The US Department of Transportation and the National Highway Traffic Safety Administration (NHTSA) have taken a nonregulatory approach to automated vehicle safety. NHTSA plans to address the safety of partial driving automation systems through enforcement and a surveillance program that identifies safety-related defect trends in design or performance. This strategy must address the risk of foreseeable misuse of automation and include a forward-looking risk analysis.

Because the NTSB lacks enforcement power, it cannot compel industry actors or other government agencies to take any action. It can only perform investigations and make recommendations. NTSB Chairman Robert Sumwalt had much to say regarding distracted driving, the AV industry, and the lack of government regulations in the hearing on Tuesday, February 25th.

“In this crash we saw an over-reliance on technology, we saw distraction, we saw a lack of policy prohibiting cell phone use while driving, and we saw infrastructure failures, which, when combined, led to this tragic loss.”

“Industry keeps implementing technology in such a way that people can get injured or killed . . . [I]f you own a car with partial automation, you do not own a self-driving car. Don’t pretend that you do.”

“This kind of points out two things to me. These semi-autonomous vehicles can lead drivers to be complacent, highly complacent, about their systems. And it also points out that smartphones manipulating them can be so addictive that people aren’t going to put them down.”

Chairman Sumwalt is right to be frustrated. The DOT and NHTSA have not regulated the AV industry or ADAS as they should. Tragic accidents like this one can be avoided through a variety of measures: better means of monitoring driver engagement than torque-sensing steering wheels, lock-out functions for cell phones while driving, and stricter regulation of the advertising and warnings used by companies offering ADAS. Progress is being made in the AV industry, and automated vehicles are getting smarter and safer every day. But incidents like this, which combine failures of technology, regulation, and consumer use, do not instill public confidence in an incredible technology that will benefit society. They only highlight how much farther we still have to go.

*I would like to thank Fiona Mulroe for the inspiration to take this approach to the disengagement report
