Go Fast and Break People
On November 19, the NTSB held a public board meeting on the 2018 Uber accident in Tempe, Arizona, involving an “automated” (actually level 3) Uber-operated Volvo SUV. A pedestrian, Elaine Herzberg, died in the accident. In the wake of the Board’s report, now is a good time to come back to level 3 cars and the question of “safety drivers.”
Given that the purpose of the meeting was to put the blame on someone, media outlets were quick to pick a culprit for their headlines: the “safety driver” who kept looking at her phone? The sensors that detected all kinds of things but never a person? Uber, which deactivated the OEM’s emergency braking? Or maybe Uber’s “safety culture”? A whole industry’s?
The Board actually blames all of them, steering clear of singling out one event or actor. That is probably the safest and most reasonable course for a regulator, and it has real implications for how law enforcement will handle accidents involving AVs in the future. But because we are human, we tend to latch onto the human part of the story: that of the safety driver.
She was allegedly looking at her phone, “watching TV” as one article put it, following the latest episode of The Voice. The Board determined that she looked at the road one second before the impact. That is short but, under more normal circumstances, enough to slam on the brakes. Maybe her foot was far from the pedal; maybe she simply did not react because she was not in an “aware” state of mind (“automation complacency,” the report calls it). In any case, it was her job to watch the road, and she was violating Uber’s policy by using her phone while working as a safety driver.
At the time of the accident, the Tempe police released dash-cam footage of the few seconds leading up to the impact, showing a poorly lit street. The relevance of this footage was then disputed in an Ars Technica article, which aimed to demonstrate that the street is in fact well lit and that the car’s headlights alone should have made the victim visible in time. Yet I think it is too easy to put the blame on the safety driver. She was not doing her job, but what kind of job was it? Humans drive reasonably well, but that is when we are actually driving, not sitting in the driver’s seat with nothing to do but wait for something to jump out from the roadside. Even if she had been paying attention, some injury was reasonably foreseeable. And even if she had been driving in broad daylight, there remains a more fundamental problem than safety-driver distraction.
“The [NTSB] also found that Uber’s autonomous vehicles were not properly programmed to react to pedestrians crossing the street outside of designated crosswalks,” one article writes. I find that conclusion somewhat more appalling than a safety driver being distracted. Call that human bias; still, I do not expect machines to be perfect. But it tells us that stricter monitoring of safety drivers’ cellphone use will not cut it either, if the sensors keep failing. The system needs to be able to handle this kind of situation. A car whose sensors cannot recognize a slowly crossing pedestrian (anywhere, even in the middle of a highway) has no place on a 45-mph road, period.
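To make the finding concrete, here is a minimal, purely hypothetical sketch (in Python; every name, label, and number is my own assumption, not Uber’s actual code) of how a perception pipeline can fail in exactly this way: it only predicts a crossing for objects classified as pedestrians near a crosswalk, and it discards an object’s motion history whenever its classification flips.

```python
# Hypothetical illustration of the failure mode described in the reporting:
# no crossing prediction for jaywalkers, and motion history thrown away
# on every reclassification. All names and values are illustrative only.
from dataclasses import dataclass, field

@dataclass
class TrackedObject:
    position_m: float           # lateral distance from the car's path, meters
    classification: str         # e.g. "vehicle", "bicycle", "pedestrian", "other"
    history: list = field(default_factory=list)  # past observed positions

def predict_crossing(obj: TrackedObject, near_crosswalk: bool) -> bool:
    """Naive logic: only pedestrians near a crosswalk get a crossing prediction."""
    if obj.classification == "pedestrian" and near_crosswalk:
        return True
    # With no crossing hypothesis for jaywalkers, the object is assumed to
    # stay out of the travel lane -- even if its history shows it approaching.
    return False

# A pedestrian crosses mid-block while the classifier flip-flops on the label.
observations = [
    (6.0, "other"), (5.0, "vehicle"), (4.0, "other"),
    (3.0, "bicycle"), (2.0, "pedestrian"), (1.0, "pedestrian"),
]

obj = TrackedObject(position_m=6.0, classification="other")
for pos, label in observations:
    if label != obj.classification:
        obj.history.clear()     # reclassification discards the motion history
        obj.classification = label
    obj.history.append(pos)
    obj.position_m = pos
    will_cross = predict_crossing(obj, near_crosswalk=False)
    print(f"pos={pos:.1f} m, class={label:>10}, predicted crossing: {will_cross}")
# Every step prints "predicted crossing: False": the pipeline never
# anticipates the mid-block crossing until the pedestrian is in the lane.
```

Again, this is a toy model, not a claim about how Uber’s software was written; it is only meant to show why “detected all kinds of things but never a person” and “not programmed for jaywalkers” combine into a car that does not brake.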
If there is one thing this accident has shown, it is that “safety drivers” add little to the safety of AVs. It is a coin flip: sometimes the driver’s reactivity and skill make up for a sensor failure; other times, a distracted, “complacent” driver (for whatever reason, phone or otherwise) does not. It is safe to say that the overall effect on safety is at best neutral. Worse, it may give the operator a false sense of safety, as it apparently did here. This, in turn, prompts us to rethink level 3 altogether.
While Uber has stated that it has “significantly improved its safety culture” since the accident, the question of the overall safety of these level 3 cars remains. And beyond anything Uber can do, one may wonder whether such accidents are not bound to repeat themselves should level 3 cars see mass commercial deployment. Humans are not reliable “safety drivers.” And in a scenario that relies on such drivers, it takes far less than the deadly laundry list of failures present here for such an accident to happen. Being complacent may also mean that your foot is not close to the pedals, or that your hands are not “hovering above the steering wheel” as they (apparently) should be. The extra half second it takes to slam on the brakes or grip the wheel is enough to turn serious injury into death.
The paramount error here was to integrate a human, a person Uber should have known would be distracted or less responsive than an average driver, as the final safeguard against sensor failure. Not long ago, many industry players were concerned about standardizing too early. Now that some companies are out there going fast and literally breaking people (not even things, mind you!), the time has come to seriously discuss safety and testing standards, at the US federal level and, why not, internationally.
A University of Michigan Law School Problem Solving Initiative class on AV standardization will take place during the Winter semester of 2020, with deliverables in April. Stay tuned!