The Uber self-driving car crash that killed a pedestrian in March 2018 was the fault of the vehicle’s operator, who wasn’t paying attention at the time and was likely looking at her cell phone, the National Transportation Safety Board has determined. But the safety watchdog didn’t end the blame game there. At a board meeting Tuesday afternoon in Washington, DC, it said that a slew of terrible decisions—by Uber, the state of Arizona, and the federal government—contributed to the death of Elaine Herzberg, the 49-year-old woman who was fatally struck.
The safety board, which has no regulatory power, also issued a series of recommendations its members believe will help avoid a repeat crash. The six prompts—to the National Highway Traffic Safety Administration, which is charged with overseeing vehicle safety in the US; to Arizona, which has very few rules governing automated vehicle testing; to the organization that oversees local law enforcement and motor vehicle departments; and to Uber itself—show that the safety board is pushing for regulation of self-driving vehicles on public roads, including codifying today’s federal guidelines into proper, enforceable rules.
But not too much regulation, which self-driving vehicle developers say might stall progress and prevent the rollout of what they call a life-saving technology. “We haven’t really put the meat cleaver to this and tried to stifle innovation,” NTSB Chair Robert Sumwalt told reporters after the board meeting Tuesday. “We’re just trying to put some bounds on the testing on the roadways.”
First, the safety panel thinks NHTSA should do more to gauge how self-driving developers are running their test operations on public roads. NHTSA’s guidelines for testing robocars are a set of principles rather than a blueprint for safety, and while the agency invites companies to submit self-assessment safety reports, it doesn’t evaluate those. As a result, the 16 voluntary safety assessment letters that AV companies have submitted “are kind of all over the place,” NTSB investigator Ensar Becic said at the meeting. “Some have a good amount of detail, while others quite frankly read like marketing brochures.” (Sixty-two companies are registered to test their robot vehicles in California.) Jennifer Homendy, one of three NTSB board members, called the setup “laughable.”
The board voted unanimously to recommend that NHTSA make those reports mandatory, and create a process for actually assessing them. In a statement, NHTSA said it’s still working on its own investigation into the crash, and that it will “carefully review” the NTSB’s report and recommendations.
The NTSB also recommended that Arizona demand that self-driving developers submit an application detailing how they’ll manage the risks that come with testing on public roads, before granting them permission to release their robots. The state banned Uber’s program after the Tempe, Arizona, crash, but hasn’t adjusted its very light regulatory regime, which asks simply that test vehicles have standard registrations. In a statement, Patrick Ptak, a spokesperson for Governor Doug Ducey, said “Arizona appreciates the work done by the NTSB, and we are reviewing the case docket and recommendations carefully.” He also noted that the Grand Canyon State is the only one to tell Uber to scram. The state’s Department of Transportation is also reviewing the recommendations, a spokesperson says.
Those recommendations stem from the NTSB’s determination that Uber’s self-driving program had major shortcomings. On the tech side, Uber hadn’t programmed the SUV that killed Elaine Herzberg to look for pedestrians outside of crosswalks. It had limited its cars’ ability to slam on the brakes, out of fear they’d cause problems by stopping for no good reason. It relied on the vehicle operator sitting behind the wheel to keep everybody safe but, the NTSB found, didn’t adequately address the well-established risk of “automation complacency”—humans’ tendency to lose focus when asked to monitor boring things like a car circling the suburbs. Uber had driver-facing cameras, but rarely reviewed their footage. And a few months before the crash, it started putting just one human in each car, whereas previously, like most self-driving developers, it had used two. Moreover, Uber didn’t have an operational safety division or safety manager.
In the 20 months since the deadly crash, Uber has scaled back and revamped its testing program. It reverted to two operators per car, hired a third party to check on their attentiveness in real time, created a safety division, and taught its cars how pedestrians actually move through the world. And, unlike Elon Musk’s Tesla, Uber fully cooperated with the investigation, Sumwalt said.
The NTSB also noted that Herzberg had methamphetamine in her system, but couldn’t say how that would have affected her behavior. And while Homendy brought up the fact that Herzberg followed a paved path to the point where she crossed the road, the NTSB’s findings and recommendations didn’t address street design. It settled for asking the folks who run the streets and the cars that use them to impose just a bit more safety sense.