WASHINGTON -- Uber had disabled an emergency braking system in a self-driving vehicle that struck and killed a woman in Arizona in March after its software failed to properly identify the pedestrian, the National Transportation Safety Board said in a preliminary report released on Thursday.
The report said the modified 2017 Volvo XC90's radar systems observed the pedestrian six seconds before impact but "the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle."
At 1.3 seconds before impact, the self-driving system determined that emergency braking was needed. But Uber said emergency braking maneuvers had been disabled while the vehicle was under computer control, in order to reduce the potential for erratic vehicle behavior. The Volvo XC90 is typically equipped with an automatic emergency braking system designed to prevent frontal crashes.
Uber Technologies Inc., which voluntarily suspended testing in the aftermath of the crash in the city of Tempe -- the first death involving a fully self-driving vehicle -- said on Wednesday it would shut down its Arizona self-driving testing program and focus on limited testing in Pittsburgh and two cities in California.
Arizona's governor in March had suspended Uber's permit for the testing, citing safety concerns.
The company did not directly comment on the NTSB findings but noted it recently named a former NTSB chairman, Christopher Hart, to advise on Uber's safety culture.
"As their investigation continues, we’ve initiated our own safety review of our self-driving vehicles program," the company said on Thursday, adding that it planned to announce changes in the coming weeks.
All aspects of the self-driving system were operating normally at the time of the crash, and there were no faults or diagnostic messages, the NTSB said.
The report gives new fuel to opponents in Congress who have stalled a bill designed to speed the deployment of self-driving cars on U.S. roads. It also puts a spotlight on the fact that the National Highway Traffic Safety Administration, which is also investigating, does not test self-driving vehicles or certify them before they are deployed on U.S. roads.
William Wallace, senior policy analyst for Consumers Union, the advocacy division of Consumer Reports, called Uber "reckless" and said the NTSB report "makes it clear that a self-driving car was tested on public roads when it wasn't safe enough to be there, and it killed a pedestrian." He added that the system "was far too dangerous to be tested off a closed track."
The report noted that the victim, Elaine Herzberg, tested positive for methamphetamine and marijuana, and that she did not look in the direction of the vehicle until just before impact.
The NTSB did not say when it would release its final report on the accident. The agency typically issues its final conclusions at least a year after an accident.
It is also investigating a series of crashes involving Tesla Inc.'s semi-autonomous "Autopilot" system, having faulted that system last year following a fatal crash in Florida.