The National Transportation Safety Board on Tuesday will convene its second hearing on a fatal crash involving Tesla Inc.’s automated driver-assist technology even though the automaker hasn’t filed formal responses to recommendations stemming from the first one more than two years ago.
The NTSB in 2017 recommended that automakers including Tesla make their driver-assist systems more resilient to misuse by inattentive drivers and limit the systems’ operation to the driving conditions for which they were designed.
Automakers -- including Volkswagen Group, Nissan Motor Co. and BMW -- have told the NTSB how their systems ensure driver engagement, responses the agency deemed acceptable. Tesla has had no formal correspondence with the NTSB officials responsible for monitoring how safety recommendations are implemented, NTSB spokesman Chris O’Neil said.
“It’s not the norm,” O’Neil said. “Most recommendation recipients respond in the prescribed 90-day window.”
Tesla didn’t respond to a request for comment but has said it updated Autopilot in part to issue more frequent warnings to inattentive drivers.
The role of Tesla’s automated driver-assist features known as Autopilot, along with other factors including driver distraction and highway infrastructure, will be examined at an NTSB meeting on Tuesday. The agency is probing the March 2018 crash in Mountain View, Calif., that killed 38-year-old Apple Inc. engineer Walter Huang after his Tesla Model X slammed into a highway barrier while using Autopilot.
The investigation was marked by an unusually public display of tensions between the agency and Tesla CEO Elon Musk that peaked when the agency kicked Tesla off the probe after he released information about the crash despite prohibitions against such disclosures.
The hearing could hold lessons for the auto industry as automated driving features are becoming increasingly common on new vehicles. Several other automakers have also equipped their vehicles with technologies that can provide automated steering, acceleration and braking, and some have installed systems to ensure drivers pay attention. General Motors and Subaru Corp. use infrared cameras to track head and eye movement, and Nissan last year said it would include a similar driver monitor in a system designed to offer hands-free driving on the highway.
Tesla has said Autopilot makes drivers safer, pointing to internal data it releases quarterly that it says shows drivers crash less frequently while using the system than while driving manually. The company stresses that drivers must remain attentive with their hands on the wheel while using Autopilot, which gauges driver engagement by sensing steering-wheel inputs.
The company has said it has adjusted the warnings drivers receive if their hands are off the wheel for too long, an approach federal investigators have faulted as easy to sidestep.
In 2017, the NTSB closed its first probe of a fatal crash linked to Autopilot by calling on companies to develop ways to better ensure drivers pay attention while using automated driving features that require human supervision.
The recommendations stemmed from the agency’s probe of a 2016 crash in which former Navy SEAL Joshua Brown died after his Tesla Model S, operating on Autopilot, crashed into a commercial truck crossing the road in front of him on a Florida highway. The agency cited Brown’s over-reliance on the car’s automation and a lack of built-in safeguards to prevent inattention as key factors that contributed to the crash.
Last fall, the NTSB again cited inattention and Autopilot’s design in a January 2018 crash in which a Tesla driver rear-ended a parked fire truck on a freeway near Los Angeles. The agency said Autopilot’s design allowed the driver, who was uninjured in the crash, to stop paying attention to the road.
After that crash, Tesla said it had updated Autopilot in part to issue more frequent warnings to inattentive drivers. The company has also been in regular contact with NTSB investigators and has provided information about its systems to the agency, O’Neil said.
“That doesn’t replace the need for formal responses to safety recommendations,” he said. “It’s a process designed to help us understand what they’re doing to implement those safety recommendations and what their progress toward them is, which may inform whether we feel other recommendations are necessary.”
Records from the Mountain View investigation hint at several factors the NTSB could highlight during the meeting Tuesday. With Autopilot engaged and set to cruise at 75 mph, Huang’s 2017 Tesla Model X sped up and slammed into a concrete barrier. Vehicle data showed neither the driver nor the vehicle’s automatic systems applied the brakes prior to impact, the NTSB has said.
Huang had complained that Autopilot had repeatedly veered his vehicle toward the same spot during earlier trips on that same stretch of highway, according to the agency. Data taken from his Tesla’s computer confirmed that the situation had occurred at the same location four days before the fatal crash and once more several weeks earlier, records released by the NTSB show.
The tip of the concrete lane divider struck by Huang’s Tesla was supposed to have been protected by a crash attenuator, a device attached to highway infrastructure to absorb impact forces like a car’s crumple zone. The attenuator had been damaged 11 days earlier and hadn’t been repaired by the California Department of Transportation before Huang’s crash.
The NTSB found that Huang was playing a game on his Apple-provided mobile device before the collision, citing data transmission records. The data couldn’t show how engaged he was with the game, however, or whether he was holding the device with both hands at the time of the crash, the agency said.
Crash investigators at the National Highway Traffic Safety Administration have opened 14 inquiries into Tesla crashes believed to involve Autopilot, plus 11 more involving other manufacturers’ partial-automation systems.