The fatal crash involving Tesla Inc.'s Autopilot driver-assist technology highlights the need for more driver education about such systems.
When a Tesla Model X owner in Mountain View, Calif., died after his vehicle struck a highway median while Autopilot was engaged, the automaker backed its system in a blog post, blaming the crash on driver error and problems with a road barrier. The fatality occurred in the wake of a Tempe, Ariz., crash in which an Uber autonomous test vehicle killed a pedestrian, prompting experts to call for more intensive education for drivers.
"It's a matter of drivers having to learn these systems and how to use them," said Jeff Soble, a partner at the Foley & Lardner law firm specializing in product liability. "I think there's a perception I can get into a Model S and push a button and not pay attention, but that's just not how they operate."
The fatal Tesla crash happened March 23. Four days later, the National Transportation Safety Board said it was investigating the incident. Three days after that, Tesla published the blog post, asserting that Autopilot had been engaged at the time of the crash, a detail the NTSB had not released. Tesla said the system remains safer than standard human driving.
"No one knows about the accidents that didn't happen, only the ones that did," the blog read. "The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe."
The blog post restated a claim from a NHTSA report on a previous crash that the first version of Autopilot could reduce crash rates by up to 40 percent.
Bryant Walker Smith, a University of South Carolina law professor who specializes in self-driving vehicles, said that although driver-assist features can increase vehicle safety, automakers should make clear what their systems can and cannot do and be transparent about how safety calculations are made.
"Tesla needs to substantiate their claims," Walker Smith said. "Anyone who looks at statistics wants to see numbers and nuance, and an acknowledgement of that."
Such specifics include which systems Autopilot is being compared against, whether the setting is urban or rural and which version of Autopilot is being studied. Another question, he added, is, "Can it be even safer?"
Making such information public can help consumers understand how their vehicles function, as well as allow for more specific media reporting on the technology, Walker Smith said. Companies also need to define who is responsible for safely operating a vehicle.
"When everyone is responsible, no one is responsible," he said. "Companies are saying their system doesn't have to be perfect because there's a human monitoring it. Drivers are thinking they don't have to be alert because the system is there."