"In my mind, the thing that matters is preventing the next crash, and none of the specifics of the technology here seem likely to have a role in the next death," he said. What matters is, "'Is somebody going to try this again?' Of course. Will one of them eventually get unlucky enough to die unless something changes? Seems pretty likely."
Koopman offers education as one potential solution. In January, he authored a paper proposing new language for discussing the capabilities and limitations of automated-driving systems, which are usually classified using SAE International's engineering-minded Levels of Automation, running from Level 0 to Level 5.
Koopman favors more consumer-friendly classifications: assistive, supervised, automated and autonomous.
Such terminology could indeed provide an underpinning for behavioral changes. But, he concedes, "it's hard for education to undo an effective marketing strategy, and that's what's going on here."
Countering the Tesla culture's early-adopter, beta-test-friendly mindset may require a technical backstop. Autopilot is supposed to monitor driver engagement using steering-wheel torque.
Other automakers use inward-facing cameras to monitor drivers and ensure their eyes and attention stay on the road. These systems issue warnings when those conditions are not met and, after repeated violations, ultimately disengage the driver-assist features.
Yet after the latest crash, Consumer Reports took a Model Y to a proving ground and found Autopilot could be "easily tricked" into driving with no one in the driver's seat.
"The fact Tesla Autopilot could be used when no driver is in the driver seat is a searing indictment of how flawed Tesla's driver monitoring system is," William Wallace, Consumer Reports' manager of safety policy, told Automotive News.
"We've expressed concerns for a long, long time. … What the new demonstration showcases is that Tesla's so-called safeguards not only failed to make sure a driver is paying attention, but couldn't tell if a driver was there at all. To us, that only underscores the terrible deficiencies that exist in the system Tesla is using to verify driver engagement."
Automation complacency, already linked to at least three fatal Tesla crashes by federal investigators, is one thing; the complete absence of a driver is another.
Ensuring adequate driver monitoring could be a straightforward answer. Fixing a culture that encourages egregious driving behavior? That's a more vexing matter.