We got some insight into the interaction between a driver and Tesla's Autopilot system this week when the National Transportation Safety Board released the factual findings from its investigation of a fatal May 2016 crash in Florida.
The system was designed to sense when a driver did not have his or her hands on the wheel. It would flash a warning on the dashboard and then, if the driver did not respond, would give an audible warning for the driver to take over control. The NTSB report said driver Joshua Brown had his hands off the wheel for 37 minutes and received six warnings to take back over. Which he did. By touching the steering wheel for one to three seconds each time and then going back to doing whatever he was doing. (Which was not, incidentally, watching Harry Potter on a portable DVD player.)
Humans have shown a great propensity to try to cheat the system, whatever that system might be. (See: Enron, Bernie Madoff, Martin Shkreli and many others.) And so it is incumbent on automakers and autonomous vehicle developers to be as creative and imaginative as possible when considering all the ways humans will try to outsmart their cars. And when those shortcuts, cheats and hacks are discovered, to remedy them as quickly as possible. Fortunately, U.S. liability law is written in such a way that companies generally cannot be penalized for fixing problems once they arise. But companies can be found liable for ignoring problems once those issues become evident.
Tesla updated the Autopilot system soon after the Brown crash, making it so that if drivers are warned three times to take back control, Autopilot shuts off and cannot be re-engaged until the car is turned off and back on again.
This week, Chris Lattner, Tesla's vice president of Autopilot software, left the company after just six months on the job. In his resume, posted online, he said Tesla's advantage in autonomous driving is that it has "tens of thousands of cars already on the road."
There has been a push for more simulated autonomous vehicle testing, which can speed the development of self-driving cars and lower its cost. But without real-world testing with actual, flawed, hubris-headed drivers, it will be impossible to predict the ways people will try to outsmart the systems.
Companies will likely have to deal with more sticky situations like the Brown crash, but self-driving cars will be better off if automakers learn from those incidents and keep improving.
-- Sharon Silke Carty