A high-profile fatality in a Tesla Model S operated in Autopilot mode reveals that the push toward self-driving cars still has many unsolved problems.
The biggest issue: Automakers and researchers are grappling with the human aspect of self-driving cars. When should drivers take over? Will they have enough time to react if they aren’t paying attention to the road? Or should the car do all the driving, all the time?
“How does the driver know what is their responsibility and what is the vehicle’s responsibility?” said Jim Sayer, director of the University of Michigan Transportation Research Institute. If a car’s computer system can’t detect that there’s a problem ahead on the road, it wouldn’t even know to alert the driver, he said.
Carmakers are trying to understand how humans behave when they’re retaking control of a vehicle that has been driving in autonomous mode. Audi is researching that topic in cooperation with the Virginia Tech Transportation Institute. One aim of the study is to figure out what people do when they’re not driving and how fast they can take over, said Brad Stertz, spokesman for Audi of America.
So far, it appears the worst-case scenario is when people are daydreaming. “That’s slowest,” Stertz said. “The next slowest is when they’re absorbed in some sort of device like a smartphone or a book.”
The Tesla debate also raises the question of whether autonomous driving and traditional manual driving should coexist in the same car. Google has already decided the two cannot, committing to vehicles that have no steering wheels or pedals.