The self-driving car is supposed to lead to a world without accidents, but it is achieving the exact opposite right now: According to Bloomberg, autonomous vehicles have racked up a crash rate double that of those with human drivers.
The glitch? They obey the law without exception. That may sound like the right way to program a robot to drive a car, but good luck trying to merge onto a chaotic, jam-packed highway with traffic flying along well above the speed limit.
As accidents have piled up, all of them minor scrape-ups, arguments among programmers at places such as Google and Carnegie Mellon University are heating up: Should they teach the cars how to commit infractions from time to time to stay out of trouble?
Last year, Raj Rajkumar of the General Motors-Carnegie Mellon Autonomous Driving Collaborative Research Lab in Pittsburgh offered members of Congress test drives in his lab's self-driving Cadillac SRX crossover. The Caddy performed perfectly, except when it had to merge onto Interstate 395 South and swing across three lanes of traffic in 150 yards to head toward the Pentagon. The car's cameras and laser sensors gave it a 360-degree view of the surrounding traffic, but it had no way to trust that drivers would make room in the ceaseless flow. The human minder had to take control to complete the maneuver.
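Rajkumar's account hints at why strictly legal driving fails in exactly this situation. As a rough illustration only, here is a minimal, hypothetical Python sketch of a rule-based merge planner that never speeds and never accepts a tight gap; every constant, threshold, and function name is invented for illustration and does not reflect Google's or Carnegie Mellon's actual software. In traffic that is both dense and moving above the limit, no gap ever passes both tests, so the planner can only wait, and on a short ramp waiting means handing control back to a person.

```python
# Hypothetical sketch (not any lab's actual code): a letter-of-the-law
# merge planner that refuses to exceed the speed limit or accept a
# tight gap. In dense, above-limit traffic, no gap ever qualifies.

from dataclasses import dataclass

SPEED_LIMIT_MPS = 29.0   # assumed posted limit, roughly 65 mph
MIN_GAP_S = 3.0          # assumed "strictly safe" time headway

@dataclass
class Gap:
    lead_time_s: float    # time headway to the car ahead of the gap
    trail_time_s: float   # time headway to the car behind the gap
    flow_speed_mps: float # speed of the surrounding traffic

def can_merge(gap: Gap) -> bool:
    """Accept a gap only if both headways are generous AND matching
    the flow would not require breaking the speed limit."""
    safe_headways = (gap.lead_time_s >= MIN_GAP_S and
                     gap.trail_time_s >= MIN_GAP_S)
    legal_speed = gap.flow_speed_mps <= SPEED_LIMIT_MPS
    return safe_headways and legal_speed

def plan_merge(gaps: list[Gap]) -> str:
    for gap in gaps:
        if can_merge(gap):
            return "MERGE"
    # No gap satisfied the rules: the car waits, and on a 150-yard
    # ramp waiting means the human minder has to take over.
    return "REQUEST_HUMAN_TAKEOVER"

# Traffic "flying along well above the speed limit" with tight spacing:
dense_traffic = [Gap(1.8, 1.5, 33.5), Gap(2.2, 2.0, 34.0)]
print(plan_merge(dense_traffic))  # -> REQUEST_HUMAN_TAKEOVER
```

Loosening either check, shaving the headway threshold or letting the car briefly match an illegal flow speed, is precisely the kind of deliberate infraction the programmers are debating.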
"We end up being cautious," Rajkumar said. "We don't want to get into an accident because that would be front-page news."