As auto accidents go, it wasn't much: Twelve minutes before noon on a cool June day, a Chevrolet Bolt was rear-ended as it crawled from a stoplight in downtown San Francisco.
What made this fender bender noteworthy was the Bolt's driver: a computer.
In California, where companies such as Cruise Automation Inc. and Waymo are ramping up testing of self-driving cars, human drivers keep running into them in low-speed fender benders. The run-ins highlight an emerging culture clash between humans who often treat traffic laws as guidelines and autonomous cars that refuse to roll through a stop sign or exceed the speed limit.
"They don't drive like people. They drive like robots," said Mike Ramsey, an analyst at Gartner Inc. who specializes in advanced automotive technologies. "They're odd and that's why they get hit."
Companies are now testing autonomous vehicles from Phoenix to Pittsburgh, and developers are closely watching how the cars interact with their human-driven counterparts as they prepare for a future in which the two will share the road.
What they've found is that while the public may most fear a marauding vehicle without a driver behind the wheel, the reality is that the vehicles are overly cautious. They creep out from stop signs after coming to a complete stop and, unlike humans, mostly obey the letter of the law.
Smoothing out that interaction is one of the most important tasks ahead for developers of the technology, says Karl Iagnemma, CEO of NuTonomy Inc., which makes self-driving software.
"If the cars drive in a way that's really distinct from the way that every other motorist on the road is driving, there will be in the worst case accidents and in the best case frustration," he said. "What that's going to lead to is a lower likelihood that the public is going to accept the technology."