SAN FRANCISCO -- No matter how far Google Inc. comes with self-driving cars, the technology will never be perfect. Human error and a chaotic world will not allow it.
This may be why Google, prodded by an Associated Press report about accidents involving the company's self-driving cars, revealed this month that its fleet has been involved in 11 minor accidents over the course of 1.7 million miles of testing.
Google insisted its software wasn't at fault in any of the crashes, yet the data prompted a search for meaning, as analysts compared Google's record to U.S. averages and asked whether the self-driving car had a safety problem.
It was a fundamentally misguided analysis. Google's research takes place mostly on urban roads, where minor accidents are more common than on highways. Google must also disclose every accident involving its self-driving cars to the State of California, unlike the legions of drivers who don't report minor fender benders for fear of raising their insurance rates.
Yet this probing, as unfair as it was, illustrated the bar that Google's self-driving car will need to clear to be accepted. It must fit onto the road seamlessly, so drivers sharing the road aren't surprised or put at risk by its manners. It must be not only safer than human drivers, but unquestionably safe.
The blog post that Google released this month explained a handful of fender benders, but it could very well have been about the first person to be seriously hurt or killed by a self-driving car.
"Even when our software and sensors can detect a sticky situation and take action earlier and faster than an alert human driver, sometimes we won't be able to overcome the realities of speed and distance," Chris Urmson, the director of Google's self-driving cars project, wrote in a post on the website Medium. This is "important context for communities with self-driving cars on their streets," he added. "Although we wish we could avoid all accidents, some will be unavoidable."
People are wary of new technology and eager to seize on its flaws. Think back to last year, when a few incidents surfaced of battery fires in the Tesla Model S. People were alarmed, perhaps unreasonably so, given that gasoline fires take place every day. Yet this fear had a material cost: Tesla ultimately spent millions of dollars to retrofit its cars with titanium shields and defuse the controversy.
Google will find itself in the same position someday. If its cars drive enough miles, a tragic, one-in-a-million event will occur. So now, as it prepares to run pilot programs on the public roads of its hometown, Mountain View, Calif., Google must think beyond engineering -- about culture, psychology and marketing.
For a strange new technology seen as somehow intimidating, the cure is familiarity. To succeed, self-driving cars must be not just technically better, but also welcomed, which is why Google gave its prototype a rounded, friendly look. It's also why Google tests its cars in Mountain View. That's where familiarity will form fastest. This affinity must be so strong that it cannot be broken when something goes wrong.
The campaign for the hearts and minds of humans begins now.