SAN JOSE, Calif. -- Audi's engineers designed "Jack," their latest self-driving car prototype, to deal with many hazards.
Tumbleweed wasn't one of them.
Jack, an A7 sedan guided by nearly two dozen lasers, cameras and sensors, was taking a 550-mile drive to the International CES technology expo in Las Vegas in January when, on a remote stretch of desert highway, a piece of tumbleweed got stuck to the grille.
It stayed there for 10 or 15 miles, blocking some of Jack's sensors, Daniel Lipinski, a senior engineer at Audi, recalled last week at a conference here. Jack managed to stay on course, but the episode illustrated one of the most daunting challenges in designing self-driving cars: teaching them to handle the unexpected.
"We can't program a car to do every behavior," said Jen-Hsun Huang, CEO of chipmaker Nvidia Corp., which hosted the conference.
It's that challenge that has plunged Nvidia and other tech companies into the field of "deep learning," which aims to train computers to process data in a way that mimics the human brain.
Automakers and suppliers have historically used brute force in designing "driver assist" features that prevent cars from crashing. They run extensive tests to prepare for a wide range of situations, and if testing shows that an automatic braking system can't recognize a baby deer, for instance, engineers tweak the algorithm until it produces the right result.
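The flavor of that rule-based approach can be sketched in a few lines of Python. Everything below is invented for illustration; real braking software is proprietary and far more involved:

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    height_m: float    # estimated height of the detected object
    width_m: float     # estimated width
    distance_m: float  # range to the object

def should_brake(obs: Obstacle) -> bool:
    """Hand-written rules, each one patched in after a failed test."""
    # Rule added after testing showed small animals (a baby deer, say)
    # slipped past the original size threshold:
    if obs.height_m < 0.5 and obs.distance_m < 30:
        return True
    # Original rule: brake for large objects close ahead.
    if obs.width_m > 1.0 and obs.distance_m < 20:
        return True
    # Rules like these can't tell a deer from a tumbleweed, and any
    # situation no engineer anticipated falls through unhandled.
    return False

print(should_brake(Obstacle(height_m=0.4, width_m=0.3, distance_m=25.0)))  # True
```

Each unexpected failure means another hand-written rule, which is exactly the treadmill deep learning is meant to escape.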
Nvidia, whose clients include Audi and Tesla, aims to supplement those technologies with onboard computers that could process signals from cameras and sensors and quickly compare them with a vast and ever-growing database of known driving situations. Such computers could "learn the behavior of driving over time and can be updated over time to be smarter and smarter and better and better at driving," Huang said.
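The learned approach can be sketched the same way. In the toy Python below, the decision logic lives in weight matrices that training would normally fit to millions of labeled road images; the numbers here are random stand-ins, not anything Nvidia or Audi ships:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend sensor input: a flattened 8x8 grayscale image patch.
x = rng.random(64)

# Weights that training would distill from labeled driving examples;
# random here, purely for illustration.
W1, b1 = rng.standard_normal((32, 64)), np.zeros(32)
W2, b2 = rng.standard_normal((3, 32)), np.zeros(3)

h = np.maximum(0.0, W1 @ x + b1)   # hidden layer with ReLU activation
scores = W2 @ h + b2               # one score per candidate label
labels = ["clear road", "deer", "tumbleweed"]
print(labels[int(np.argmax(scores))])
```

The point of the architecture is that no engineer writes an if-statement per hazard: retraining on new examples updates the weights, which is what lets such a system "be updated over time to be smarter and smarter."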
The company's first offering with that capability is its Drive PX automotive computer, which will be available in May with a $10,000 price tag for a developer kit. (A production chip would cost less.)
Some experts suggest that deep learning will replace hand-coded algorithms and upend the field of suppliers working to refine autonomous driving through conventional programming. Others expect deep learning to supplement those algorithms, not replace them.
Israel-based Mobileye has started shipping its EyeQ3 chips, widely used for automatic braking systems, with deep learning technology alongside its conventional algorithms, Barclays analyst Brian Johnson wrote in a note to investors this month. He dismissed the notion that deep learning would obviate Mobileye's portfolio of algorithms, developed over 15 years.
But deep learning is improving rapidly. Microsoft said Feb. 6 that its supercomputer was the first to beat a human in recognizing pictures from the ImageNet database, which includes 1.5 million images in 1,000 categories, from dogs to fruit.
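ImageNet contests are typically scored by "top-5 error": a guess counts as correct if the true label appears among the system's five highest-scoring categories. A toy Python illustration of that scoring, with made-up numbers:

```python
import numpy as np

rng = np.random.default_rng(1)
scores = rng.random(1000)   # toy scores, one per ImageNet category
true_label = 417            # arbitrary category index, for illustration

top5 = np.argsort(scores)[-5:]   # indices of the 5 highest scores
print("top-5 correct:", true_label in top5)
```

By that measure, Microsoft's system edged past the roughly 5 percent error rate humans achieve on the same images.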
With the right training, a car could soon be better than a human at recognizing a baby deer, a tumbleweed and other objects. That's hugely important for automakers such as Audi and Tesla that are trying to relieve drivers of the burden of driving.
"In a complex suburban environment, that's where you get a lot of unexpected things happening," Tesla CEO Elon Musk said at last week's conference. "A road closure, a manhole cover open. Children playing is a big issue ... being able to recognize what you're seeing and make the right decision in a suburban environment in that 10 mph to 50 mph zone is the challenging portion."
Tesla uses Nvidia's chips to power the instrument cluster and infotainment system in the Model S sedan, and Musk said advances in computing have left him confident about development of self-driving cars.
"I almost view it as a solved problem," he said. "We know exactly what to do, and we'll be there in a few years."