Nvidia's Drive PX automated driving platform is not like other car parts.
It seeks to replace the brain of a human driver. And to do that, it must be constantly improving itself, adding to what it knows and remembering how to do its job better.
This is not the stuff of mufflers and windshield wipers. In an era when technology suppliers are determined to enable vehicles to operate by themselves — follow roads by themselves, no matter what their condition, and anticipate traffic by themselves, no matter how it moves or surprises — there is one car part of the future that has virtually no similarity to anything in the past, and that is the brain. It is what Nvidia and the auto industry call a driving platform.
"You can never have enough computing horsepower," said Danny Shapiro, senior director of automotive at Nvidia Corp., a Silicon Valley computer processing giant with a rapidly growing supplier stake in the world auto industry. "It just gets more and more sophisticated."
As a company, Nvidia got its start developing graphics processing units for video games in the 1990s. As an auto supplier, it began working on an automotive platform only a decade ago. The latest iteration of its product, scheduled to hit the market at the end of 2017, will use its Xavier processor to handle 30 trillion operations a second — up from 24 trillion operations a second in the Drive PX 2 platform.
That ability to manage a gargantuan volume of impulses has landed the supplier a spot in the self-driving cars that are planned by Toyota Motor Corp., Volvo Cars, Audi and Tesla Inc., as well as in the automated driving platforms in development at Robert Bosch and ZF.
The current Drive PX 2 platform powers Tesla's semiautonomous Autopilot driving system. The upcoming Xavier platform will help enable Level 4 autonomy — the industry designation for future vehicles that require no human interaction in defined conditions.
Developing an automotive-grade platform to power a self-driving car, and turning it into a mass-producible component that can be delivered to auto plants for daily assembly within the industry's normal parameters of cost, time and quality, is no simple feat, even for Silicon Valley.
To achieve this level of sophistication, Nvidia has leveraged artificial intelligence technology and expertise, as well as billions of dollars and engineering resources it has built up through its offerings in other industries.
Development of the vehicle technology began with identifying the critical roles played by the human in the driver's seat. The platform would have to see everything a person sees, then process everything a human driver understands about what he or she is seeing. While a combination of camera, radar and lidar sensors can replace the human eye, and in some cases detect even more than it can, Nvidia's platform must also be able to think about what it sees.
Such a vehicle technology would have seemed improbable a decade ago. Shapiro says developing it comes down to software.
"It all comes down to math," Shapiro said. "Everything is just doing calculations, using all different kinds of algorithms."
From perceiving the colors of stop lights to comprehending the difference between an ambulance and a delivery van, the Drive PX platform has to make sense of sensor data by translating pixels to numbers that can be plugged into algorithms. Rather than develop software for every single driving situation a car may experience, the system uses artificial intelligence algorithms that can handle more information using less code.
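In rough terms, that pixels-to-numbers translation looks like the following toy sketch. It uses a made-up camera frame and is purely illustrative, not anything from Nvidia's actual pipeline:

```python
import numpy as np

# A hypothetical 4x4 RGB camera frame: each pixel is three 8-bit color values.
frame = np.random.randint(0, 256, size=(4, 4, 3), dtype=np.uint8)

# "Translating pixels to numbers": scale the 0-255 intensities into
# 0.0-1.0 floating-point values that downstream algorithms can consume.
normalized = frame.astype(np.float32) / 255.0

print(normalized.shape)  # (4, 4, 3)
```

A real driving platform does this for millions of pixels per frame, across multiple cameras, many times a second — hence Shapiro's emphasis on computing horsepower.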
This simplification is achieved through neural networks — algorithms that are able to "learn" like a human brain. Engineers can train the platform to recognize objects and situations by running them through repeated images and simulations. Instead of having to program every single street sign in a city into the platform, it can learn to recognize them on its own.
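The "learning" described above can be illustrated with a toy single-neuron classifier trained on two invented sign-like patterns. This is a deliberately simplified sketch of training by repeated examples, not Nvidia's software or network architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic 3x3 "road sign" patterns, flattened to 9-element vectors.
cross  = np.array([[0, 1, 0], [1, 1, 1], [0, 1, 0]], dtype=float).ravel()
square = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float).ravel()

# Training set: 50 noisy copies of each pattern, labeled 0 (cross) or 1 (square).
X = np.vstack([p + rng.normal(0, 0.1, 9) for p in (cross, square) for _ in range(50)])
y = np.array([0] * 50 + [1] * 50)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# A single artificial neuron, trained by gradient descent on cross-entropy loss.
w = np.zeros(9)
b = 0.0
lr = 0.5
for _ in range(500):
    pred = sigmoid(X @ w + b)
    grad = pred - y               # gradient of the loss w.r.t. the neuron's output
    w -= lr * (X.T @ grad) / len(y)
    b -= lr * grad.mean()

# After training, the neuron separates the two patterns it was shown.
print(sigmoid(cross @ w + b) < 0.5)   # True: recognized as a cross
print(sigmoid(square @ w + b) > 0.5)  # True: recognized as a square
```

The point of the sketch is the one the article makes: nobody programmed a rule for "cross" or "square." The classifier extracted the distinction from repeated labeled examples, which is how a platform can come to recognize street signs it was never explicitly coded to handle.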
"Once you run the algorithms, the platform can detect objects with incredible accuracy," Shapiro said. "We use hardware to process mathematical information, then decode the raw data into what it represents."
Sound easy?
Even Shapiro's reasonable, simple explanations undersell just what an enormous R&D endeavor the space-age technology has required. Nvidia has relied on thousands of engineers and an investment of $3 billion over the past few years. The results also required automotive-scale design and functionality: the platform could not be a big room of computers, and it could not end up a fragile, hugely expensive computer.
The resulting "component" can withstand extreme temperatures without taking up significant space under the hood. The Drive PX 2 platform was slightly larger than a laptop computer and held four processors. The upcoming Xavier platform is one-third that size, relying on a single processor and running on 30 watts of power.
The platform requires protective casing to stand up to an average 10 years of automotive wear and tear.
"Five years ago, when we first started working with Tesla, it was critical to have an automotive-grade product," Shapiro said. "We needed to ensure it would work in the desert on a hot summer day."
Reaching this point also required Nvidia itself to change. To become an automotive supplier 10 years ago, the company reworked its manufacturing processes to meet auto industry standards.
Nvidia has developed an ISO-certified production process, and its automotive products can withstand temperatures between minus-40 degrees and 221 degrees Fahrenheit.
Its testing process is now modeled after auto industry practices, Shapiro said.
Though a relative newcomer to the industry, Nvidia has been able to attract a wide range of partners with its automated driving hardware.
At its GPU Technology Conference in May, Shapiro said Nvidia was working with 225 companies in the automotive industry.
Bosch announced its partnership with Nvidia in March, marking the first venture to use the new Xavier platform to power an automated-driving supercomputer.
"Nvidia is one option we're working with, to utilize their expertise specifically in GPUs," said Kay Stepper, vice president of automated driving for Bosch. "They have an advantage of performing these types of image analyses and extracting meaningful results."
He added that the processing power of the Xavier platform solves issues of speed and accuracy in image perception, though it's still unclear how much processing power will become the standard for production-level self-driving cars.
"This is actually still a very active debate," Stepper said. "It's still a topic of very intense research for us."
Since the Bosch announcement, both Toyota and Volvo have said they would use Nvidia's platform in Level 4 self-driving vehicles. The 2018 Audi A8, introduced in July, will use the Drive PX platform for Level 3 autonomous driving functions — which still require human interaction.
Shapiro said none of Nvidia's partnerships is exclusive and the company is open to working with just about anyone in any sort of capacity. And Nvidia's partners are banking on a platform that will continue to get better and better.
"We're seeing more and more sensors and high-resolution cameras being added to cars, and it's taking up more processing," Shapiro said. "Our customers keep saying they want even more."