Why Level 3 automated technology has failed to take hold

Two summers ago, Audi stood on the precipice of an automated-driving breakthrough. Its redesigned A8 sedans contained a system called Traffic Jam Pilot, which, when active, relieved human drivers of the need to pay attention during the tedium of stop-and-go traffic.

On the level
Not all automated vehicles are created equal. The industry has adopted these six levels of driving automation, as outlined by SAE International.
Level 0, no driving automation: The human drives. The vehicle may have features such as automatic emergency braking or blind-spot warning, but the human remains in control of the dynamic driving task.
Level 1, driver assistance: The human drives. The vehicle provides steering or brake/acceleration support, such as lane centering or adaptive cruise control.
Level 2, partial driving automation: The human is responsible for vehicle operations. The vehicle provides both steering and brake/acceleration support.
Level 3, conditional automation: The human and robot can exchange responsibility for the driving task. The vehicle can drive itself under limited conditions, but the human must take over on request.
Level 4, high driving automation: The robot does all the driving, but with certain restrictions, such as robotaxis that are limited to operating in a geofenced area.
Level 5, full driving automation: The robot does all the driving, everywhere, and in all conditions.
Trumpeted as a defining moment on the road to full autonomy, the system was the first in production that allowed humans to hand driving responsibility to the car itself for some portion of the journey — so long as a human remained available as a backup. In industry jargon, such a system is classified as Level 3 automation.
But the milestone arrived with an asterisk.
Audi equipped the A8 with all the components necessary to make Traffic Jam Pilot work, but it hadn't actually enabled the feature. Activation would come, brand executives theorized at the time, via over-the-air updates as regulatory compliance was ensured market by market.
Today, Traffic Jam Pilot remains dormant in the U.S. Audi has no foreseeable plans to activate the system, and the future of Level 3 automation for Audi and everyone else remains beset by a morass of regulatory, technical, safety, behavioral, legal and business-related complications.
Once, Level 3, sometimes called "conditional automation," seemed like a natural step in the evolution of automated technology: a progression beyond today's driver-assist systems, in which drivers retain responsibility for vehicle operations, and a precursor to self-driving vehicles. But that building-block assumption, baked into SAE International's Levels of Driving Automation, masked unique challenges.
"Some OEMs say they're going to skip Level 3 altogether and do Level 4, while others are still pursuing this possibility," says Christophe Marnat, executive vice president of the electronics and advanced driver-assist systems division at ZF. "I'm not sure how to interpret that. Does that mean they have solutions? … Or is it just research that never comes to fruition? It's questionable."
Part of the problem is the delineation of responsibility.
With Tesla's Autopilot and General Motors' Super Cruise, both Level 2 systems in the market today, that's clear. Regardless of functionality, humans remain responsible for all driving operations. In Level 4 self-driving systems, responsibility is also clear: Human occupants have zero role.
Consider Level 3 a middle ground, where responsibility can be exchanged between human and machine. When systems are in control, humans are still required in case the system encounters a situation it cannot handle.
Those handoffs, and the notion of a human backup, bring forth a series of questions and challenges:
While a system drives, can a human backup check email or watch a video? How do manufacturers ensure there's no mode confusion? What series of audio, visual and haptic cues should be used to alert a human driver they need to retake control? How long should they be given to do so? How should vehicles monitor the readiness of humans to accept a handoff? What happens if they do not accept that exchange?
"Level 3 pushes the boundaries on what you expect the human to do, and it makes it difficult to discern 'Am I driving or riding?' on a moment-to-moment basis," says Bryan Reimer, research scientist at the Center for Transportation and Logistics at the Massachusetts Institute of Technology. "If I am a human driver, how do I learn to be a fallback?"
An early experiment provided unsettling answers.
In the earliest days of its self-driving car project, Google experimented with systems that involved a handoff. Test drivers immediately over-trusted the technology, to the point where, in some cases, they fell asleep behind the wheel. Those findings alarmed engineers and made apparent the hurdles of any system in which humans remained in the loop. For Google, the implications were clear: It was easier to build a self-driving system from scratch than to keep human drivers involved.
Among traditional automakers, Ford and Volvo came to similar conclusions and have not pursued development of Level 3 systems. Others see conditional automation as valuable, particularly for traditional car owners who might pay for convenience features, such as Traffic Jam Pilot, that relieve their workload behind the wheel. Honda and Mercedes-Benz plan deployments of Level 3 systems in 2020.
Implementing Level 3 will necessitate a rethinking of user interfaces already developed for driver-assist systems and a more collaborative approach to sharing the driving task, says Artur Seidel, vice president of the Americas at Elektrobit, a supplier of software products and human-machine interface technology.
An example: If it starts raining while a Level 3 system is engaged, the system could alert its human backup that conditions are becoming more demanding.
"Then it starts to rain a little harder, and the car says, 'I think you should take over soon,' and then the next stage is the takeover," Seidel said. "It's a gradual transition, and the more we can make the system predictable without overloading the driver, it serves as a training effect."
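The gradual transition Seidel describes amounts to an escalation ladder tied to how demanding conditions are becoming. A minimal sketch of that idea, with stage names and rain-intensity thresholds that are purely illustrative assumptions and reflect no automaker's or supplier's actual software:

```python
# Illustrative sketch of a staged Level 3 takeover escalation.
# Stage names and thresholds are hypothetical, not a real implementation.

from enum import Enum

class Stage(Enum):
    SYSTEM_DRIVING = 0   # Level 3 engaged; driver may disengage
    ADVISORY = 1         # "conditions are becoming more demanding"
    TAKEOVER_SOON = 2    # "I think you should take over soon"
    TAKEOVER_NOW = 3     # formal takeover request with countdown

def escalate(rain_intensity: float) -> Stage:
    """Map a normalized rain-intensity reading (0.0 to 1.0) to an alert stage."""
    if rain_intensity < 0.3:
        return Stage.SYSTEM_DRIVING
    if rain_intensity < 0.6:
        return Stage.ADVISORY
    if rain_intensity < 0.8:
        return Stage.TAKEOVER_SOON
    return Stage.TAKEOVER_NOW
```

In this framing, each stage could map to progressively stronger audio, visual and haptic cues, making the system's behavior predictable to the driver long before a formal takeover request.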
He underscored the extent of that complexity.
"Because of the liability involved in that handover," Seidel says, "there's quite a bit more work to do on Level 3 systems. You end up in a situation where you have to — under all circumstances — handle a safe stop. We're not necessarily there."
Driver-monitoring systems have emerged as a crucial component of any system involving human drivers. By monitoring hand position on a steering wheel or tracking eye movements with in-cabin cameras, systems can ensure a human is paying attention. But they have limitations.
"It's one thing to say you're looking out the windshield and another to say your head is in the game," says Chris Van Dan Elzen, vice president of product planning at Veoneer, a software and hardware company that spun off from supplier Autoliv and focuses on driver-assist and automated-driving systems.
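In practice, driver-monitoring systems fuse several signals, such as steering-wheel touch and camera-based gaze tracking, into a judgment about attention. A bare-bones sketch under assumed signal names and thresholds (none of which come from Veoneer or any production system):

```python
# Hypothetical fusion of driver-monitoring signals into an attention check.
# Signal names and the 0.7 gaze threshold are illustrative assumptions.

def driver_attentive(hands_on_wheel: bool,
                     eyes_on_road_ratio: float,
                     min_gaze_ratio: float = 0.7) -> bool:
    """Judge attention from steering-wheel touch and in-cabin gaze tracking.

    eyes_on_road_ratio: fraction of the last few seconds during which the
    cabin camera judged the driver's gaze to be on the road ahead.
    """
    return hands_on_wheel and eyes_on_road_ratio >= min_gaze_ratio
```

Even a fused check like this only establishes where the driver is looking, not, as Van Dan Elzen puts it, whether their head is in the game.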
Determining how to measure the cognitive readiness of humans to retake responsibility has been the subject of new research. But there is neither consensus nor regulation on how long drivers should be given to accept a handoff. For Traffic Jam Pilot, Audi had estimated drivers would be ready in 10 seconds. Others believe the time needed is considerably longer.
"The driver shouldn't need to be reactivated faster than a couple of minutes," Volvo CEO Hakan Samuelsson said in November. "Otherwise, it's a very dangerous system. If you cannot do that, you have a pilot-assist system like you have today, which requires total supervision at all times by the driver."
If driver-monitoring systems have become necessary to enable Level 3, they also represent something automakers prefer to avoid — cost.
Because automakers accept liability when their systems are in charge, testing and validation will be more expensive, as will the additional hardware vehicles require in the form of computing power and redundant actuators. Audi added front-facing lidar as part of its Traffic Jam Pilot package; it remains unclear whether others will follow. Nor is it clear whether motorists will value these systems enough to bear the costs.
"That's a ton of stuff that's extremely expensive, and when you think about it, you still need to be aware of what's going on as an end consumer in this system," Marnat says. "It's expensive and limited in terms of usage. Maybe you see some consumer benefit, but there's a trade-off between how much you are willing to pay for it, knowing you are still in the loop."
Some consumers might be prohibited from using such features. In New York, for example, state law requires drivers to keep their hands on the wheel at all times, a factor in Audi's decision to delay the release of Traffic Jam Pilot.
As these issues are sorted out, automakers are rolling out increasingly advanced driver-assist features. Many in the industry have latched onto the unofficial term "Level 2 Plus," which conveys the idea that the functionality of driver-assist systems is improving. At the same time, automakers are stopping short of accepting the liability associated with Level 3.
"You feel the industry moving toward this Level 2 Plus or 2 Plus Plus or 2 Plus Plus Plus," Marnat says with a laugh. "What is the point where you stop being Level 2 Plus?"
Combine consumer acceptance with cost, uncertain regulatory landscapes with the thorny nature of control exchanges and driver monitoring, and Level 3 remains an elusive goal.