Well, it was bound to happen at some point: A driver is suing Tesla after she crashed into a firetruck, saying the Autopilot mode failed to behave the way salespeople told her it would.
Heather Lommatzsch was driving a Tesla Model S in Autopilot mode on May 11 on Bangerter Highway in Utah when she crashed into the back of a stopped firetruck. Lommatzsch claims in her lawsuit that Tesla salespeople told her the car would stop if it detected a stopped vehicle in front of it and encouraged her to touch the steering wheel only intermittently to keep the Autopilot feature engaged.
Lommatzsch also said she tried to hit the brakes when she saw traffic stopped, but the brakes did not work. That contradicts a police report, which says Lommatzsch admitted she was looking at her cellphone when the crash occurred and that she touched the brakes only a fraction of a second before the accident.
Tesla spokesman Dave Arnold told the Associated Press that the company "has always been clear that Autopilot doesn't make the car impervious to all accidents."
But automotive insiders have long questioned Tesla's use of the name Autopilot for what is essentially an advanced adaptive cruise control system. Many worry that by overselling the product's capabilities, Tesla will turn consumers against self-driving cars because of the negative publicity that comes with crashes.
Ultimately, though, every automaker that enters the self-driving world will face lawsuits from consumers, who are the wild X factor in this system. Weird things happen. And no amount of legal fine print will stop drivers from pushing the limits of these vehicles.
-- Sharon Silke Carty