With his tweets about flamethrowers, tunnels and a plan to build a "cyborg dragon," Elon Musk has shifted into ludicrous mode. Meanwhile, his car company may be putting people in danger.
Tesla's Autopilot, like similar driver-assist features from other automakers, has great potential. But there's a reason the others don't roll out such technology without exhaustive testing of their own; Tesla, by contrast, is counting on the general public to conduct that testing.
A recent crash in Utah shows what's wrong with Tesla's strategy. A 28-year-old driver says she had Autopilot engaged and admits looking at her phone when her Model S rear-ended a firetruck waiting at a red light.
Because Autopilot was involved, the incident has naturally attracted attention, even though the only injuries were a broken ankle for the Tesla driver and possible whiplash for the firetruck driver. Musk responded in what has become his usual fashion, blaming the media for being unfair and riling up his cult of defenders.
"It's super messed up that a Tesla crash resulting in a broken ankle is front page news and the ~40,000 people who died in US auto accidents alone in past year get almost no coverage," Musk tweeted.
He continued: "What's actually amazing about this accident is that a Model S hit a fire truck at 60 mph and the driver only broke an ankle. An impact at that speed usually results in severe injury or death."
Most of those other crashes aren't national news because they don't involve a new technology promoted as saving lives. Beyond that, Musk's focus on the fact that the driver survived relatively unscathed misses the bigger, more important picture.
What if it hadn't been a sturdy firetruck in front of the car? What if it had been you, in a vehicle that wouldn't absorb such a violent impact so well? And if Autopilot can't detect a firetruck, who's to say it could see you?
Musk's statements, his clash with the National Transportation Safety Board over the investigation of Autopilot's role in a fatal Model X crash in California, and even the decision to use the name Autopilot suggest that he's concerned only with his own customers. What about the rest of us who share the road with Tesla buyers, some of whom seem to think they can stop paying attention because the car will handle the job of not crashing into things?
Vehicles have to be designed not only to protect their occupants, but also so they don't endanger others on the road. Musk has acknowledged that Autopilot "needs to be better" and said Tesla works "to improve it every day." But he said technology that, "on balance, saves lives & reduces injuries should be released."
Recently in Arizona, a software error made an Uber self-driving Volvo decide not to stop for a woman walking her bike across a road. Elaine Herzberg didn't agree to Uber's testing, and I'm sure her family isn't comforted by the idea that autonomous vehicles can, "on balance," kill fewer pedestrians.
Musk might be right that Autopilot can help reduce the appalling number of deaths and injuries that happen on U.S. roads every year, and there's good evidence that Tesla's cars generally protect their occupants well when crashes do occur.
But statistics become meaningless if the last thing someone sees is a Tesla coming up fast in the rearview mirror.