When I was on vacation a couple weeks ago, I lost my wallet. Oddly, the experience has made me uneasy about autonomous vehicles.
I was in Colorado at the time, and it struck me that it sure would be nice to have a driver’s license for the 1,500-mile drive back to Michigan. It was Sunday, so I went on the Michigan state website to try to get a temporary license that I could print out.
It seemed promising at first. There was a process in place -- apparently lots of people lose their licenses. Then the program told me I had to renew my license, although it would not expire for 11 months.
OK, I thought, do what the nice computer says. After navigating the usual picky screens and entering the necessary information, I thought I was set. But at the last step, the system told me that, no, I couldn’t renew my license online. I was trapped in one of those software cul-de-sacs from which no human escapes.
After a few more tries, I gave up, muttering foul imprecations.
But the next morning was Monday. Perhaps I could still get something printable before we hit the road. I called the Michigan Secretary of State office and, after escaping voice-mail jail, got a human being on the line.
I explained my predicament. The woman to whom I was talking looked at my file and, sounding puzzled, said, “I don’t know why it would tell you that you had to renew.”
Then she created a temporary license and emailed it to me for printing. It was a simple, efficient, cheerful interaction.
“Wow,” I said. “It’s so nice to deal with a human being instead of software.”
“You know it,” she said.
And that, I think, gets at a fundamental difficulty with self-driving vehicles.
Autonomous-vehicle advocates cite statistics on the overwhelmingly high percentage of serious accidents caused by driver error. Their solution is to take vehicle control away from the human driver. Hand it off to the technology that never gets drunk, never falls asleep, never chatters on the phone.
I don’t mean to downplay the seriousness of the quest for cars that don’t crash. More than 30,000 people die in traffic accidents in this country each year, and many more are seriously injured. Obviously, safety takes priority.
But letting technology drive our cars is a more nuanced issue than it may initially seem. At least at the current level of development, we’re caught between two conflicting realities.
The first is that, as any computer user can tell you, technology doesn’t always function flawlessly.
Software has bugs. Hardware malfunctions. In the case of autonomous vehicles, incoming information about road conditions and traffic can be distorted by snow, fog, wet pavement and the like.
The second reality, as my driver’s license interaction illustrates, is that an awake, alert and competent human being is a pretty awesome information-processing and decision-making unit.
That’s the dilemma that developers of self-driving vehicles face. Technology outperforms humans at their worst. But it underperforms humans at their best.