Op-Ed by the editor
Self-driving cars are all over the news today, and it's no wonder: they're crashing all over the place. Well, it would seem that way from the news coverage. Of course, the novelty of the technology makes it newsworthy, so it's bound to be overreported. I've never seen statistics on accidents in self-driving cars versus human-operated vehicles, so it's hard to say how they actually compare. My question is: why are self-driving cars such a "thing" to begin with?
I don’t get the hype. So here are my reasons they won’t catch on for at least another five years, and even then only in certain markets.
1) The Fahrvergnügen Factor (People like driving)
Look, it’s fun to drive. Millennials seem less excited about owning a car as a status symbol because it’s no longer a means of staying connected with social groups…they have cell phones for that. Nonetheless, there are still a few generations that came before, mine included, that simply like to drive. A couple of generations before mine actually went on Sunday drives with the kids in the car. For a writer/editor like me, driving is the second most creative time I get in the day (second only to taking a shower). I don’t see us running out of people who love to drive anytime soon, barring some kind of cultural mega-shift or legislative mandate.
2) The Control Conundrum (People are control freaks)
No small number of people dislike being passengers if they can help it. Maybe they’ll make an exception to ride in the back of a limo, but otherwise they’d rather be behind the wheel and in control. I drive my family 588 miles each way to visit the in-laws for holidays, and I don’t even use cruise control. I want to feel the road and the feedback from the controls. I don’t like it when I see something coming long before the person driving does. Especially if it’s my wife. My wife prefers that I drive, so that helps.
3) A Solution in Search of a Problem
I don’t know anyone who hates driving, and for those who do, it’s cheaper to take public transit. Frankly, most people buy a vehicle because they find owning their own convenient. Those people don’t mind driving themselves. If you’re going to go to the expense of owning your own vehicle, learning to operate it yourself isn’t a big burden. To be fair, I understand that having a kitchen doesn’t stop you from eating out, but get real: there’s a time/effort/quality difference between cooking my own dinner and what I’d get from a professional. Whether I drive myself or my car drives me, it takes roughly the same time and effort.
4) DWI (Driving While an Idiot)
The elephant in the room is impaired driving. Lots of people drive under the influence of some kind of chemical. More drive while tired or distracted. Even otherwise safe drivers can have a bad day. Wouldn’t it be better to let a robot drive you home after you get fired, or your wife serves you divorce papers, or on April 15th…or, let’s face it, on a Wednesday? I can’t be the only one angry at the idiots on hump day. But if you shouldn’t be operating your vehicle, there are already options: taxis, ride-share companies, mass transit, and so on. If you have enough judgment to recognize that you shouldn’t drive, you aren’t without options. Get a hotel room. Call a friend. I maintain that very few people would be more likely to use a robot than a ride-share service, even though it’s already at their fingertips.
5) The Cost-Benefit Analysis
It’s not safe unless everyone does it, for all the reasons above. As we see in the BostonGlobe.com headline “Uber resumes self-driving car program after brief suspension,” or in “Tesla ‘autopilot’ car hits Phoenix police motorcycle” by Megan Cassidy of the Arizona Republic, developers are quick to point out that it wasn’t their robot’s fault. Oh, good; as long as the robot didn’t cause the accident, I don’t mind the whiplash so much.
I think we’re further away from self-driving cars than the experts like to think. Programmers underestimate the subtleties of simple tasks more often than not. Look at walking androids. They’ve come a long way recently, but it has taken decades to conquer the simple act of balancing on two feet, something toddlers manage in a few months. Why do you have to get uninsured motorist insurance when, by law, everyone must have insurance? Because sometimes people don’t do what they’re supposed to do. Even if your robot car can navigate from point A to point B without striking an object or running a red light, can it truly avoid being crashed into? Naturally that’s part of the testing and a goal of every developer, but some accidents are unavoidable. What will the car do when fatal impact is inevitable? More to the point, who’s going to buy it once they find out the robot will let you die if that saves more people in the other car?
6) The Paranoid Reasoning
If I get one more email requiring me to select a stronger password because a company got hacked, I’ll go out of my gourd. Especially when they’ve convinced me it’ll be convenient to store my credit card information on their “secure” server. There’s no such thing as unhackable. How long after we have self-driving cars before terrorists start uploading viruses to cause 500-car pileups in a major city? How long after that before law enforcement simply locks your car doors shut and remotely drives it to a police station when they want to talk to you? I know it’s paranoid, but it’s a different world than the one I grew up in. Let’s face it: the fact that there are people more willing to trust technology than a random person tells us we’ve turned a corner in our ability to accept that the real world comes with an acceptable element of risk.
Well, I think we’re still at least five years away from achieving the technology, and at least that long from it becoming socially desirable. People like me will thump our tubs and proclaim that the amount of personal autonomy we’d have to sacrifice to gain a tiny margin of safety just isn’t a fair trade. What do you think? Please comment below.