The problems with autonomous vehicles in cities

Whenever I hear that autonomous vehicles are the panacea for urban traffic woes, I’m skeptical. With every apparent advance—like the recently reported hand signal patent—I come back to the fact that they still don’t work in the rain (it hasn’t rained much in Mountain View lately) and certainly not in snow. And they’ve driven 700,000 miles, but mostly on the same subset of meticulously mapped suburban roads, not on new and unfamiliar roads across a variety of terrain and regions.

But the main issue, as I see it, is that autonomous vehicles will be unable to interact with people the way human drivers can. There’s a turn in Central Square in Cambridge from Mass Ave to Pearl Street (I happen to live a few blocks away) that’s a good example. It is right by a subway entrance and a major bus stop, so there are tons of pedestrians. It has a couple of crosswalks but no signals. To make the turn, you often have to wait until pedestrians “screen” oncoming traffic, worm your way left, and find a gap in the pedestrian traffic to slowly get onto Pearl Street. Everyone is okay with this if it’s done at a slow speed—if you waited until conditions were perfect, you’d never make the turn. But by the letter of the law, none of this is legal. To make the left, you would have to wait until there is no oncoming traffic (rare), whether there are people in the crosswalk or not. Then to cross the crosswalk, you would have to wait until there are no pedestrians within 10 feet of it, a situation that might occur here once every 45 minutes. (It should be noted that crosswalk laws vary slightly from state to state, so the rules would have to be coded differently for each jurisdiction.)

People driving cars can get through this intersection. But an autonomous car can’t be coded to break laws. If (more likely, when) it were involved in an accident, the code would be subject to discovery and examined at length. If anything were found that permitted the car to break a traffic law, it would be legal ammunition to go after the deep-pocketed developers. In cities, everyone fudges the law a bit to make things work. Pedestrians step off the curb near, if not in, the crosswalk. Cyclists nudge over the line and leave a light a few seconds early. Cars slowly make turns when they don’t have the right of way, and pedestrians break stride so the cars can get through. It’s not perfect, but it works. Sort of.

So, take a look at the picture to the right. In the middle, note that there is a person waiting at a crosswalk. An autonomous vehicle would see this person and stop to let them cross the street. But the person would just stand there. They might flick their hand, but the car won’t easily be able to interpret that as “oh, I’m waiting, go ahead.” Which puts the autonomous vehicle—or its coders—in a conundrum. If you stop for a pedestrian but they don’t cross, how long do you wait? 5 seconds? 10 seconds? A minute? Can you really have any rule there that would allow you to break the letter of the law and cross a crosswalk with a waiting pedestrian? And how often will an autonomous vehicle come to a complete stop because a person is near a crosswalk, even if they aren’t crossing? Will there have to be a manual override? How will that even work?
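To make the conundrum concrete, here is a minimal sketch (in Python, with hypothetical names and thresholds, not drawn from any real vehicle’s code) of what a hard-coded yield rule might look like. However the timeout is chosen, the rule is either illegal by the letter of the crosswalk law or paralyzed by a pedestrian who never crosses.

```python
# Hypothetical sketch: a hard-coded crosswalk-yield rule for an autonomous vehicle.
# The names, thresholds, and structure are illustrative assumptions, not any
# vendor's actual logic.

from dataclasses import dataclass

@dataclass
class Pedestrian:
    distance_to_crosswalk_ft: float  # how far from the crosswalk they are standing
    seconds_waiting: float           # how long the car has been stopped for them
    moving_toward_street: bool       # crude proxy for "actually crossing"

# Tunable constants -- and that is the problem: any value is a guess.
YIELD_RADIUS_FT = 10.0    # the "no pedestrian within 10 feet" rule
MAX_WAIT_SECONDS = 15.0   # how long to wait before giving up and proceeding

def may_proceed(pedestrians: list[Pedestrian]) -> bool:
    """Return True if the car may cross the crosswalk."""
    for p in pedestrians:
        if p.distance_to_crosswalk_ft <= YIELD_RADIUS_FT:
            if p.moving_toward_street:
                return False  # clearly crossing: the car must yield
            if p.seconds_waiting < MAX_WAIT_SECONDS:
                return False  # standing nearby: keep waiting... but for how long?
            # Past MAX_WAIT_SECONDS the car proceeds past a waiting pedestrian,
            # which is exactly the letter-of-the-law violation described above.
    return True
```

Either MAX_WAIT_SECONDS is finite and the car is, in effect, coded to eventually break the rule, or it is infinite and the car sits behind a cab-hailer indefinitely.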

Yet everyone driving a conventional car was able to quickly and easily tell that this woman was not crossing the street. Every so often she would wave her arm. An autonomous vehicle would have little ability to discern what this movement meant. Waving at a friend? At a stranger? Waving cars ahead? Waving because she was just released from an underfunded mental institution and waves at everyone? It turns out, the waves were pretty well correlated with the passage of taxicabs. Aha, she was hailing a taxi! This is something every driver was able to intuit immediately, and no one stopped for her at the crosswalk. But an autonomous vehicle would be stuck. For how long? Who knows. Imagine an autonomous vehicle traversing 5th Avenue in Midtown Manhattan: it would wind up screeching to a halt every block as a New Yorker flung out a hand to try to nab an empty cab.

Sure, there might be fixes to this specific problem. Maybe Uber will render taxis a thing of the past. (But then won’t people waiting on the curb for an Uber create the same type of problem?) Maybe every taxi will be fitted with a transponder so that an autonomous car can see a person waving, correlate the wave with the location of a nearby taxicab, and conclude that they’re hailing a cab. (Which seems like a complex solution to a simple problem.) Maybe people will learn to hail cabs only away from crosswalks (fat chance). And even if this problem is somehow solved, there are thousands of others like it. Driving in a busy city involves infinitely more scenarios than driving on a controlled-access highway. You can control for one outcome, but there are thousands of others that may pop up.
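For what it’s worth, here is a hedged sketch of the transponder idea, with invented names, data formats, and thresholds (nothing here reflects any actual taxi or vehicle system). It is a fair amount of machinery to encode something a human driver intuits at a glance: a wave is probably a hail if an empty cab happens to be passing at that moment.

```python
# Hypothetical sketch of the "taxi transponder" fix: guess whether a waving
# pedestrian is hailing a cab rather than waiting to cross. All names and
# thresholds are invented for illustration.

import math
from dataclasses import dataclass

@dataclass
class WaveEvent:
    x: float          # position of the waving pedestrian (meters, map frame)
    y: float
    timestamp: float  # seconds

@dataclass
class TaxiPing:
    x: float          # transponder-reported taxi position
    y: float
    timestamp: float
    occupied: bool    # a hail only makes sense for an empty cab

HAIL_RADIUS_M = 40.0   # how close a passing cab has to be
HAIL_WINDOW_S = 3.0    # how tightly the wave and the cab must coincide in time

def looks_like_a_hail(wave: WaveEvent, taxis: list[TaxiPing]) -> bool:
    """Guess that a wave is a taxi hail if an empty cab was nearby at the time."""
    for taxi in taxis:
        close = math.hypot(wave.x - taxi.x, wave.y - taxi.y) <= HAIL_RADIUS_M
        recent = abs(wave.timestamp - taxi.timestamp) <= HAIL_WINDOW_S
        if close and recent and not taxi.occupied:
            return True   # probably hailing: the car need not stop for this wave
    return False          # otherwise, treat the wave as a possible crossing
```

And it only helps if every cab carries a transponder and the thresholds happen to be right; a wave at a friend standing next to an empty cab still fools it.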

And many of these will be one-off scenarios. The aforementioned intersection at Pearl Street is probably unlike almost any other in the world. Cyclists are not required by law to use hand signals. A car may not signal, but may shade to the right, something some motorists (and many cyclists) know means it’s probably going to turn. To get around a double-parked car, you may have to cross the double-yellow line. Human drivers can pretty easily assess the scenario and the risks involved and decide whether to operate outside the letter of the law. But an autonomous vehicle, in any of these situations, may find itself stuck. The whole selling point of the self-driving car is that it will be able to operate without a human driver. But in cities, it seems that there would have to be frequent manual overrides when the car finds itself between a rock and a hard place: where it can’t go forward without breaking a rule.

People are good at breaking rules—and in city driving, bending the rules is frequently a necessity. Computers—by design—are not.

Autopilot and self-driving cars

Every now and again I come upon a self-driving car (or fully automated vehicle, or FAV) puff piece. Here’s Robin Chase—of the Cambridgeport Chases—talking about how we’ll have fully automated vehicles zipping around and shuttling us places in a matter of years. Here’s the AtlanticCities talking, in a very matter-of-fact way, about the steps that will be taken to give us FAVs. Look, Google has one, right here! And Bridj (a press-grabbing start-up about which I am a tad skeptical—that’s for another day) predicts we’ll have shared, automated vans carting us around in no time. Won’t it be great when we’re all riding around in FAVs?

I’m skeptical. The basic premise of four wheels and an internal combustion engine is 110 years old and hasn’t really changed. But technology is great, right? Surely it will solve this problem. Planes have autopilot, so why can’t cars?

Because planes have pilots, too. Jim Fallows had a piece recently about “what autopilot can’t do.” Autopilot doesn’t land planes, and in very adverse conditions it cannot compensate for things like gusts of wind from the side. The video he shows has pilots expertly guiding planes in at an angle, then straightening out just in time to touch down without shearing the wheels off. This is not something a computer program can do, because it requires a soft touch on the rudder—and knowledge that can’t really be programmed. 99% of the time, autopilot is great. But 99% of the time isn’t enough.

For FAVs, the same issue will arise. Sure, 99% of the time, a fully automated vehicle will do fine. And, yes, perhaps there will come a time when FAVs are allowed to operate on certain limited-access stretches of roadway. But while it’s easy (well, not easy, but relatively easy) to code a car for the same few situations that come up over and over again, it’s a bit harder to solve the myriad issues that account for the other 1%, such as:

  • a patch of ice
  • a piece of debris
  • a bicyclist darting out between two cars
  • a pedestrian jaywalking
  • the proverbial child running after the bouncing ball

Autopilot for airplanes is designed for the easy-to-solve portions of a trip: the set-it-and-forget-it parts. But it still requires two pilots to keep an eye on things. And that’s on airplanes with triple redundancy built in everywhere. As long as cars ply public roadways and run on internal combustion engines, they’ll need a human paying attention to the road ahead.

Most situations can be automated and controlled for. But driving—especially in cities—is not easily automated, given the myriad of uncontrollable factors involved. In those situations, the human brain is still, by far, the most effective tool.