The real problem with driverless cars: human drivers
Self-driving cars obey traffic laws, confusing human drivers who often don’t
There’s a big problem with driverless cars: human drivers. Google’s fleet of autonomous vehicles has been involved in accidents at twice the normal rate, all of them technically the fault of human drivers.
The self-driving cars that Google and others have been testing on public streets keep getting rear-ended, apparently because they’re too law-abiding and too careful.
The cars, after all, are programmed to obey all traffic laws. When they come to a stop sign, they stop. If a bicyclist is taking up part of a lane, they don’t swerve across the double line to go around, they slow down or stop. If a pedestrian looks like he might be about to cross the street, the car stops.
All of this may help explain why none of the cars has been involved in an accident involving injuries or fatalities. But it is forcing programmers to develop algorithms that are a little more flexible than the ones Google uses to look up the date of the Norman Invasion or other clear-cut factlets.
That reasoning is partly behind the California DMV’s move to map out new regulations for driverless cars. Basically, it wants them equipped with a licensed human who can take charge when the software runs out of options or makes a choice that is logical but may not be ideal.
Google has decried the DMV’s proposal as a wrong turn for the autonomous vehicle movement, but the DMV says its first responsibility is to the public, and it’s not yet ready to abandon human ingenuity for rote software.
Google says it’s working to make its cars react more like humans, making them a bit more aggressive without being reckless.
As for California’s proposed regulations, top federal regulators say they’re concerned that different states will develop a “patchwork” of laws that would hinder a nationwide rollout of self-driving cars.
“Nimble, flexible …”
The National Highway Traffic Safety Administration (NHTSA) doesn’t yet have a position on California’s proposal that every car come equipped with a human driver, said Mark Rosekind, the agency’s administrator.
He said his agency favors a “nimble, flexible” approach to writing rules for driverless cars. States, of course, have long set their own rules about licensing and registering cars and drivers, so it is going to require some flexibility and nimble footwork by Rosekind’s agency if it intends to impose a single standard nationwide.
Meanwhile, consumers are coming up with proposals of their own. An Automotive News reader, Jerry Segers, had a simple suggestion for the problem of self-driving cars being rear-ended:
“Perhaps all that is needed is a sign on the rear and [sic] of the car that reads ‘Driverless Car’ much like there are ‘Student Driver’ signs on driver training cars. This would put the public on notice that this car will obey the law much like a student driver. This would increase the caution of other drivers until the time when most cars are driverless,” Segers wrote.