Self-Driving Cars May Need To Be Bad Drivers To Succeed

A self-driving car getting pulled over by traffic police sounds like the subject of an xkcd comic, but the Google autonomous car's recent run-in with the law shows the robots have a lot to learn. Getting stopped for driving too slowly amid other traffic might only be the tip of the iceberg, in fact, and it's entirely possible that autonomous vehicles will need to learn to be worse on the road in order to fit in.

As it turns out, Google's car managed to escape with a warning rather than a ticket – not specifically because it was driving 24 mph in a 35 mph zone, but because it was holding up traffic behind it, an offense under section 22400(a) of the vehicle code – at the judgment of the traffic cop who stopped it. All the same, the incident underscores the fact that coexisting with human drivers is going to be more complex for self-driving vehicles than you might guess.

A large part of the problem comes from autonomous vehicles toeing the line while human-driven cars do not. The legal circus around putting a computer at the wheel is so complex that manufacturers are, for the most part, incredibly cautious about how their test cars drive.

After all, when the first autonomous car crash happens – and, make no mistake, it's a case of "when" not "if" – the lawyer fees involved will be phenomenal, as automakers, owners, insurance companies, and various other interested parties weigh in to figure out just who might be to blame and who should foot the bill.

No surprise, then, that the self-driving cars we've seen so far err on the side of conservatism. The problem is that, even with the threat of traffic cops and speed cameras, many human drivers don't do the same.

Cars that come to a complete stop at a junction, and that wait a few seconds after the lights have changed before pulling away, have already seen Google repairing bumpers after human drivers – not used to their fellow road users following the letter of the law – rear-ended them.

Meanwhile, others have figured out that swerving suddenly in front of an autonomous car is a useful way to make it slam on the brakes if you've left merging into a lane too late. In a recent demo of a work-in-progress Honda traffic jam assist system, which can track the car ahead and follow road markings, the engineers seemed surprised when it was pointed out that opportunistic human drivers would quickly take advantage of the too-generous minimum distance left between the prototype and the vehicle in front.

Since the Honda automatically slowed to re-open the gap whenever that happened, you could easily find yourself stuck in traffic a lot longer than you really needed to be.

To the engineers, though, the thought of further reducing that gap – as a human driver might – was anathema: it would be, they argued, a compromise on safety. I've observed the same when it comes to the minimum distance most adaptive cruise control systems can be set to; even the lowest level starts to feel unduly cautious on the freeway.
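
To make the dynamic concrete, here is a minimal sketch of the kind of gap-keeping rule described above – a generic time-headway follower, not Honda's actual system. The function name, the two-second headway, and the gain are all illustrative assumptions.

# A minimal sketch of the gap-keeping behavior described above -- a generic
# time-headway follower, not Honda's actual system. The function name, the
# two-second headway, and the gain are all illustrative assumptions.

def follow_speed(ego_speed_mps: float,
                 gap_m: float,
                 min_headway_s: float = 2.0,
                 max_decel_mps2: float = 2.0,
                 dt_s: float = 0.1) -> float:
    """Return the ego car's speed for the next control step.

    The car tries to keep at least min_headway_s of following distance.
    If another driver cuts in and shrinks the gap below that threshold,
    it sheds speed (limited to a comfortable deceleration) until the gap
    re-opens -- exactly the behavior an opportunistic merger can exploit.
    """
    desired_gap_m = ego_speed_mps * min_headway_s
    if gap_m >= desired_gap_m:
        return ego_speed_mps          # gap is comfortable; hold speed
    shortfall_m = desired_gap_m - gap_m
    decel = min(max_decel_mps2, 0.5 * shortfall_m)   # illustrative gain
    return max(0.0, ego_speed_mps - decel * dt_s)


# Example: cruising at 25 m/s (about 56 mph) when a car cuts in 20 m ahead.
# The desired gap at that speed is 50 m, so the controller starts braking.
print(follow_speed(ego_speed_mps=25.0, gap_m=20.0))

The point of the sketch is the if-branch: as long as the rule always yields to a shrinking gap, a human who dives into that gap is rewarded, and the autonomous car simply falls further back.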

Cutting off a Google car when you're late for work mightn't sound like too big a deal, but if traditional drivers end up feeling encouraged to take more risks on the road, the potential safety benefits of autonomy could end up ironically diluted.

The answer might be in making cars that are smarter about how dumbed-down they drive. HERE, the mapping company which Audi, BMW, and Mercedes-Benz are in the process of acquiring from Nokia, has previously argued that "humanized driving" is more appropriate for the real world than the rolling equivalent of the DMV handbook.

Such a vehicle might learn from the habits, patterns, and driving styles of its owner, and then gradually mimic them itself. That could help reduce passenger discomfort, since the ride would be a more familiar experience, but also prove less "other-worldly" to the rest of traffic.
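
As a thought experiment – and nothing more, since HERE hasn't published how such a system would actually work – a style-learning planner might look something like this, with every name and number below a made-up assumption.

# A thought-experiment sketch of "humanized driving": learn a simple style
# profile from trips the owner drove manually, then blend it into the
# planner's speed target. This is not HERE's published approach; every
# name and number below is a made-up assumption.

from statistics import median

def learn_style(trip_log):
    """trip_log: iterable of (speed_limit_mph, actual_speed_mph, headway_s)
    samples recorded while the owner was driving."""
    speed_offsets = [actual - limit for limit, actual, _ in trip_log]
    headways = [headway for _, _, headway in trip_log]
    return {
        "speed_offset_mph": median(speed_offsets),  # habit: how far over/under the limit
        "headway_s": median(headways),              # habit: preferred following time
    }

def target_speed(limit_mph, style, max_over_limit_mph=0.0):
    """Blend the owner's habits with the posted limit. The cap is the legal
    tension the article describes: mimic the human, but never actually speed."""
    desired = limit_mph + style["speed_offset_mph"]
    return min(desired, limit_mph + max_over_limit_mph)

# Example: an owner who habitually runs ~4 mph over the limit with a 1.5 s gap.
log = [(35, 39, 1.5), (45, 49, 1.4), (65, 68, 1.6)]
style = learn_style(log)
print(style)                    # {'speed_offset_mph': 4, 'headway_s': 1.5}
print(target_speed(35, style))  # capped back to 35

Even in this toy version, the awkward part is visible: the moment the learned offset is clamped back to the posted limit, the car stops behaving like its owner – which is exactly the Catch-22 discussed below.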

Another possibility is improving communication between a self-driving car and other road users. Mercedes-Benz, for instance, has been exploring visual cues by which humans on foot, on bikes, and at the wheel might get a better idea of what the computerized driver intends, given the absence of the usual eye-contact, hand signals, and other gestural indications we commonly rely upon.

The German company's recent autonomous concepts have included large, LED-encrusted grilles front and back, which can flash up messages like "STOP" or even relay the movement of a pedestrian passing in front of the car to the drivers waiting behind. The F 015 Luxury in Motion concept went a stage further, with a laser integrated into the bumper that could project a virtual crosswalk, inviting those on foot to cross safe in the knowledge that they'd been spotted.

At this point, manufacturers find themselves in a Catch-22. If they build their vehicles to follow the law as printed, they send autonomous drivers out into a world where few others actually observe the same regulations.

If they were to program in more human-style behaviors, such as exceeding posted speed limits and making rolling stops at junctions, they open themselves to legal controversy.

Having talked to many of the teams behind the current crop of autonomous projects, I can say that none of them has figured out the "correct" answer yet. Self-driving cars finding a place on the road en masse may end up depending less on creating the perfect computerized pilot, and more on understanding how to drive badly, safely.
