Uber's Pedestrian-Killing Autonomous Car Reportedly Chose Not To Stop

The fatal collision involving Uber's autonomous car in Arizona earlier this year saw the vehicle's systems deliberately opt not to swerve, insiders claim, having reportedly been tuned with a higher tolerance for unexpected objects in the lane ahead. The crash took place in Tempe, AZ, in March of this year, when a car operating in self-driving mode struck a pedestrian who stepped unexpectedly into the road.

That pedestrian, 49-year-old Elaine Herzberg, was taken to the hospital but later died of her injuries. Uber's vehicle, one of the modified Volvo XC90 SUVs the company has been using in its real-world Arizona trials, was piloting itself with the ride-sharing company's software and hardware. A single safety operator in the driver's seat did not take the controls in time to prevent the crash.

The National Transportation Safety Board is working on an investigation, but two sources tell The Information that a preliminary conclusion within Uber has already been reached. According to its insiders, the SUV's systems had been set so that they "decided" evasive action was unnecessary.

That, it's suggested, could be because Herzberg and her bicycle were deemed a "false positive" by the car's perception software. A false positive would normally be something like a plastic bag, leaves, or loose paper, an obstacle the car could safely ignore. However, the sources say, Uber reportedly tuned that threshold so aggressively, in the hope of stopping false positives from triggering unnecessary braking or swerving, that the car classified Herzberg as something it could drive through rather than an obstacle to steer around.
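To illustrate the kind of trade-off the sources describe, here is a purely hypothetical sketch of how a confidence threshold in an obstacle filter can suppress a genuine hazard. The class names, labels, and numbers are invented for illustration and do not reflect Uber's actual software or the values reportedly used.

```python
# Illustrative sketch only: a toy obstacle filter showing how an
# aggressive false-positive threshold can suppress reaction to a real hazard.
# None of the names or values below reflect Uber's actual system.

from dataclasses import dataclass


@dataclass
class Detection:
    label: str         # classifier's best guess, e.g. "pedestrian" or "plastic_bag"
    confidence: float   # classifier confidence in that label, 0.0 to 1.0


def should_brake(detection: Detection, confidence_threshold: float) -> bool:
    """Trigger evasive action only for detections the system 'trusts'.

    If the threshold is tuned too aggressively to filter out false positives
    (bags, leaves, paper), a genuine obstacle detected with middling
    confidence gets treated as ignorable.
    """
    hazard_labels = {"pedestrian", "cyclist", "vehicle"}
    return detection.label in hazard_labels and detection.confidence >= confidence_threshold


# A pedestrian walking a bicycle might be detected, but with low confidence,
# because the silhouette doesn't match either class cleanly.
ambiguous = Detection(label="pedestrian", confidence=0.55)

print(should_brake(ambiguous, confidence_threshold=0.4))  # True:  cautious tuning reacts
print(should_brake(ambiguous, confidence_threshold=0.7))  # False: aggressive tuning ignores it
```

The trade-off is the familiar one from any detection system: lower the bar and the car brakes for harmless debris; raise it and it risks ignoring something real.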

Uber, in a statement, declined to comment specifically on the report. "We're actively cooperating with the NTSB in their investigation," the company said. "Out of respect for that process and the trust we've built with NTSB, we can't comment on the specifics of the incident. In the meantime, we have initiated a top-to-bottom safety review of our self-driving vehicles program, and we have brought on former NTSB Chair Christopher Hart to advise us on our overall safety culture. Our review is looking at everything from the safety of our system to our training processes for vehicle operators, and we hope to have more to say soon."

The NTSB does not look fondly on companies making public statements about investigations still in progress, something Tesla discovered earlier this year when it commented on a crash involving a Model X SUV operating on Autopilot. The NTSB responded by removing Tesla as a party to the investigation, the agency's term for companies assisting its inquiries as technical experts, while Tesla, in turn, said it would complain to the US Congress about the agency's actions.

The leaks about Uber's crash hardly come at a good time. Earlier reports suggested the company had been scaling back both the hardware and the human personnel involved in its testing, potentially in an attempt to find the minimum level of technology needed to make autonomous driving practical. In March, a report suggested Uber had cut the number of LIDAR laser range-finders on its prototype cars, the sensors used to build real-time 3D maps of the world around the vehicle.

Since the crash, Uber's autonomous fleet has been benched while the investigation is ongoing. It also opted not to renew its self-driving vehicle testing permit in California. In March, the company inked a settlement deal with Herzberg's family.

[Updated to clarify the reported adjustments to avoid false positives]
