Who Deserves The Blame When Tesla Autopilot Crashes? Here's Everything We Know

Tesla's self-driving tech is once again in the spotlight. According to an analysis by The Washington Post that cites camera footage and regulatory data, there have been close to 40 crashes involving Tesla's Autopilot tech that resulted in a fatality or serious injuries. Eight of those incidents reportedly happened on roads where the driver assistance system is not supposed to be used, and that situation-specific caveat is said to be the backbone of roughly 10 lawsuits against Tesla headed to court next year.

The incidents, however, read as tales of driver negligence as much as cautionary tales of systemic failure. Following a fatal crash in 2016, the National Transportation Safety Board (NTSB) asked the National Highway Traffic Safety Administration (NHTSA) to draft rules limiting when and where tech like Autopilot can be enabled. After a deadly 2018 incident, Tesla reportedly told the NTSB that it didn't think it would be appropriate to limit Autopilot, adding that "the driver determines the acceptable operating environment."

Following similar incidents in 2020 and 2021, the NTSB, which lacks regulatory power, again asked NHTSA to draft rules preventing driver assistance tech like Autopilot from working in environments it was not designed for. However, nothing has come out of that request. NTSB chair Jennifer Homendy reportedly reached out to Musk with the same request but never heard from him.

Interestingly, in a sworn 2022 deposition first reported by Reuters, Tesla's Autopilot head Ashok Elluswamy claimed not to know of any system for disabling Autopilot based on the driving environment. He also acknowledged that drivers could "fool the system," a statement that deepens the impression of Tesla's complicity in leaving a known weakness unfixed while simultaneously pointing the finger at drivers who exploit it.

What does Tesla say about Autopilot?

Going by Tesla's own description, Autopilot is a "hands-on driver assistance system" that only partially automates tasks such as steering, parking, and lane changes. Tesla makes it abundantly clear that you must "keep your hands on the steering wheel at all times" and that the person behind the wheel has to "maintain control and responsibility for your vehicle." Tesla flashes these warnings every time the Autopilot system is activated.

But even before activation, Tesla expects you to read and acknowledge the capabilities and limitations of the Autosteer system. For lane markings alone, there's an exhaustive list of conditions in which Autosteer can fail, including excessively worn lane markings, visible previous markings, adjustments due to road construction, rapidly changing lane markings, shadows cast on lane markings, and pavement seams or high-contrast lines.

Autosteer should not be used in weather that can affect sensor performance, such as heavy rain, snow, or fog. It is also unsuitable on hills and on narrow, winding, or sharply curving roads; in extremely hot or cold temperatures; when another electronic device interferes with the sensors or bright light obstructs the camera; or on roads covered in mud, ice, or snow. Even excessive paint or adhesive-based add-ons like wraps, stickers, and rubber coatings can interfere with Autopilot functions.
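Framed as logic, that limitation list amounts to a pre-engagement gate. The sketch below illustrates the idea in Python; the condition names, the EnvironmentReport type, and the autosteer_permitted function are illustrative assumptions for this article, not anything drawn from Tesla's actual software.

```python
# Illustrative sketch of an environment-suitability gate for a driver-assist
# feature, based on the limitations listed above. The condition names and the
# function itself are assumptions for illustration, not Tesla's code.

from dataclasses import dataclass


@dataclass
class EnvironmentReport:
    heavy_rain: bool
    snow_or_fog: bool
    lane_markings_clear: bool        # worn, conflicting, or shifting markings fail this
    road_winding_or_hilly: bool
    extreme_temperature: bool
    sensor_interference: bool        # other electronics, glare, mud/ice on the road
    camera_obstructed: bool


def autosteer_permitted(env: EnvironmentReport) -> bool:
    """Return True only when none of the documented failure conditions apply."""
    blockers = [
        env.heavy_rain,
        env.snow_or_fog,
        not env.lane_markings_clear,
        env.road_winding_or_hilly,
        env.extreme_temperature,
        env.sensor_interference,
        env.camera_obstructed,
    ]
    return not any(blockers)


# Example: clear lane markings but heavy rain, so the feature should stay off.
print(autosteer_permitted(EnvironmentReport(
    heavy_rain=True, snow_or_fog=False, lane_markings_clear=True,
    road_winding_or_hilly=False, extreme_temperature=False,
    sensor_interference=False, camera_obstructed=False,
)))  # False
```

The point of the sketch is simply that every condition in Tesla's documentation could, in principle, be checked before the system engages, which is exactly what the NTSB has asked regulators to require.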

Tesla warns drivers repeatedly about inattention and can even lock them out of Autopilot for the rest of a trip after repeated violations. Still, consumer psychology toward a technology named Autopilot or Autosteer pushes drivers to let the car take control rather than always keeping a hand on the wheel and staying alert, as Tesla instructs. Expecting full compliance is almost counterintuitive when research suggests customers rarely read user manuals.

Is Tesla to blame here?

It isn't surprising, then, that multiple lawsuits target Tesla for allegedly false advertising about the safety of its Autopilot system. Tesla counters that drivers are warned to abide by the Autopilot instructions, which tell them to always remain in control of their car, and that every warning and detail is supplied in the official owner's manual. So far, Tesla's argument has prevailed in court, even though the company has suffered its share of setbacks.

In November 2023, a Florida court concluded that "reasonable evidence" suggested Musk and other high-ranking executives were aware of a malfunctioning Autopilot system yet allowed the company's cars to be operated in a potentially hazardous manner. Earlier in the same month, a civil court in Riverside County ruled that Tesla's software was not to blame in a fatal crash lawsuit seeking $400 million in compensation from the company.

A few months before that, in April 2023, a California state court handed Tesla a win in another lawsuit claiming that Autopilot's design was responsible for a serious crash. The court found that the driver assistance system, which ships with clear warnings about its limitations, was not at fault and that the blame lay with the distracted driver. It was the second high-profile case in which Tesla avoided liability by arguing that Autopilot is not a self-piloting system and always requires driver attention.

Experts say these cases pave the way for Tesla to win favorable decisions in similar legal battles down the road. But the numbers keep piling up. An investigation revealed that Tesla vehicles account for the majority of deaths in incidents involving driver assistance systems. The report linked over 700 crashes to Autopilot, 17 of which involved a fatality. Notably, the number of accidents shot up soon after Tesla decided to ditch radar sensors in favor of camera-based environmental awareness.

Accidents in the post-radar era make up two-thirds of all Autopilot and FSD-related incidents. Tesla's shift is notable because every other major name experimenting with advanced self-driving tech still relies on sophisticated radar arrays, which are less prone to failure than RGB cameras in conditions like rain and fog. Around the same time, the NHTSA launched its own investigation into Tesla crashes involving ADS tech.

Moving too fast?

Tesla cars come with a system called Safety Score that evaluates your driving using measures such as how often you exceed 85 miles per hour, time spent driving unbuckled, hard cornering, aggressive braking, and forced Autopilot disengagements after three system warnings. Combined, these factors estimate how likely your driving is to lead to a future accident, and Tesla uses the score to calculate its insurance premiums. But Tesla has also lowered the Safety Score baseline it once set, which doesn't sound good given the controversial status of its self-driving tech.
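To make the arithmetic concrete, here is a minimal sketch of how a score of this kind could be computed. The factor names mirror those listed above, but the TripMetrics type, the weights, and the penalty scaling are purely illustrative assumptions, not Tesla's actual formula.

```python
# Hypothetical sketch of a Safety Score-style calculation. The factors mirror
# those described above, but the weights and scaling are illustrative
# assumptions, not Tesla's actual formula.

from dataclasses import dataclass


@dataclass
class TripMetrics:
    speeding_over_85_ratio: float          # share of driving time above 85 mph
    unbuckled_ratio: float                 # share of driving time without a seat belt
    hard_cornering_per_1000_mi: float      # aggressive-turning events per 1,000 miles
    hard_braking_per_1000_mi: float        # hard-braking events per 1,000 miles
    forced_autopilot_disengagements: int   # strikes after three ignored warnings


# Purely illustrative weights: a larger weight means a bigger penalty per unit.
WEIGHTS = {
    "speeding_over_85_ratio": 40.0,
    "unbuckled_ratio": 30.0,
    "hard_cornering_per_1000_mi": 0.5,
    "hard_braking_per_1000_mi": 0.5,
    "forced_autopilot_disengagements": 10.0,
}


def safety_score(m: TripMetrics) -> float:
    """Start from a perfect 100 and subtract weighted penalties, floored at 0."""
    penalty = (
        WEIGHTS["speeding_over_85_ratio"] * m.speeding_over_85_ratio
        + WEIGHTS["unbuckled_ratio"] * m.unbuckled_ratio
        + WEIGHTS["hard_cornering_per_1000_mi"] * m.hard_cornering_per_1000_mi
        + WEIGHTS["hard_braking_per_1000_mi"] * m.hard_braking_per_1000_mi
        + WEIGHTS["forced_autopilot_disengagements"] * m.forced_autopilot_disengagements
    )
    return max(0.0, 100.0 - penalty)


# Example: light speeding, some hard braking, one forced disengagement.
print(safety_score(TripMetrics(0.02, 0.0, 4.0, 6.0, 1)))  # roughly 84.2
```

However Tesla actually weights these factors, the basic idea is the same: risky behaviors subtract from a starting score, and the resulting number feeds both insurance pricing and, as the next section shows, eligibility for FSD Beta.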

Back in 2021, Tesla initially required a perfect 100/100 Safety Score to grant access to its Full Self-Driving (FSD) Beta package. That requirement has since been lowered significantly: a year later, some users reported receiving the FSD Beta update with a Safety Score as low as 62 out of 100. Tesla had another eligibility criterion for seeding the ambitious bundle, too. Initially, FSD Beta was only offered to owners with more than 100 Autopilot miles, but that threshold has apparently also slipped, with one user reporting they received the FSD Beta update on their Model 3 with only 28 Autopilot miles.

Both Tesla and Musk have made grand promises about the safety guardrails of the brand's self-driving system, and given the vast dataset and sheer computing power at Tesla's disposal, plus the progress of rivals, one is inclined to believe them. Yet if the tech were that good, there wouldn't be allegations of Tesla staging a Full Self-Driving demo video. Tesla whistleblower Lukasz Krupski told the BBC that the Autopilot hardware stack is not ready. "It affects all of us because we are essentially experiments in public roads," he said.

The Autopilot issues are serious safety risks on their own, but it's the massive scale of deployment that multiplies the concerns raised by activists and agencies like the NTSB. The latest example is Tesla's recall of roughly two million vehicles in December 2023 to fix Autopilot safety issues highlighted by the NHTSA.

Can Tesla be held liable?

As described above, Tesla has informed, and repeatedly warned, customers about the conditions that are not suitable for enabling Autopilot. The company even takes measures such as deactivating Autopilot for the rest of a trip after violations. That raises an obvious question: if a Tesla can recognize conditions that are unsafe for Autopilot, why doesn't it prevent the system from engaging there in the first place? Former NHTSA administrator Steven Cliff raised the same question in an interview with The Washington Post.

Whether Tesla can be held liable for Autopilot-related crashes is far from straightforward. Mark Geistfeld, a professor of civil litigation at New York University, notes that carmakers should account, during the design process, for the complacency that driver assistance technology can encourage in users. The liability picture changes again once ADS tech goes mainstream and is marketed as truly self-driving rather than as a system that still relies on a human.

"Once these vehicles become fully autonomous, then there's just the car. The human in the car isn't even an element in the situation," Geistfeld says. Experts at Yosha Law also point out that in the current scenario, if an Autopilot system falters in scenarios such as not allowing the driver to take control, locked brakes, and steering erratically, it could make a potential case for liability. 

Tesla is no stranger to such problems, either, as issues like phantom braking are already under investigation. Phantom braking refers to a malfunction in which the car decelerates sharply even though there is no obstacle in its path to warrant braking. The Washington Post reports that phantom braking incidents are on the rise, even though multiple regulatory investigations have been launched into the issue.

On the insurance front, a Reuters investigation found that Tesla's in-house insurance program covering Autopilot incidents is unreliable. For now, we will have to wait for the NHTSA to define clear boundaries for systems like Autopilot and to implement something like a type-approval pipeline in which automakers must certify the safety and limitations of their tech before public deployment.

Is it time for a self-driving test?

Despite numerous investigations, the NHTSA has yet to formulate a concrete pathway for pre-vetting advanced automotive tech such as Tesla's Autopilot and FSD systems. Experts say we will likely have to wait until such driver assistance systems become commonplace before seeing tighter regulations covering everything from liability and insurance claims to the potential for lawsuits. So what's the solution for the buffer period in which lives are being lost?

One doesn't need to look further than the contrasting U.S. and E.U. norms for Automated Driving Systems (ADS). In August 2022, the E.U. expanded its type-approval system to cover the ADS of fully automated vehicles, requiring a carmaker to get a fresh nod from an authorized E.U. state or agency before rolling out substantive software updates or making any hardware changes. That's notable because Tesla, the most prominent name in the game, has yet to get approval for its FSD system in the E.U., and over the years the company has had to tweak some Autopilot features in Europe to comply with regulations.

In the U.S., where carmakers follow a self-certification model, the NHTSA has yet to implement such measures, even though it published a draft framework for Automated Driving System Safety in 2020. In July 2023, NHTSA Acting Administrator Ann Carlson floated the ADS-equipped Vehicle Safety Transparency and Evaluation Program (AV STEP), but it focuses more on an alternative deployment pathway for ADS cars than on putting rules with teeth in place for cars already on the road. ADS cars have become hulking computers on wheels that gain new tricks through software updates, and course corrections too often come only after a calamity. A pre-approval system with transparent self-reporting, supplemented by pre-release checks, is the need of the moment.
