When a crash occurs, auto insurance usually follows a clear process: determine who caused the accident and pay damages.
But things get complicated when driver-assistance systems like Tesla Autopilot, Ford BlueCruise, GM Super Cruise, or driverless cars like Waymo are involved.
Was the driver the human, the software, or both? The answer affects who pays — the driver’s insurer, the automaker’s insurer, a supplier, or a combination of them.
“Technology in modern vehicles is advancing faster than most laws can adapt,” writes attorney Alex Stalvey in his blog for the law firm Bannister, Wyatt & Stalvey LLC. “These systems blur the line between driver and machine. This has become especially prevalent in the courtroom and at the negotiating table in recent months.”
Vehicle “driver-assist” and self-driving technologies are classified using the SAE (Society of Automotive Engineers) Levels of Driving Automation, which range from Level 0 to Level 5. Level 0 is no automation. Level 5 is full automation, which doesn’t yet exist on public roads.
Level 2 driver-assist systems are common in the market
Many cars today use Level 2 driver-assist systems, which offer partially or fully “hands-free” operation that the driver must supervise. These cars can steer, brake, and accelerate in certain situations.
The National Highway Traffic Safety Administration requires manufacturers and operators to report certain crashes involving automated driving systems (ADS) and Level 2 ADAS. This helps the agency find patterns and investigate possible defects.
But this reporting policy has fueled debate. A system labeled “assist” means the human driver is still legally responsible. Yet a growing number of plaintiffs argue that the technology encourages overreliance. They contend that the marketing of these systems implies the vehicle shares control and should therefore share responsibility in accidents.
“We are planning toward a day when the lawsuits will be directed against hardware and technical builders,” says Rami Sneineh, vice president and licensed insurance producer at Insurance Navy in Chicago. “The physical operation of the car is being transferred to the data center. Liability is no longer on who struck the brakes, but who composed the code.”
A jury tags Tesla with punitive damages in an Autopilot crash
An example of this shift followed a 2019 Key Largo crash involving Tesla’s Autopilot. One vehicle occupant died, and a second was badly injured.
A jury found Tesla partly responsible for the accident, even though the vehicle had a Level 2 system. The jury awarded more than $240 million in damages, including $200 million in punitive damages, to the victims and their families.
The Tesla driver, who caused the accident, admitted he was distracted by his cell phone, but the jury still decided Tesla shared responsibility. The plaintiffs argued that Autopilot’s design and marketing played a role in the crash.
Punitive damages are rare in product liability cases like this. But courts and juries appear to be weighing whether the technology lulls drivers into complacency. A 2024 study by the Insurance Institute for Highway Safety called on automakers to improve safeguards so that drivers don’t overly rely on automation for safety.
Ford BlueCruise and ‘stationary vehicle at night’ crashes
Federal regulators have focused on two fatal crashes involving Ford’s BlueCruise driver-assist system. Both occurred at night and involved the Ford Mustang Mach-E, and in both cases the Mustangs hit stopped vehicles at highway speeds.
Regulators say they’re examining whether the system properly detects stationary vehicles and if its driver-monitoring safeguards are adequate. The investigation is ongoing.
Regulatory investigations aren’t the same as courtroom liability, but they can guide future cases. If investigators find a defect or a known problem that wasn’t fixed, it can lead to lawsuits and, in some cases, recalls.
Waymo and the ‘no driver to blame’ problem
In San Francisco, a cyclist sued after a passenger exiting a Waymo vehicle opened a door into a bike lane, striking her. The lawsuit contends that Waymo’s Level 4 “Safe Exit” system failed to prevent the incident.
The case is ongoing, with arguments revolving around whether Waymo or the passenger bears primary liability for the incident.
The NHTSA also closed a separate investigation into Waymo after reports of unexpected behavior and minor crashes. The agency noted that Waymo took corrective steps, including recalls and software fixes, illustrating how “liability” can involve regulators and product safety processes, not just insurance claims.
“Shared responsibility is common in accidents, and when you’ve got something like self-driving cars in a crash, you can expect that it could involve all parties, and that will definitely complicate things,” Jae E. Lee, managing partner at Jae Lee Law in New Jersey, told Insurify.
“Today’s automated systems available to the general public don’t fully automate the process, so the driver is still required to pay attention and be ready to take control,” she says. “When they don’t, they’re at fault.”
If a manufacturer’s defect or a software issue causes an accident, the manufacturer is likely responsible, Lee says. “Our role in the process is using investigation and evidence to establish causation, which can determine if one or all of them are to blame.”
What this means for car insurance claims
While insurers are still the first to pay in most claims, experts say driver-assist technology can affect what happens after the initial claim:
Disputes over fault are becoming more technical, with greater demand for vehicle data, system status, camera logs, event data recorders, and phone records. The key questions are: Was the system on? Did it give a warning? Did the driver react?
If your insurer pays your claim but believes an automation system defect played a role, it may seek to recover costs from a manufacturer or supplier. This process can take months or years and may not change your immediate payout, but it can affect long-term costs.
Premiums and underwriting questions are under more pressure. As lawsuits and claims costs go up, insurers may ask for more details, such as what system you used, if your software was current, and if you followed the system’s limits.
“This is becoming a product liability issue that could involve layered responsibility,” Ryan Perdue, partner and trial attorney at Simon Perdue Law in California, told Insurify. “The data points make this case more complex because autonomous and semi-autonomous vehicles generate enormous amounts of information, from sensor logs to driver monitoring data, all of which becomes central evidence.”
“Litigation often becomes a battle between engineers and software experts rather than eyewitness testimony,” he says. “The questions become ‘What did the car know, and how did it decide?’ That shift has major implications for how juries understand fault and causation.”
Perdue says driverless systems could gradually push the insurance industry to a product-centered liability model. If the vehicle is operating autonomously, insurers may pursue manufacturers under product defect theories rather than focusing on the policyholder’s conduct.
“We might also start seeing hybrid frameworks where vehicles require human supervision but still require automated decision-making,” he says. “Until full autonomy becomes the norm, the outcome will likely be shared liability arguments with each side pointing to the other’s role in the chain of events.”
What’s next? Practical takeaways for drivers
Experts say drivers operating vehicles that use driver assistance should:
Think of it as an advanced cruise control, not a self-driving system. Regulators and courts still consider humans responsible for Level 2 systems.
After a crash, document whether the system was on, what warnings or prompts appeared, and the road conditions.
Don’t assume saying “the car was driving” will remove responsibility. Cases thus far have trended toward shared fault. The Tesla verdict shows that courts can hold manufacturers accountable, but drivers are still usually responsible as well.
The bigger picture is that the U.S. is slowly creating new liability rules as cases happen, through jury verdicts, NHTSA investigations, and more crash reports. Until the law becomes clearer, this debate will keep showing up in insurance claims and, more often, in courtrooms.