
A Guide to Liability in Self-Driving Car Crashes

The vehicles we share the road with are getting smarter by the day. Features that used to sound like science fiction are now standard on highway commutes. But when a crash happens involving a car that was supposedly “self-driving,” personal injury claims become incredibly complex.

Who pays for your medical bills, property damage, and lost wages? Is it the person sitting in the driver’s seat, or the multinational corporation that wrote the software?

The legal answer almost entirely depends on the vehicle’s specific “Level” of automation. Let’s break down the critical differences between supervised systems and fully autonomous technology, and explore exactly where the law places the blame.

Level 2 and Level 2+: The “Supervised” Era

When you see a self-driving feature on the road today, it is overwhelmingly likely to be a Level 2 or Level 2+ system.

The Technology: This category includes well-known systems like Tesla’s “Autopilot” and “Full Self-Driving” (FSD), GM’s “Super Cruise,” and Ford’s “BlueCruise.” These systems can simultaneously control steering, acceleration, and braking. Despite names like “Full Self-Driving,” these systems require active supervision. The human behind the wheel must keep their eyes on the road and their hands on (or hovering very near) the steering wheel, ready to take over at a moment’s notice.

Who is Liable?

In a Level 2 or 2+ vehicle, liability remains firmly on the human driver. If a Tesla on Autopilot rear-ends you, the law views the Tesla driver as negligent for failing to brake or steer away. The manufacturer is generally shielded from standard negligence claims because the driver is legally considered the ultimate operator.

Exception: Liability can occasionally extend to the manufacturer through a product liability claim, but only if the injured party can prove the technology failed or caused the accident due to a malfunction or failure to allow driver takeover.

Level 3: The Liability Game-Changer

Level 3 autonomy, known as “conditional automation,” represents a massive shift in both technology and personal injury law.

The Technology: Mercedes-Benz recently introduced “Drive Pilot” to the U.S. market, becoming the first automaker to achieve certified Level 3 status. Under specific conditions (such as heavy traffic on approved highways at speeds under 40 mph) the car handles all aspects of driving. The person in the driver’s seat does not need to actively supervise the vehicle. You can legally take your eyes off the road to watch a movie or read a book. You simply must be available to take over if the system gives you a warning prompt.

Who is Liable?

Because the driver is legally permitted to disengage from active driving, liability generally shifts to the manufacturer, though it may still rest with the driver in some circumstances. Mercedes-Benz has publicly stated that it will assume legal responsibility for crashes that occur while Drive Pilot is actively engaged. However, when the system fails to recognize a hazard, or when the driver ignores a prompt to take back control, liability becomes less clear. How fault is shared in these circumstances is a rapidly emerging area of the law, and as these cases reach the courts, precedent will be set on how blame is divided between the human and the machine.

Level 4: Fully Driverless Robotaxis

We are already seeing Level 4 autonomy operating in the real world, with companies like Waymo deploying fully driverless taxis in cities like Phoenix, San Francisco, Los Angeles, and Austin.

The Technology: These vehicles operate without any human driver inside. They are geofenced to specific areas and handle all driving tasks entirely on their own.

Who is Liable?

Because there is no human operator to blame, liability rests entirely on the operating company and the manufacturer. If a Waymo vehicle makes an error that results in a pedestrian injury or a collision with another car, the corporate entity operating the autonomous fleet is the primary target for personal injury claims.

At a Glance: The Shifting Burden of Liability

| Autonomy Level | Common Examples | Driver’s Legal Responsibility | Primary At-Fault Party in a Crash |
| --- | --- | --- | --- |
| Level 2 / 2+ | Tesla, GM, Ford | Active, constant supervision | The human driver |
| Level 3 | Mercedes-Benz | Must take over only when prompted | The manufacturer or the driver (emerging legal area) |
| Level 4 | Waymo | Passenger only (no driver needed) | The operating company |

Frequently Asked Questions

If My Car Is in “Full Self-Driving” Mode and I Cause a Crash, Am I Still Responsible?

Yes. If your vehicle is classified as Level 2 or Level 2+ (like Tesla’s FSD or GM’s Super Cruise), you are legally the operator. Even though the car is steering and braking, the law requires you to maintain constant supervision. Because you are expected to intervene to prevent a collision, you remain the primary at-fault party for medical bills and property damage.

How Does Level 3 Automation Change Who Is at Fault?

Level 3 is a legal “game-changer” because it allows the driver to legally disengage from driving under specific conditions. For example, in a Mercedes-Benz equipped with Drive Pilot, the manufacturer has stated they will assume liability for crashes that occur while the system is active. However, if the car prompts you to take control and you fail to do so, the blame may shift back to you or be shared between you and the manufacturer.

If a Driverless Robotaxi Hits Me, Who Do I Hold Accountable?

In a Level 4 accident (such as those involving Waymo), there is no human “driver” to sue. Liability rests entirely with the operating company and the manufacturer. Because these vehicles are designed to handle all driving tasks within a specific area, any failure is viewed as a product or operational failure rather than human negligence.

What Kind of Evidence Is Needed to Prove Fault in an AV Accident?

Unlike standard car accidents that rely on eyewitnesses and skid marks, AV cases rely heavily on digital evidence. To determine liability, your legal team will need to secure:

Data Logs: To see if the system was engaged.

Sensor Data: To see what the car “saw” (or missed).

Software Versions: To check for known bugs or missed updates.

Internal Cameras: To determine if the driver was properly supervising the vehicle (in Level 2 systems).

Can I Still File a Claim Against a Car Manufacturer for a Level 2 Crash?

While the driver is usually the primary target in Level 2 crashes, you can occasionally file a product liability claim against the manufacturer. To be successful, you must prove that the technology didn’t just “fail to prevent” the crash, but actively caused it through a mechanical malfunction or a software defect that prevented the human driver from taking over.

What to Do If You Are Involved in an AV Crash

If you are injured in an accident involving a vehicle with self-driving features, the immediate steps remain the same: seek medical attention and call the police.

However, your legal strategy will look very different. Proving fault in an autonomous vehicle crash requires securing data logs, software versions, and sensor information from the vehicle to determine exactly what the car and the human were doing in the seconds before impact. Because you may end up facing the legal teams of massive tech or automotive companies, consulting with a personal injury attorney who understands the nuances of autonomous driving laws is essential to protecting your rights.

