Who Is Liable in a Waymo Crash?
Self-driving cars are here, but they’re not perfect. Learn how Waymo operates, why crashes happen, and who’s responsible when technology fails.
Teaching Tip 1 (Related to Article 3 — “Child Hit by Waymo Self-Driving Car”): “Who Is Liable in a Waymo Crash?”
For an article addressing the parties potentially liable in a Waymo automobile accident under California law, see Legal Challenges in Autonomous Vehicle Accidents: Who Is Liable in a Waymo Crash?, published by El Dabe Ritter, a personal injury firm representing seriously injured victims of negligence.
According to the article, if you have driven through Los Angeles lately, you have probably seen a Waymo: one of those white cars covered in sensors and cameras. These autonomous vehicles scan the road in every direction and make driving decisions without a human touching the wheel.
But this technology is not perfect. When a Waymo crashes, injured victims often face confusing questions about responsibility. Without a traditional driver to blame, determining who is liable becomes more complicated.
Understanding your rights is the first step toward protecting yourself.
How Waymo Works & Where It Operates
Before talking about fault, it helps to see how Waymo works. Instead of a driver, the vehicle relies on cameras, radar, LiDAR, and software to read traffic and avoid hazards.
While powerful, these systems can struggle with real-world challenges like construction, sudden pedestrian movement, or heavy rain. Waymo currently operates in several busy California cities, each with its own risks:
- Los Angeles — Waymo serves parts of Downtown L.A. and nearby neighborhoods, placing self-driving cars in some of the state’s most congested streets.
- San Francisco — Home to one of the largest Waymo service areas. Steep hills, tight intersections, and heavy foot traffic can lead to sudden stops or system errors.
- San Diego — Waymo continues to expand in San Diego, especially near downtown and the waterfront. Tourists, e-bikes, and high foot traffic can make these areas unpredictable.
Waymo Crash Trends
As Waymo expands, so does the number of reported accidents. Between 2023 and 2025, the company reported 154 crashes, a sign that even advanced systems can fail.
According to the manufacturer-reported crash data:
- Daylight Crashes — 61.7 percent of crashes happened in daylight, when visibility was clear.
- Clear-Weather Crashes — 72.7 percent occurred during clear weather, not rain, fog, or other poor conditions.
These numbers show that many accidents occur under normal driving conditions. A Waymo might brake too hard for a small object, misread a construction area, or respond unpredictably to a cyclist, even when roads seem safe.
Why Waymo Vehicles Can Still Cause Accidents
Self-driving cars rely on sensors, cameras, and code to understand the world around them. But real roads are unpredictable. When something falls outside the system’s expectations, the car may react too slowly, too quickly, or in the wrong way.
Common Situations That Lead to Waymo Crashes
Waymo crashes usually happen when the system encounters something outside its expectations. Some of the most common problems include:
- Misread Signals — Trouble interpreting traffic lights, bus signals, or another driver's turn indicator.
- Abrupt Braking — Sudden stops that increase the risk of rear-end collisions.
- Unpredictable Human Actions — Other drivers speeding, turning at the last second, or running red lights.
- Difficulty Merging — Hesitating or making unsafe moves when joining fast-moving traffic.
- Unusual Road Scenarios — Freezing or reacting unpredictably near school buses, emergency vehicles, construction zones, or odd intersections.
Even at low speeds, these issues can lead to sharp turns, hard braking, and injuries to passengers and people nearby.
Who Is Liable in a Waymo Crash?
Understanding who may be liable in a Waymo crash is more complex than in a typical car accident, because responsibility can involve both human actions and failures in the autonomous driving system.
Your self-driving car accident lawyer will look at what failed, how the AV responded, and whether another driver contributed to the crash. Below are the parties that could be responsible for your injuries:
Waymo and Its Technology Partners
Waymo, or the companies that design, build, or maintain its self-driving system, may be responsible when a crash occurs because the technology fails.
California law places clear safety obligations on autonomous vehicle manufacturers, requiring their systems to meet state and federal standards and operate safely on public roads.
Technology-related failures can include:
- Faulty sensors or cameras (LiDAR, radar, or optical sensors failing to detect hazards)
- Software errors or defective code that misjudges speed, distance, or timing
- Incorrect system decisions, such as braking too late or failing to yield
- Mechanical failures, including braking issues or steering malfunctions
- Maintenance mistakes, like skipped updates or improper calibration
- Defective components supplied by third-party manufacturers
California law requires autonomous vehicles to record crash data, which often helps determine whether Waymo or its technology partners are responsible. Companies that design or maintain unsafe autonomous systems can be held liable for resulting injuries.
Other Drivers on the Road
Even with a self-driving car involved, another driver may be the one who caused the crash. Their unsafe choices can put everyone, including the Waymo passengers, at risk.
A human driver may be at fault if they:
- Speed or drive distracted
- Run a red light
- Make an illegal turn
- Rear-end the Waymo vehicle
In these cases, the claim is handled much like a normal car accident, but Waymo’s onboard data can help show what happened and whether the autonomous car reacted correctly.
Shared Fault in Unexpected Road Situations
Some crashes involve sudden actions that no vehicle, human-driven or autonomous, can fully avoid. In these situations, fault may be shared between the Waymo system and the person who acted unpredictably.
In these situations, the key question becomes whether the AV had enough time and information to respond safely.
Examples of shared-fault scenarios include:
- Entering traffic suddenly
- Crossing outside of a marked crosswalk
- Swerving into the Waymo’s path
Waymo’s internal logs show how quickly the system detected the person and whether it reacted properly, helping determine liability.
Passenger Interference
Although passengers usually have no control over a fully autonomous Waymo, there are rare situations where a rider can contribute to a crash.
This can happen if the passenger:
- Hits the emergency stop button without reason
- Blocks cameras or sensors
- Ignores on-screen instructions
- Interferes with the system during manual-assist mode
During testing, a safety operator may also be responsible if they fail to correct the vehicle when required.