A recent crash involving a child pedestrian and a Waymo autonomous vehicle (AV) in California is drawing renewed national attention to the safety of fully driverless technology and the growing role of federal oversight. The incident comes as Waymo continues to expand operations in multiple states, including plans to bring driverless services to Dallas, Houston, and San Antonio.
For Texas residents following the rapid growth of autonomous vehicles, the California crash serves as another reminder that questions about safety, accountability, and liability remain unresolved, even as driverless cars become more common on public roads.
According to reporting by CNBC, a Waymo vehicle struck a young pedestrian on January 23 in Santa Monica, California. Waymo later confirmed that the pedestrian was a child who had entered the roadway from behind a tall SUV, directly into the vehicle’s path.
Waymo said its autonomous driving system detected the child as soon as she began to emerge and braked hard, reducing the vehicle’s speed from approximately 17 miles per hour to under 6 miles per hour before contact occurred. Following the crash, the child stood up and walked to the sidewalk; Waymo called 911, and the vehicle remained at the scene until law enforcement cleared it.
Waymo described the event as an example of how its safety systems are designed to reduce the severity of collisions. Still, the fact that a fully driverless vehicle struck a child near a school has intensified concerns about whether autonomous systems can reliably handle unpredictable pedestrian behavior, especially involving children.
NHTSA Opens New Investigation Into Waymo
Waymo confirmed that it voluntarily notified the National Highway Traffic Safety Administration (NHTSA) about the Santa Monica crash the same day it occurred. NHTSA has since opened an investigation to determine how Waymo’s automated driving system detected and responded to the pedestrian, whether the vehicle’s braking performance met safety expectations, and whether similar incidents involving Waymo vehicles have occurred.
This investigation is separate from but related to other federal reviews involving Waymo, including earlier scrutiny over how its vehicles respond to stopped school buses and other pedestrian environments.
NHTSA has emphasized that federal investigations are a critical tool for determining whether emerging vehicle technologies pose an unreasonable risk to public safety.
In a public statement, Waymo said its internal modeling shows that a fully attentive human driver in the same situation would likely have struck the child at approximately 14 miles per hour, more than double the speed at which the Waymo vehicle made contact. Despite those assurances, safety advocates note that avoiding contact altogether remains the ultimate goal, particularly in school zones and residential areas where children may behave unpredictably.
California Crash Follows School Bus Incidents in Texas
The California pedestrian crash follows troubling incidents closer to home.
As we previously reported, surveillance cameras on Austin Independent School District school buses captured multiple Waymo vehicles failing to stop for school buses with stop arms extended and red lights flashing, a violation of Texas law intended to protect children.
District records showed Waymo vehicles were cited dozens of times for illegally passing stopped school buses, and some videos captured children near or in the roadway at the time of the violations.
Together, the Texas school bus incidents and the California pedestrian crash illustrate a consistent concern that autonomous vehicles may struggle in environments involving sudden pedestrian movement, school zones, and unpredictable human behavior.
Why Waymo Investigations Matter for Texas Drivers
Waymo has announced plans to expand its driverless operations in Texas. As those expansions approach, federal investigations into Waymo’s safety performance take on added importance for residents.
NHTSA investigations can lead to requests for additional data, mandated software changes, safety recalls, operational restrictions, and broader rulemaking affecting autonomous vehicle design. For Texans, these federal actions may shape how and under what conditions driverless vehicles are allowed to operate on local roads.
Texas has historically maintained one of the most permissive legal environments for autonomous vehicles. From 2017 to 2025, companies could deploy driverless vehicles on public roads with relatively minimal state oversight.
That changed on September 1, 2025, when Senate Bill 2807 took effect, creating Texas’s first comprehensive authorization and oversight framework for autonomous vehicles. The law establishes statewide processes for AV authorization, safety planning, and regulatory enforcement.
Even with updated laws in place, AV incidents continue to raise difficult questions for car accident cases, such as who is legally responsible when a driverless vehicle hits a pedestrian, how software errors are evaluated in crash investigations, and what standards determine whether an AV system is reasonably safe.
Hurt in an Accident With a Waymo or Other AV?
The California child pedestrian crash and the Texas school bus incidents highlight the challenges of integrating autonomous vehicles into everyday traffic environments. When a crash involves a driverless vehicle, the legal issues can be far more complicated than a typical Texas car accident. Liability may involve vehicle manufacturers, software developers, fleet operators, or other entities.
The Cochran Firm Texas has decades of experience representing crash victims and helping families navigate complex injury claims, including those involving emerging vehicle technologies.
If you have questions about an accident involving a Waymo or another autonomous vehicle, contact The Cochran Firm Texas for a free consultation. You can also call us toll-free at (800) 843-3476 or use Live Chat to learn more about your rights and legal options.