The National Transportation Safety Board (NTSB) said on Tuesday it is investigating a newly reported January incident in which Waymo self-driving vehicles passed a stopped school bus with its lights activated, an action that would violate Texas law.
The Alphabet unit previously recalled its self-driving fleet in December after Texas authorities reported that Waymo vehicles had illegally passed school buses at least 19 times since the start of the school year. The NTSB statement identified an incident that occurred in Austin, Texas, on Jan. 12 while a school bus was loading passengers; that incident is under investigation.
The NTSB also said it is aware of a Jan. 14 encounter involving a Waymo vehicle and a 2023 International school bus operating on a special-needs route. According to the NTSB, the Waymo vehicle initially stopped for the school bus, but other vehicles then passed it. The Waymo system queried a human remote-assistance operator about whether it had encountered "a school bus with active signals"; the operator said no, and the Waymo vehicle proceeded to pass the stopped bus. The NTSB said it plans to issue safety recommendations intended to prevent similar events.
Waymo said it appreciates the NTSB's work. The company has previously attributed part of the December incidents to a software issue that caused its self-driving vehicles to first slow or stop for a school bus and then proceed past it.
Federal scrutiny of the matter is not new. The National Highway Traffic Safety Administration (NHTSA) opened a probe in October into Waymo vehicles operating near school buses. The Austin Independent School District reported that five incidents occurred in November after Waymo issued an earlier software update intended to address the problem. The school district had requested last year that Waymo suspend operations around schools during pick-up and drop-off times until the company could ensure its vehicles would not violate state law; Waymo did not comply with that request, the school system said.
Separately, both NHTSA and the NTSB are investigating a Jan. 23 collision in Santa Monica, California, in which a Waymo self-driving vehicle struck a nine-year-old girl who ran across the street toward the school from behind a double-parked SUV. Waymo reported that the vehicle immediately detected the child and braked hard, slowing from approximately 17 mph to under 6 mph before contact.
Context and current status
The NTSB and NHTSA investigations are ongoing, and the NTSB has signaled that it intends to issue safety recommendations once its review of the incidents is complete. Waymo has acknowledged that a software issue contributed to the December incidents and has expressed appreciation for the NTSB's investigative work. According to the Austin Independent School District, local school authorities have reported multiple incidents and requested operational limits that Waymo did not adopt.