For years, autonomous vehicles have operated in relative obscurity. With few vehicles on the road and a laissez-faire attitude among government regulators, automakers and big tech firms have been free to test — and even commercially deploy — with little oversight.
Well, those days are done. In rapid succession, the National Highway Traffic Safety Administration (NHTSA) has opened investigations into almost all the major companies testing autonomous vehicles as well as those that offer advanced driver-assist systems in their production cars. Tesla, Ford, Waymo, Cruise, and Zoox are all being probed for alleged safety lapses, with the agency examining hundreds of crashes, some of which have been fatal.
The investigations signal a new — and perhaps more antagonistic — phase in the relationship between safety regulators and the private sector. The government is requiring more data from companies, especially around crashes, in order to determine whether the industry’s safety claims live up to the hype. And the companies are finding that the proliferation of camera-equipped smartphones is working against them, as more videos of their vehicles behaving unpredictably go viral.
In 2021, NHTSA issued a standing general order requiring car companies to report crashes involving autonomous vehicles (AVs) as well as the Level 2 driver-assist systems found in hundreds of thousands of vehicles on the road today. Companies must now document any collision in which advanced driver-assist systems (ADAS) or automated driving technologies were in use within 30 seconds of impact.
NHTSA is seeing these crash reports in real time, enabling the agency’s Office of Defects Investigation to make connections between incidents and determine whether more scrutiny is warranted. Meanwhile, videos of driverless vehicles operating erratically are giving investigators data they wouldn’t otherwise have: incidents that don’t involve collisions fall outside the standing general order’s reporting requirements.
NHTSA specifically cited “other incidents… identified based on publicly available reports” in its investigation into Waymo’s driverless car system. The agency is looking into 22 incidents, some of which involve Waymo vehicles crashing into gates, chains, and parked cars. But NHTSA also cited viral videos of the company’s robotaxis operating on the wrong side of the road.
“I think NHTSA is responding to the Standing General Order data, as well as a never-ending stream of videos from the public, like the Waymo AVs choosing to go down the wrong way on a street,” said Mary “Missy” Cummings, a robotics expert and former senior safety advisor at NHTSA. “Indeed, the public is now filling a vital role in providing NHTSA information about near-misses, whereas the [standing general order] provides details about crashes.”
When NHTSA first announced the requirement that AV operators and automakers report crashes involving their vehicles, experts predicted the agency would be hamstrung because of the lack of standardization and context.
That’s because the data was missing key details, like the number of vehicle miles driven or the prevalence of advanced driver-assist technology in each manufacturer’s fleet. NHTSA receives the data from a variety of sources, including customer complaints and different telematics systems. And companies are allowed to withhold certain details they consider to be “confidential business information.”
But with more robotaxis on the road, as well as the proliferation of Level 2 driver-assist features, investigators have shown a knack for overcoming the limitations of the crash data by requesting further information from the companies. And that will inevitably lead to more tension.
“I definitely think the wider rollouts — and the corresponding videos of bad behavior — have made a difference,” said Sam Anthony, former chief technology officer at Perceptive Automata and author of a newsletter about automated driving.
“For years, the companies making these vehicles have been able to trade on people’s assuming that they drive pretty much like regular cars,” he added. “There isn’t a broad understanding of how really different their perception is, and how that can make them fail in really unpredictable ways. As soon as people start to experience them on the road, they start to viscerally understand that.”
In particular, NHTSA’s willingness to reopen investigations into Tesla’s Autopilot shows that the agency has discovered a new fervor for oversight, Anthony said. Tesla issued a voluntary recall of Autopilot in the form of an over-the-air software update last year, in response to NHTSA’s investigation into dozens of crashes in which drivers were found to be misusing the system.
But Tesla’s recall wasn’t up to snuff, with NHTSA citing at least 20 crashes involving Tesla vehicles that had received the updated software. The agency is now reexamining the company and could conclude that Tesla’s driver-assist technology can’t be operated safely. That could be detrimental to Tesla’s stock price, as well as its future, with Elon Musk betting the company’s longevity on robotaxis.
“I’m optimistic that NHTSA’s newfound regulatory fervor is a sign of changes at the agency,” Anthony said. “It has been leaderless and, from my outside perspective at least, really demoralized for a long time. I’m hopeful that what’s happening now is an indicator that the people there are finally being given the tools and motivation to do their jobs.”