Self-Driving Cars Keep Getting Into Hit-and-Runs — As Victims

A report found that human drivers in San Francisco keep crashing into autonomous cars and fleeing the scene when they see nobody at the wheel.

Photo: A Cruise self-driving car in San Francisco. There were 36 hit-and-run incidents involving self-driving cars in 2022. (Bloomberg/Contributor, Getty Images)

Proponents of autonomous vehicles keep telling us they’re the future of transport — safer than human drivers, more reliable, and a solution to all the time we waste stuck in traffic. But before they can become a silver bullet for all our transport woes, they’re going to have to get used to that other menace on the road: human drivers.


It’s a harsh lesson some autonomous taxi services are learning as they put their self-driving cars to work in states like California: those taxis are increasingly ending up in hit-and-run accidents as they traverse cities like San Francisco. The twist is that the autonomous cars are usually the victims of the crash, not the perpetrators.


According to a new report from NBC News, cars from the likes of Waymo and Cruise, as well as other tech firms like Apple and Amazon-backed Zoox, are increasingly ending up in hit-and-run crashes caused by human drivers. One example from the report saw a Chevrolet Bolt from GM-backed Cruise tangle with an Infiniti Q50 performing donuts on the streets of San Francisco. The driver of the Infiniti fled the scene, leaving the Cruise car battered and bruised.

Photo: A Waymo self-driving car in San Francisco. Show me the Waymo to go home. (Justin Sullivan, Getty Images)

This is an increasingly common occurrence in San Francisco, NBC reports, as there were 36 hit-and-run crashes between self-driving cars and human drivers in 2022. So far this year, there have been seven such crashes where the human driver fled the scene. NBC reports:

The hit-and-runs pose a problem for driverless technology and its future: Even when self-driving cars are programmed to do everything right, it can be hard to avoid the mistakes of human drivers.


So far, at least three crashes between self-driving cars and human drivers have resulted in injuries to the autonomous cars’ occupants. NBC reports that in all three of the reported cases, “the drivers of the other cars left the scene without exchanging information.”

In one of the reported cases, a Cruise vehicle in autonomous mode was rear-ended twice by a Honda driver in Golden Gate Park. The car was stopped at a red light with two Cruise employees inside “when the Honda driver bumped it from behind,” then, the report continues, “the Honda driver reversed backward several feet, stopped and drove forward again, making contact with the Cruise vehicle a second time.”

Photo: A Zoox self-driving car in San Francisco. Bam-Zoox-Ki. (Bloomberg/Contributor, Getty Images)

In California, as is the case in most civilized places, if you’re involved in a crash, you’re expected to get out and exchange insurance and contact details with the other party. But if the other party is a self-driving car, who do you exchange information with? This is a question that has occurred to Cruise, as a spokesperson for the company told NBC News:

Most people want to do the right thing and exchange contact information but given interacting with an AV is novel for many people we want to ensure they have an easily identifiable contact number displayed on the outside of the AV.


San Francisco police also said in a statement shared with NBC News that the department’s current policy is to “document and investigate all collisions involving autonomous vehicles.” Despite these efforts to report incidents involving self-driving cars, many of the drivers at fault are getting away with little punishment for their actions.

In California, a driver who leaves the scene of a collision can be prosecuted for a misdemeanor, and if someone was injured in the crash, they could face a felony charge punishable by up to five years in prison.


San Francisco police told NBC News they “didn’t know” if any of the reported crashes had so far resulted in criminal charges against the drivers involved.

Photo: A white Waymo self-driving car. Is this the Waymo to Amarillo? (Justin Sullivan, Getty Images)

Self-driving company Waymo, however, said it was keeping its “options open about how to respond” to such incidents. In a statement, the company told NBC News that it “may report the event to law enforcement” if one of its cars was the “victim of a crime.”

San Francisco is a proving ground for autonomous cars at the moment, and that means it should also be the place where legislation and common courtesy around self-driving vehicles are fine-tuned.


If not, then as more and more autonomous cars head out on U.S. roads, lawmakers will be left to work out how to handle these kinds of crashes on a state-by-state basis, and that’s sure to lead to confusion.