Tech

US Gov Says Tesla Self-Driving May Cause Crashes, Issues Recall

The safety regulators issued a recall affecting 362,000 cars, but it also highlights the shortcomings of the current recall system.
Gary Coronado / Contributor via Getty
Moveable explores the future of transportation, infrastructure, energy, and cities.

A federal safety regulator has determined that Tesla’s Full Self-Driving Beta software “increases the risk of a crash” and on Thursday issued a recall notice for the approximately 362,000 vehicles equipped with the software. The recall is a win both for safety advocates, who have long warned that the beta software is unsafe, and for Tesla supporters, who will point to the relatively minor nature of the safety concerns. It also highlights the shortcomings of auto regulations in a software-driven world.


It has hardly been a secret that FSD Beta is not safe on city roads. That has been readily apparent since YouTubers first posted videos of themselves using the software in early 2021, even though the people posting the videos are usually among Elon Musk’s biggest fans. The videos have shown cars almost hitting pedestrians, veering toward cyclists, driving into pillars, and other patently dangerous behavior. In the videos, the drivers intervene to prevent a crash, because they are essentially Tesla volunteers who paid thousands of dollars for the privilege of serving as the company’s safety drivers, testing beta self-driving software on public roads.

The National Highway Traffic Safety Administration, an agency within the Department of Transportation, has been investigating Autopilot and FSD Beta for years. According to the recall report, NHTSA identified “potential concerns related to certain operational characteristics” of FSD Beta on January 23, 2023. Specifically, the software “in certain rare circumstances” can drive through a “stale yellow traffic light” (it is technically illegal to enter an intersection while a light is yellow), fail to slow down quickly enough when entering an area with a new speed limit, and behave erratically around lane changes in certain situations. According to the recall report, Tesla, “while not concurring with the agency’s analysis,” will administer a voluntary recall “out of an abundance of caution.”

While “recall” sounds severe, it just means Tesla will issue a new version of the FSD Beta software that addresses these concerns. It is not a revocation of FSD Beta, a seizure of all Teslas, or any other drastic policy.

As with virtually every other issue touching Tesla and Musk, this recall will confirm whatever preconceived notions people already have about FSD Beta. Its supporters largely mocked or derided the news as further evidence that regulators are behind the times, wasting Tesla’s time with nitpicky nonsense.

For Tesla’s critics, it is further confirmation that self-driving software should not be rolled out to hundreds of thousands of vehicles without pre-approval from regulators, as is required in Europe. In the Venn diagram of opposing views of FSD Beta, the overlapping middle is that the recall system is too slow and reactive for software systems updated over the air.