The U.S. government has once again set its sights on Tesla and its semi-autonomous driving software, as the National Highway Traffic Safety Administration (NHTSA) announced a new investigation on Friday. The probe covers a whopping 2.4 million Tesla vehicles equipped with the brand's partially automated driving system, known as Full Self-Driving (FSD). The investigation was prompted by reports of four collisions in which the system was engaged, including a fatal crash in 2023.
The NHTSA launched the investigation following a series of crashes that occurred while FSD was engaged and roadway visibility was reduced. These incidents involved everyday conditions such as sun glare, fog, and airborne dust, highlighting the challenges drivers may face when relying on driver-assistance technology in poor visibility.
In the most serious of these incidents, a 2021 Tesla Model Y operating with FSD struck and killed a pedestrian in Rimrock, Arizona. Another crash resulted in a reported injury, underscoring the potential dangers of relying too heavily on driver-assistance systems.
The 2.4 million vehicles under investigation cover most of the newer Tesla models that can be equipped with the optional FSD software, which has seen several price cuts over the years, falling from $15,000 in 2023 to $8,000 currently. The models being investigated include the Tesla Model S, Model X, Model 3, Model Y, and Cybertruck.
The NHTSA’s investigations often lead to recalls if the automaker is found to be out of compliance or if the vehicles pose a significant risk to safety. In the past, similar probes have resulted in recalls, such as the one in February 2023 when Tesla recalled over 360,000 vehicles with a previous version of the FSD software known as Full Self-Driving (Beta).
In December 2023, another NHTSA investigation led to a massive recall of over 2 million Tesla vehicles equipped with the Autopilot semi-autonomous driving system. Autopilot, like GM's Super Cruise and Ford's BlueCruise, falls under Level 2 of the SAE's driving-automation scale, meaning the driver must remain engaged at all times.
The FSD system, despite its name, is also a Level 2 system that requires constant driver supervision. It has faced criticism for relying solely on cameras instead of additional sensing technologies such as lidar, a cost-cutting measure that has raised concerns about the system's safety and effectiveness, particularly in low-visibility conditions.
While Tesla CEO Elon Musk recently unveiled plans for a self-driving robotaxi called the Cybercab, the outcome of the current FSD investigation could delay its launch. The Cybercab, which Musk says is slated to enter production before 2027, represents his vision for a true self-driving car that can operate without human intervention.
Overall, the ongoing investigation into Tesla's FSD system highlights the challenges and risks that come with automated driving technology. As regulators continue to scrutinize these systems, it falls to automakers to prioritize safety and ensure their vehicles perform reliably in the real world.