**Summary of Fox News Article: Federal Investigation into Tesla’s Full Self-Driving System**
The U.S. National Highway Traffic Safety Administration (NHTSA) has launched a major new investigation into Tesla’s “Full Self-Driving” (FSD) system, raising serious concerns about the safety and reliability of the technology. The probe, announced in October 2025, covers approximately 2.88 million Tesla vehicles equipped with FSD software. It comes amid mounting reports that the system may not only be violating traffic laws but also contributing to crashes that have injured people on American roads.
### Mounting Reports of Dangerous Behavior
The investigation was prompted by a growing number of complaints and incident reports from Tesla drivers and other motorists. According to Reuters, the NHTSA has received at least 58 reports describing Teslas engaged in hazardous behaviors while FSD was enabled. These include running red lights, drifting into the wrong lanes, and failing to properly navigate intersections—actions that have led to real-world consequences.
Among these reported incidents, 14 resulted in crashes, and at least 23 people sustained injuries. In one particularly troubling pattern, six separate cases involved Teslas running red lights and then colliding with other vehicles. One Houston driver described an incident in which their Tesla misread traffic signals: the car reportedly stopped at green lights but drove straight through red ones, the reverse of what traffic law requires. Tesla staff reportedly witnessed the issue during a test drive, but the company allegedly declined to address the fault at that time.
NHTSA investigators are also examining new complaints regarding FSD’s handling of railroad crossings. In at least one case, a Tesla reportedly came dangerously close to a collision with an oncoming train after failing to recognize and respond appropriately to the crossing.
### Tesla’s Troubled History with Regulators
This latest investigation is by no means Tesla’s first encounter with federal safety authorities. The company has faced ongoing scrutiny over both its Autopilot and FSD systems. Notably, a recent Florida court case resulted in a $329 million damages award against Tesla after an Autopilot-related crash killed a woman. Separate investigations are also underway into Tesla’s limited Robotaxi service in Austin, Texas, where passengers reported erratic driving and speeding, problems that occurred even when human safety drivers were present in the vehicles.
Tesla is also facing legal challenges over its marketing of the FSD system. The California Department of Motor Vehicles has filed a lawsuit claiming that the name “Full Self-Driving” is misleading, as the system still requires drivers to pay close attention and intervene frequently. In response, Tesla recently rebranded the feature as “Full Self-Driving (Supervised),” acknowledging that the technology is not truly autonomous and must be closely monitored by the person behind the wheel.
### Recent Software Updates Under Scrutiny
Adding to the urgency of the federal probe, Tesla released an update to its FSD software just days before the NHTSA investigation was made public. Despite the update, regulators say the system continues to produce driving behavior that violates traffic safety laws. Early findings suggest that the software’s decision-making may be fundamentally flawed in certain scenarios, such as responding to traffic signals and navigating complex intersections.
Should the NHTSA determine that the FSD system poses a significant risk to drivers, passengers, and the public, the agency has the authority to require Tesla to recall or modify the affected software. This would represent a significant setback for Tesla, which has positioned FSD as a cornerstone of its vision for the future of transportation.
### Safety for Tesla Owners and the Public
For the nearly three million Tesla owners with FSD enabled, the message from safety regulators is clear: remain vigilant. Despite the system’s name and extensive marketing, FSD does not make the vehicle fully autonomous. Tesla drivers are required to keep their hands on the wheel and their eyes on the road at all times, ready to take control if the software makes a mistake.
The broader public should also take note. The NHTSA investigation serves as a sobering reminder that so-called “self-driving” technology is, for now, a work in progress. While the push toward automation in the automotive industry continues at a rapid pace, the reality is that human supervision is still essential to ensure safe operation.
### The Larger Context: Automation and Accountability
Tesla’s pursuit of a fully autonomous future is emblematic of a wider industry trend. Major automakers, including General Motors, are developing their own automated driving systems, and they face the same fundamental questions about safety, transparency, and accountability that Tesla now confronts.
