
Tesla’s FSD Software Under Investigation After Railroad Incident

Federal regulators have begun a sweeping review of Tesla’s Full Self-Driving software. The National Highway Traffic Safety Administration (NHTSA) announced a formal investigation covering about 2.9 million Teslas, following reports that the technology may mishandle critical traffic scenarios such as intersections and railroad crossings.

In its announcement, the agency cited reports of Teslas allegedly running red lights, veering into oncoming lanes, and failing to stop when trains approached. The goal is to evaluate whether the software responds safely and alerts drivers properly when manual intervention is needed.

Safety Concerns Spark Attention

Image: Instagram | @cambobiz | Tesla faces national scrutiny after multiple self-driving incidents at railroad crossings.

Concerns spiked after an NBC News investigation surfaced videos showing Tesla vehicles continuing through active railroad crossings. In several clips, flashing lights and closing gates appeared to trigger no reaction from the cars — raising urgent questions about FSD’s reliability. Following the report, two U.S. senators called for immediate regulatory review.

NHTSA confirmed that it identified at least 18 complaints and one media report describing Tesla vehicles that did not stop for red lights or accurately display traffic signal information.

In six cases, the cars were involved in collisions at intersections, and four of those crashes resulted in injuries. The agency noted that several incidents occurred at the same Maryland intersection, where Tesla reportedly made adjustments to address the problem.

Tesla’s Response and Ongoing Updates

Neither Tesla nor its CEO, Elon Musk, has publicly commented on the new investigation. At the same time, Tesla's official owner's manual continues to make clear that Full Self-Driving does not make its vehicles fully autonomous.

The company stresses that drivers must remain alert and ready to take control at any time. In fact, Tesla even modified the product’s name in some updates to include the term “Supervised,” reinforcing that responsibility lies with the driver.

Image: Instagram | @craigzsf | Regulators examine Tesla FSD software for safety issues at intersections and crossings.

Despite repeated cautions, Elon Musk continues to promote Tesla’s Full Self-Driving (FSD) as a leap toward true autonomy. In August, he went so far as to claim on social media that Tesla cars “can drive themselves,” a statement experts say remains far from verified. Musk has also emphasized FSD’s role in Tesla’s grand vision — powering a network of fully driverless robotaxis in the future.

Expanding the Scope of the Inquiry

NHTSA is widening its investigation to determine how well Tesla's system interprets critical driving cues such as traffic lights, railroad signals, and lane markings. Regulators say the most concerning behavior occurs at intersections, though the review will cover other driving environments as well. The goal is to find out whether Tesla's software creates safety risks or fails to prompt drivers when human intervention is necessary.

The investigation remains in its preliminary phase. Depending on what regulators find, outcomes could range from no further action to a mandated software update — or, in a more serious case, a recall. Tesla recently introduced version 14 of FSD, but whether it addresses the behaviors now under scrutiny remains uncertain.

Potential Impact on Tesla

The review could have serious implications for Tesla’s reputation and its push toward autonomous driving. Regulators are under mounting pressure to hold automakers accountable, and confirmed deficiencies could lead to tougher oversight and costly revisions.

For now, Tesla owners are urged to keep their hands on the wheel and treat FSD as an assistive tool, not full automation — a reminder that innovation still comes with responsibility.

