Industry · 9 min read · CNBC Technology

Tesla faces intensifying NHTSA probe of 'Full Self-Driving' in reduced visibility

Redakcja Pixelift

The U.S. National Highway Traffic Safety Administration (NHTSA) is conducting an increasingly intensive investigation into Tesla's Full Self-Driving (FSD) system, focusing on its behavior in conditions of limited visibility. The examination covers all of the manufacturer's popular models that can run the FSD driving assistant: Model S, Model X, Model 3, Model Y, and the Cybertruck. Regulators are scrutinizing the system's safety in rain, snow, fog, and poor lighting. This is the crux of the matter: Tesla promotes FSD as an advanced autonomous system, yet in reality it requires constant driver attention, which can create a false sense of security. For Tesla owners, the probe could mean limits on FSD functionality or requirements for additional safety measures. For the company, it threatens not only financial penalties but also damage to the credibility of its flagship product in the eyes of consumers and competitors. The investigation signals that NHTSA takes incidents involving autonomous driving assistance systems seriously, regardless of a manufacturer's reputation.

The U.S. National Highway Traffic Safety Administration (NHTSA) is intensifying its investigation into Tesla's Full Self-Driving (FSD) system, specifically its behavior in conditions of reduced visibility. The case concerns five models: Model S, Model X, Model 3, Model Y, and the newer Cybertruck. This is not a routine regulatory procedure — it is potentially a groundbreaking moment in the discussion about the safety of autonomous driver assistance systems, which could have far-reaching consequences for the entire electric vehicle industry.

The NHTSA investigation represents a significant turning point in the evaluation of technology that Elon Musk positions as approaching full autonomy. While Tesla consistently maintains that FSD is merely an advanced driver assistance system requiring constant operator attention, real-world usage shows that many drivers rely on it far more heavily than intended. The fact that NHTSA chose to deepen its investigation specifically around reduced visibility, the conditions in which vision-based systems traditionally struggle most, suggests that the agency has specific concerns grounded in real incidents.

Why reduced visibility is a key safety issue

Tesla's vision systems, on which FSD is based, are vulnerable to specific atmospheric and lighting conditions. Fog, heavy rain, snow, and even bright sunlight causing glare on the windshield can all drastically reduce the effectiveness of the cameras that serve as the system's primary source of data. Unlike traditional driver assistance systems, which often use radar and ultrasonic sensors for redundancy, Tesla has relied primarily on computer vision.

Historically, reduced visibility has been responsible for a significant portion of traffic accidents. According to NHTSA statistics, conditions such as fog or nighttime driving on unlit roads are associated with higher collision risk. If the FSD system does not perform reliably in such conditions, the potential danger is serious. Particularly problematic are situations where the system behaves unpredictably: sudden braking, unannounced lane changes, or continued driving despite low confidence levels.

Tesla claims that its system can handle various atmospheric conditions thanks to advanced image processing and redundancy across multiple cameras. In practice, however, the system sometimes deactivates automatically in very difficult conditions, leaving the driver without support at the moment it is most needed. This may be precisely the subject of NHTSA's investigation: not so much the ability to drive in bad conditions as the unpredictability of, and lack of transparency about, when the system actually works.
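To make the disengagement behavior described above concrete, here is a deliberately simplified toy sketch of how a driver assistance system might gate itself on perception confidence. This is an illustration only, not Tesla's actual logic; the threshold value and state names are invented for the example.

```python
# Toy model of confidence-gated driver assistance (purely illustrative;
# the threshold and states are invented, not any real vehicle's design).

CONFIDENCE_THRESHOLD = 0.6  # hypothetical minimum to keep assistance active

def assistance_state(perception_confidence: float, currently_active: bool) -> str:
    """Return the next state of the toy assistance system."""
    if perception_confidence >= CONFIDENCE_THRESHOLD:
        return "active" if currently_active else "available"
    # Below threshold: hand control back to the driver rather than
    # continuing to drive on low-confidence perception.
    return "driver_takeover" if currently_active else "unavailable"

# Heavy fog drops perception confidence while the system is engaged:
print(assistance_state(0.35, currently_active=True))   # -> driver_takeover
print(assistance_state(0.85, currently_active=True))   # -> active
```

The safety question the article raises is visible even in this toy: the moment confidence collapses is exactly the moment the system exits, transferring responsibility to a driver who may not be prepared for it.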

Scope of the investigation and models under review

The fact that NHTSA covers all major Tesla models available on the market in its investigation indicates a systematic approach to the problem. Model S and Model X are premium vehicles with the longest FSD history. Model 3 and Model Y represent the majority of Tesla's sales and are the most frequently encountered on roads. Cybertruck, as the newest model, has access to the latest versions of algorithms, but also represents an unknown variable — a vehicle with completely different geometry and camera configuration.

The inclusion of all these models in a single investigation suggests that the problem is not limited to a specific hardware variant or software. This means that NHTSA suspects a fundamental problem in how Tesla designs and implements driver assistance systems in conditions of reduced visibility. If the problem concerns the architecture of FSD itself at the algorithm level, the consequences could be significant.

The history of automotive safety regulations shows that when an agency like NHTSA begins a formal investigation covering an entire product line, it usually ends in one of three scenarios: a vehicle recall order, an obligation to update software, or — in rare cases — financial settlements. Tesla has already dealt with several update mandates, but a recall of the entire line would be unprecedented for the company.

Previous incidents and user complaints

NHTSA investigations rarely begin without cause. They are usually preceded by a wave of complaints, accident reports, or internal analyses pointing to a potential pattern of problems. In the case of FSD and reduced visibility, it is reasonable to assume the agency has received reports of the system behaving unpredictably in fog, during heavy rain, or at night.

The Tesla user community on Reddit and other social media platforms has repeatedly reported problems with FSD in difficult atmospheric conditions. Drivers have described situations where the system appeared to become confused and made erratic decisions, such as sudden lane changes for no apparent reason, braking without cause, or even accelerating when it should have slowed down. While Tesla consistently maintains that such incidents are rare and result from driver inattention, NHTSA has clearly decided to investigate independently.

It is also worth remembering that Tesla has a long history of conflicts with regulators over the marketing of FSD. Regulators have repeatedly objected to promotion that suggests full autonomy, including the way names such as "Autopilot" are used. The current investigation may be NHTSA's attempt to finally get concrete answers about the system's actual capabilities and limitations.

Technical perspective: computer vision and sensor redundancy

Tesla's approach to autonomy is based on the assumption that a pure vision system is sufficient for safe driving. This position differs from the approach of competitors such as Waymo or Cruise, which use a combination of cameras, radars, lidars, and other sensors. Each of these devices has its strengths and weaknesses, and redundancy provides safety — if one system fails, others can take over.

Cameras excel in good visibility, but in fog, heavy rain, or snow their effectiveness drops drastically. Radar handles such conditions well but is limited in resolving detail. Lidar, which works by emitting laser light, can also be hampered by particles in the air, though it usually degrades less severely than a camera. By abandoning radar in newer models, Tesla has placed all its bets on vision and software, which is technologically ambitious but potentially risky.
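The redundancy argument above can be sketched as a toy weighted vote: each sensor reports a detection plus a reliability score for the current conditions, so a weather-degraded camera can be outvoted by radar and lidar. All numbers here are invented for illustration; no real vehicle fuses sensors this simply.

```python
# Toy sensor-fusion sketch (not any real vehicle's algorithm): each reading
# is (detected, reliability in [0, 1]) for the current weather conditions.

def fused_obstacle_detected(readings: list[tuple[bool, float]]) -> bool:
    """Weighted vote across sensors: detections vs. non-detections."""
    yes = sum(w for detected, w in readings if detected)
    no = sum(w for detected, w in readings if not detected)
    return yes > no

# Dense fog: camera reliability collapses, radar stays dependable.
fog_readings = [
    (False, 0.2),  # camera: sees nothing, barely trustworthy in fog
    (True, 0.9),   # radar: detects the obstacle, robust in fog
    (True, 0.5),   # lidar: detects it, partially degraded by airborne particles
]
print(fused_obstacle_detected(fog_readings))        # -> True

# A camera-only stack in the same fog has no second opinion to fall back on.
print(fused_obstacle_detected([(False, 0.2)]))      # -> False
```

The point of the sketch is the structural one the article makes: with only one sensor modality, the vote collapses to whatever the camera says, however unreliable it is in that moment.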

If NHTSA determines that Tesla's sensor architecture is insufficient in conditions of reduced visibility, it could lead to orders for Tesla to restore radars or install additional sensors. This would be costly for the company and would undermine one of its key theses about technological advantage. On the other hand, if the problem lies in the software rather than the hardware, an update could resolve the matter — though this is always more complex than it seems.

Implications for Polish users and the market

Poland has an increasing number of Tesla users, particularly in major cities. Model 3 and Model Y are popular among electric vehicle enthusiasts. The atmospheric conditions in Poland — especially autumn fogs, winter snowfall, and rain — are exactly what NHTSA is investigating. If the FSD system has problems in such conditions, Polish drivers could be particularly at risk.

Additionally, Polish road infrastructure differs from American infrastructure — many roads do not have clear lane markings, signaling is inconsistent, and weather conditions are more variable. A system trained primarily on data from North America may have even greater problems on Polish roads. If NHTSA imposes restrictions on FSD, Tesla may be forced to take similar action in European markets, including Poland.

Possible scenarios and consequences

There are several scenarios for how this investigation could end. The first scenario is a finding that the problem is marginal and that Tesla adequately warns users. In this case, the investigation could end without major consequences, though NHTSA would likely require further improvements. The second scenario is ordering Tesla to make software changes — for example, more restrictive conditions for activating FSD in poor visibility or better warnings for the driver. This would be both feasible for Tesla to implement and effective for safety.

The third scenario, more drastic, is ordering hardware changes — restoring radars or installing additional sensors. This would be costly and require changes in production. The fourth scenario is the withdrawal of FSD from sale or its significant limitation — for example, availability only under certain atmospheric conditions. This would be a public relations disaster for Tesla, but it is not impossible if NHTSA determines that the system poses an unacceptable safety risk.

The history of automotive regulations shows that agencies like NHTSA prefer to work with manufacturers on solutions rather than impose harsh sanctions. However, if Tesla resists or insists the problem does not exist, tougher steps are possible. Given the history of friction between Tesla and NHTSA, tensions between the parties could run high.

Broader perspective: the future of autonomous driver assistance systems

This investigation is a symptom of broader tensions in the automotive industry. Regulators around the world are trying to find a balance between supporting innovation and protecting public safety. Tesla has spent years operating on the edge of this balance, aggressively promoting FSD as the future of transportation, while technically maintaining that the driver must always be ready to take control.

Other companies, such as Waymo, take a more cautious approach — developing full autonomy, but only in geographically limited areas and under strict control. This approach is less spectacular in the media, but potentially safer. If NHTSA penalizes Tesla for too aggressive an approach to FSD, it could change the dynamics of the entire industry, forcing other companies to adopt more conservative strategies — or conversely, giving them a competitive advantage.

The stakes of this investigation are high not only for Tesla, but for the entire future of autonomous vehicles. If regulators determine that systems like FSD are too dangerous in their current form, it could delay the deployment of autonomy by years. On the other hand, if Tesla can solve visibility problems and NHTSA gives the green light, it could open the door for more advanced systems across the industry. The outcome of this investigation will matter for every driver, every automaker, and every regulator concerned with road safety.
