NHTSA Escalates Probe into Tesla Vision-Based ADAS Performance
NHTSA has intensified its probe into Tesla's "Full Self-Driving" software following reports of failures in low-visibility conditions. The investigation focuses on how the camera-only system detects and responds to environmental hazards without lidar or radar.
Tesla's reliance on a vision-only approach for its Advanced Driver Assistance Systems (ADAS) is facing its toughest regulatory hurdle to date. The National Highway Traffic Safety Administration (NHTSA) has officially upgraded its investigation into Tesla's "Full Self-Driving (Supervised)" software after identifying a pattern of performance issues in low-visibility scenarios such as heavy fog, rain, and sun glare.
The probe focuses in particular on how the software classifies and reacts to obstacles when its primary cameras are compromised. Unlike most other players in the autonomous-driving space, which rely on redundant "sensor fusion" of lidar, radar, and cameras, Tesla's "Vision" system depends solely on neural networks processing optical data. Federal investigators are concerned that when environmental conditions degrade, the system may fail to give drivers adequate warnings or may make incorrect steering and braking decisions.
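To make the architectural contrast concrete, the sketch below illustrates in Python why redundancy matters when a camera is degraded. All names, thresholds, and decision rules here are hypothetical and purely illustrative; this is not Tesla's or any vendor's actual logic. The point is structural: a fused stack can fall back on radar or lidar when camera confidence collapses, while a vision-only stack must either hand control back to the driver or act on low-confidence data.

```python
from dataclasses import dataclass

# Hypothetical per-sensor reading: a single obstacle-detection
# confidence in [0, 1]. Real ADAS perception stacks are far more
# complex; this only illustrates the redundancy argument.

@dataclass
class SensorReading:
    name: str
    obstacle_confidence: float  # 0.0 = nothing detected, 1.0 = certain

# Illustrative threshold only; not a real Tesla or NHTSA value.
CONFIDENCE_FLOOR = 0.6

def fused_decision(camera: SensorReading, radar: SensorReading,
                   lidar: SensorReading) -> str:
    """Sensor-fusion sketch: if the camera is degraded (fog, glare),
    radar or lidar can still confirm or rule out an obstacle."""
    usable = [s for s in (camera, radar, lidar)
              if s.obstacle_confidence >= CONFIDENCE_FLOOR]
    if usable:
        return f"act on {', '.join(s.name for s in usable)}"
    return "warn driver and hand back control"

def vision_only_decision(camera: SensorReading) -> str:
    """Vision-only sketch: with no redundant modality, a degraded
    camera forces a choice between warning the driver and acting
    on low-confidence data: the behavior under investigation."""
    if camera.obstacle_confidence >= CONFIDENCE_FLOOR:
        return "act on camera"
    return "warn driver and hand back control"

if __name__ == "__main__":
    # Simulate heavy fog: camera confidence collapses,
    # radar and lidar are largely unaffected.
    foggy_camera = SensorReading("camera", 0.2)
    radar = SensorReading("radar", 0.9)
    lidar = SensorReading("lidar", 0.85)
    print("fusion:     ", fused_decision(foggy_camera, radar, lidar))
    print("vision-only:", vision_only_decision(foggy_camera))
```

In this toy scenario the fused system still acts on radar and lidar, while the vision-only system must fall back on the driver, which is exactly the handoff behavior regulators are examining.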
This intensified scrutiny comes at a critical time for Tesla, which continues to position FSD as its primary technological differentiator. If the NHTSA finds that the software lacks adequate safeguards for adverse weather, it could lead to a large-scale mandatory recall or forced hardware retrofits, either of which would challenge the camera-only foundation of Tesla's Autopilot hardware. For the broader ADAS industry, the outcome of this probe will set a precedent for the safety requirements applied to vision-centric autonomous systems.
Source: TechCrunch