Despite its name, Tesla's Full Self-Driving (FSD) technology is not actually fully self-driving. Both FSD and Autopilot require constant driver supervision, with eyes on the road and hands on the wheel; the car is designed to "nag" the driver if and when that supervision lags. If the driver repeatedly ignores these nags, the feature can be disabled permanently.
But TheStreet reported in June that a configuration hidden within Tesla's software exists that doesn't have this nag. Nicknamed "Elon Mode," this iteration of self-driving was discovered by a hacker, who then drove 600 miles with it enabled.
The hacker's experiment with "Elon Mode" went viral, eventually catching the attention of the National Highway Traffic Safety Administration (NHTSA). The regulatory agency sent Tesla a letter and special order on July 26, demanding more details about "Elon Mode," including the number of cars that have access to it.
The special order additionally requested a step-by-step rundown of how to enable the configuration, as well as Tesla's "basis or purpose in installing the software in consumer vehicles."
The NHTSA posted the letter and special order on its website Aug. 29.
"NHTSA is concerned that this feature was introduced to consumer vehicles and, now that the existence of this feature is known to the public, more drivers may attempt to activate it," the letter reads. "The resulting relaxation of controls designed to ensure that the driver remain engaged in the dynamic driving task could lead to greater driver inattention and failure of the driver to properly supervise Autopilot."
Tesla  (TSLA)  was given a deadline of Aug. 25 to comply with the order. The company met this deadline, but its response was granted confidential treatment by the NHTSA.
This latest probe comes amid a series of ongoing investigations into Tesla and its FSD software. The NHTSA has been conducting a lengthy investigation into the safety of FSD for years; in August, the agency began a separate preliminary investigation into loss-of-steering issues with the company's Model 3 and Model Y vehicles.
The California attorney general likewise opened an investigation into the safety of FSD in July.
Musk went live Aug. 25 to demonstrate the latest version of FSD, clearly showing at several points that his hands were not on the wheel — a violation of Tesla's own policies around using FSD.
"Autopilot, Enhanced Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment," Tesla's website reads. "While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous.