Tesla CEO Elon Musk is big on making bold predictions.
Seven years ago he said Tesla could one day have the same market capitalization as Apple, which was $750 billion at the time. Late last year Tesla topped $1 trillion in valuation, although it's pulled back sharply this year.
Musk has also made reusable rockets commonplace, and is even building the largest rockets ever constructed in his quest to send humans to Mars.
But not all of Musk's predictions come true when he says they will.
One of the biggest, fully self-driving cars, has been promised for years but still hasn't reached full release. And because it involves driving on public roads, Tesla's program of "beta" releases of self-driving features has sparked growing discomfort among regulators.
Now, even Tesla's partially autonomous driving feature, Autopilot, which allows the cars to travel on freeways with limited driver intervention, is facing growing scrutiny.
In the latest sign of trouble for Tesla, the National Highway Traffic Safety Administration (NHTSA) announced that it was expanding its probe of Tesla's (TSLA) Autopilot feature to cover as many as 830,000 Tesla Model Y, X, S, and 3 vehicles sold between 2014 and 2021.
That's a big expansion.
Such an investigation is frequently a prelude to an official recall. In February, NHTSA ordered the Austin-based EV maker to recall 54,000 vehicles over similar concerns. That wasn't the first recall, either: NHTSA ordered several recalls in 2021 over everything from loose front suspension to camera cable installations.
What Is Going On With Tesla's Autopilot?
The investigation builds on the probe NHTSA opened last year, elevating its status from a "preliminary evaluation" to an "engineering analysis."
The latter is a deeper probe, although the fact that one has been launched does not guarantee that a recall is coming.
The goal, NHTSA said in a statement to Reuters, is "to extend the existing crash analysis, evaluate additional data sets, perform vehicle evaluations, and to explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver's supervision."
In other words, NHTSA is trying to determine what role Autopilot has played in a series of crashes.
The agency has identified 16 crashes in which Tesla vehicles using the Autopilot driving feature struck parked emergency vehicles. The incidents resulted in one death and 15 injuries.
Separately, in a recent high-profile case, a 27-year-old Tesla Model S driver pleaded not guilty to vehicular manslaughter after a crash that killed two people while the car was in Autopilot mode.
Why Have There Been So Many Crashes Lately?
Tesla crashes, and NHTSA's probes in particular, have become a sore point for Musk, who often complains about regulatory actions that interfere with his projects. Musk argues that, on balance, Teslas have fewer and less serious crashes than would otherwise be expected.
"It's super messed up that a Tesla crash resulting in a broken ankle is front page news and the ~40,000 people who died in US auto accidents alone in past year get almost no coverage," Musk wrote on Twitter in 2018.
But regulators have repeatedly called for more scrutiny of Autopilot in particular, since its ability to drive the car on its own has been associated with some serious accidents.
That does not necessarily mean Autopilot itself is to blame; NHTSA's earlier investigations found that "indications existed that the driver was insufficiently responsive to the needs of the dynamic driving task" in approximately half of the cases being investigated.
"A driver's use or misuse of vehicle components, or operation of a vehicle in an unintended manner does not necessarily preclude a system defect," NHTSA said.