Tesla CEO Elon Musk said during June's shareholder presentation that robotaxis would send the company's valuation skyrocketing to over $5 trillion. That's several times its current valuation of around $630 billion. But the technology required to make self-driving cars safe simply doesn't exist yet.
Tesla says it's getting there, but a recent video by YouTube channel Out Of Spec Videos revealed a system that's far from safe or trustworthy. The channel recently had the opportunity to review Tesla's latest Full Self-Driving (Supervised) version 12.5 on a Model 3. The weather wasn't ideal, with constant showers and compromised visibility. The conditions exposed FSD's vulnerabilities, and the driver had to intervene several times.
Get Fully Charged
Tesla's bet on self-driving cars.
Tesla has pivoted from building electric cars for the people toward a wild obsession with robotaxis, artificial intelligence and humanoid robots. The technology underpinning Tesla's Full Self-Driving (Supervised) cars seems to be improving, but there's still a long way to go.
Compared to the previous version, FSD 12.5 has five times more parameters. Parameters are essentially the learned values in the neural network that determine how it perceives the environment and makes driving decisions; more parameters generally mean a larger, potentially more capable model. The latest version also reportedly adds support for driver monitoring with sunglasses, a combined software stack for highway and city driving, and Actual Smart Summon (ASS), which is expected to navigate Teslas in and out of parking lots without the driver in the car.
What Out Of Spec Videos found was that FSD 12.5 still has plenty of room for improvement. Even though the Model 3 drove in a more human-like way, it was often confused by different environments. For starters, it took a wrong turn out of the neighborhood, something the drivers said the system struggles with. Then, while making a left turn onto a highway, it dangerously stopped in the middle of the road while waiting for an approaching car to pass, when it could have simply stopped before the marked line.
The rain compromised the car's vision, too. Tesla relies only on cameras, training its "neural networks" on real driving footage from millions of vehicles. Unlike Cruise or Waymo, Teslas carry no supplemental radar or lidar. In the middle of the drive, the system alerted the drivers that FSD was "degraded." Shortly after that, the Model 3's screen rendering showed a roadside shoulder, where the car attempted to make a U-turn. No shoulder existed in that spot, just a furniture store. It appeared as though the EV was trying to enter the store, even though that wasn't the intention.
It also did other dangerous things, like exceeding the speed limit, getting into a turn lane and then continuing straight, and driving too close to curbs, an issue that had been solved in previous versions but now appears to have returned. The Tesla Robotaxi will be revealed on October 10. But before any robotaxis hit the roads, Tesla has a mountain to climb when it comes to making FSD safe.