InsideEVs

Tesla 'Full Self-Driving' Still Drives On The Wrong Side Of The Road

Tesla CEO Elon Musk described the new "Full Self-Driving" Beta version 12.4 as a "5-10x improvement" in miles per intervention over the previous version, which he had described with similar bombast. And yet, in a video that is largely positive about FSD, from a channel that believes in Musk's AI vision, a Tesla running the inaccurately named software drives the wrong way in traffic.

That's the state of FSD in 2024. The technology is still not legally self-driving. It is still 100% on the driver if something happens. It is still unable to make good on the promises Musk made in 2016, like the claim that it would be able to drive anywhere in the country with no one inside by 2018. By the end of 2020, there were supposed to be 1 million robotaxis on the road. By 2024, there were actually zero.

And despite countless "order of magnitude" improvements, the camera-based software still cannot consistently avoid driving on the wrong side of the road. Even so, there are undeniable moments of magic in the FSD test videos, where the car demonstrates impressive intelligence.

Get Fully Charged

FSD's Long Road

When Elon Musk began offering a "Full Self-Driving" package for pre-order in 2016, he suggested that full autonomy was just around the corner. Eight years later, we have seen dozens of "FSD" releases, none of which have made the car legally self-driving. FSD's basic driving skills have improved considerably, but the software still makes potentially dangerous, surprising mistakes.

In the latest video by AI DRIVR, for instance, FSD is able to navigate around a crashed car by driving slightly off the road. That's a situation that could easily flummox a hard-coded, rules-based system. Tesla's "AI" system does what you'd normally do in that situation: follow the car in front. But while AI DRIVR credits that as intelligence, it's also the most obvious possible solution. What happens if the Tesla is at the front of the line? I'd wager it's either bricked or stuck waiting on humans.

That generous commentary appears throughout the video, and it represents a fundamental challenge in evaluating FSD driver footage. In one section, the narrator says the Tesla "anticipates the flow of traffic a little too well." What he's describing is the Tesla assuming the car in front of it will move with traffic, and dangerously tailgating it. He also uncritically repeats Musk's claim that this version was trained too heavily on interventions, which supposedly explains some of its aggression and lack of smoothness. There's no questioning whether releasing a version that is clearly erratic and aggressive onto public roads, in the hands of untrained, non-expert owner-testers, is a good idea.

But the fact that those non-expert owner-testers are hand-picked is the real problem. Musk has a habit of granting early access to his most hyped tech only to his most devoted fans. Anyone who criticizes or questions too much is bound to lose that elevated status. Combine that with the fact that Tesla owners who pay for FSD must, in general, be believers, and you have a pool of testers parroting meaningless phrases like "ChatGPT moment" and "DriveGPT," as if the fundamental underpinnings of large language models and vision-based driving systems were related beyond the buzzwords.

I'll admit, however, that I have not found a perfect way to talk about FSD. On one hand, it is far more sophisticated than I ever thought it would be. Its ability to drive reasonably well, reasonably often, using relatively modest processing hardware and the world's worst sensor suite is impressive. And though I've been sounding alarm bells about its misleading name and potential danger since 2017, when I worked for CNBC, the consequences of this questionably supervised public testing have so far been less severe than I anticipated.

But the video captures the fundamental issue with a trial-and-error system trained only on recorded mistakes. As AI DRIVR describes the unbearable cautiousness of 12.4.1, then the seeming quality of 12.4.2, then the aggression of 12.4.3, and finally the vague promise of future improvement with "more data," it all feels too familiar. Tesla announces an update with an unfalsifiable, absurd claim about "a giant leap" forward.

Reviewers film it slamming into curbs or driving the wrong way down public roads with pedestrians nearby, yet say they're impressed by its promise, if only Tesla could iron out the "edge cases." A new update drops and fixes the last batch of issues, but seven new ones crop up, gently noted in videos from fully bought-in "reviewers," and the cycle repeats.

Maybe I'm short-sighted, a doubter. Maybe I, along with all of the experts who have told me that a camera-based system on Tesla's current platforms will never be autonomous, am exactly the kind of skeptic the hero vanquishes in the end. But even so, after eight years of charging people thousands of dollars for a "Full Self-Driving" package, I would hope we'd have a version that never drives the wrong way down a public road.

Maybe it'll come in the next update. I heard it's a 10x improvement. 

Contact the author: mack.hogan@insideevs.com
