
- Jonathan Challinger claims his Cybertruck drove headlong into a streetlight while in Full Self-Driving mode. Elon Musk wants the technology to be ready for a June robotaxi launch.
The image is shocking: a wrecked Tesla Cybertruck wrapped around a pole, right wheel dangling, lying motionless on the side of the road. Driver Jonathan Challinger posted the undated picture on Sunday, claiming Tesla’s automated Full Self-Driving (FSD) software caused his vehicle to crash into a light post while he wasn’t looking.
While Challinger escaped without harm, he warned others might not be so lucky. “Spread my message and help save others from the same fate or far worse,” he wrote.
The post received 2 million views, sparking fierce debate as to whether FSD is good enough to be used without humans behind the wheel.
The post comes less than five months before Tesla CEO Elon Musk’s crucial launch of an autonomous robotaxi service, a core pillar supporting Tesla’s more than $1.1 trillion market cap.
According to Challinger’s account, the car failed to leave a lane that was ending, even though no vehicles were alongside to prevent it from merging into another, and made no attempt to slow down or turn until it was too late.
Google Maps and Street View imagery show that the road layout matches the photo in Challinger’s post. An official for the Reno Police Department confirmed to Fortune that there was a crash involving a driver named Challinger on Feb. 6, but declined to give further details pending a full report being filed.
Challinger tagged Musk, AI director Ashok Elluswamy, Tesla’s entire AI team, and Cybertruck lead engineer Wes Morrill in the tweet.
The carmaker constantly collects data from FSD for training, and in the past it has been quick to deny crash accounts it has found to be untrue.
At the time of publication, Tesla had not responded to Fortune’s request for comment. Fortune also contacted Challinger for comment but did not receive a reply.
‘Big fail on my part. Don’t make the same mistake I did’
Tesla only rolled out FSD to the Cybertruck in September, a full 10 months after the vehicle launched.
The pickup has larger dimensions, a higher stance on the road, and more complex engineering—it uses all four wheels to steer—than a Tesla sedan.
One of the best-known and most impartial Tesla FSD testers attested to the plausibility of Challinger’s account of the crash.
“The situation you describe is very common, where the planner or decision-making to get in the appropriate lane early enough often gets the vehicle in a bind, where it runs out of options,” replied Chuck Cook, who was tagged in the post by Challinger. “You are NOT the only one.”
Challinger was quick to admit negligence and accept ultimate responsibility for failing to supervise the system, as Tesla requires of all its owners who use FSD.
“Big fail on my part, obviously. Don’t make the same mistake I did. Pay attention. It can happen,” he warned, requesting a means to deliver dashcam footage to Tesla’s AI team for analysis.
“Soooooo my @Tesla @cybertruck crashed into a curb and then a light post on v13.2.4. Thank you @Tesla for engineering the best passive safety in the world. I walked away without a scratch. It failed to merge out of a lane that was ending (there was no one on my left) and made…”
— Jonathan Challinger (@MrChallinger) February 9, 2025
Challinger, moreover, dismissed accusations that he was acting in bad faith by trying to capitalize on the scrutiny surrounding Musk, Tesla, and FSD ahead of the commercial launch in June.
“I just want to get the data to Tesla if I can. I tried everything I could think of to get in touch with them,” he said.
Doubt over date of reported crash
In a separate, little-noticed post early last month, Challinger had already acknowledged that he had been involved in a serious accident.
“The situation you describe is very common, where the planner or decision making to get in the… You are NOT the only one, but it is becoming much less frequent as the software improves. Not many people are as brave as you have been to admit their (and FSD's) mistakes.”
— Chuck Cook (@chazman) February 9, 2025
Responding to a question about the Cybertruck’s structural ability to absorb energy in a frontal collision, he wrote in early January: “Having crashed mine, can confirm that it crumples just fine.”
He said he had repeatedly tried to get the dashcam footage to Tesla.
Challinger specified that the crash occurred while using FSD v13.2.4, a software version that had only been widely rolled out to all FSD users roughly a week after his earlier post.
Musk’s track record raises doubts about the technology
Just months ahead of a planned June launch, CEO Musk has yet to publish any independently verifiable data to back up his claim that FSD is ready to be used in an unsupervised, fully autonomous robotaxi.
By comparison, rivals such as Waymo report their disengagements to state regulators. Tesla, however, has used a legal loophole to avoid this transparency for years.
Musk has also repeatedly misstated facts.
Tesla’s AI director, Elluswamy, testified in court that Musk ordered him to doctor a marketing video to mislead consumers about Tesla’s FSD capabilities.
More recently, Musk admitted Teslas running on older AI3 inference computers have, in fact, failed to live up to his claim that all cars built after 2016 are capable of autonomous driving.
He plans to replace that hardware with the newest generation in vehicles whose owners purchased FSD. How exactly that can be done, and at what cost, remains unclear.