The family of a California man who died in a horrific crash while his Tesla was in “Autopilot” mode is suing the electric carmaker over claims by the company and CEO Elon Musk that its self-driving technology had been perfected and was ready for the road.
Genesis Giovanni Mendoza Martinez, 31, was crushed to death behind the wheel of the Model S he bought under the apparently mistaken belief that it could drive itself, according to a lawsuit filed by Mendoza’s parents, Eduardo and Maria, and his brother Caleb, who was also severely injured in the February 18, 2023, wreck.
Tesla, for its part, argues that its cars have “a reasonably safe design as measured by the appropriate test under the applicable state law,” and that the accident “may have been caused in whole or in part” by Giovanni Mendoza’s “own negligent acts and/or omissions.”
“[N]o additional warnings would have, or could have prevented the alleged incident, the injuries, losses and damages alleged,” the company responded in a court filing rebutting the family’s claims.
The US government has blasted Tesla for overstating its cars’ self-driving abilities, with Transportation Secretary Pete Buttigieg among the harshest critics.
Attorney Brett Schreiber, who is representing the Mendoza family, told The Independent, “This is yet another example of Tesla using our public roadways to perform research and development of its autonomous driving technology. The injuries suffered by the first responders and the death of Mr. Mendoza were entirely preventable. What’s worse is that Tesla knows that many of its earlier model vehicles continue to drive our roadways today with this same defect putting first responders and the public at risk.”
Schreiber said Tesla puts cars on the road with an Autopilot feature he described as “ill-equipped to perform,” and that instead of announcing a recall to correct problems, the company simply releases new software and calls it an “update.”
“It’s this rush of pushing product out that is not really ready for primetime,” Schreiber said.
Tesla officials did not respond to a request for comment on Sunday. Messages sent to the legal team working on the carmaker’s defense in the Mendoza case also went unanswered.
In their complaint, which was removed from state court to federal court this week, the Mendoza family says Giovanni, who worked at a bank, was one of “many members of the public” persuaded by public statements and online posts by Musk, along with Tesla’s extensive advertising efforts, that its cars were capable of driving themselves.
“Not only was he aware that the technology itself was called ‘Autopilot,’ he saw, heard, and/or read many of Tesla or Musk’s deceptive claims on Twitter, Tesla’s official blog, or in the news media,” the complaint states. “Giovanni believed those claims were true, and thus believed the ‘Autopilot’ feature with the ‘full self driving’ upgrade was safer than a human driver, and could be trusted to safely navigate public highways autonomously.”
Taking the world’s richest man and Tesla at their word that the company’s cars could operate autonomously, Giovanni purchased a used Model S in 2021 and regularly drove it on the freeway using the Autopilot feature, the Mendozas’ complaint states.
“Based on representations Giovanni heard made by Musk, Giovanni believed the vehicle was a safer driver than a human and relied on it to perceive and react to traffic in front of him,” the complaint goes on.
Shortly after Valentine’s Day last year, at around 4 a.m., Giovanni was driving his Tesla northbound on Interstate 680, with Caleb in the passenger seat and the Autopilot engaged, according to the complaint.
In the distance, a fire truck was parked diagonally across two lanes of traffic, with its emergency lights flashing, to divert oncoming cars away from a collision site, the complaint continues. It says a second fire truck was also on the scene, along with two California Highway Patrol vehicles, all of which also had their emergency lights activated.
Slow down and move over when approaching emergency vehicles. Truck 1 was struck by a Tesla while blocking I-680 lanes from a previous accident. Driver pronounced dead on-scene; passenger was extricated & transported to hospital. Four firefighters also transported for evaluation. pic.twitter.com/YCGn8We1bK
— Con Fire PIO (@ContraCostaFire) February 18, 2023
As the brothers made their way down the road, the vehicle suddenly broadsided the first fire truck, slamming into it at high speed, the complaint states.
“At the time of the collision, Giovanni was not controlling the Subject Vehicle, but he was instead passively sitting in the driver’s seat with the ‘Autopilot’ feature engaged,” the complaint continues. “In fact, data from the Tesla itself showed that the Subject Vehicle was in ‘Autopilot’ for approximately 12 minutes prior to the crash, with no accelerator pedal or brake pedal inputs from Giovanni during that time. The approximate speed of the Subject Vehicle was 71 mph during the 12-minute period.”
The data further showed that Giovanni “generally maintained contact with the steering wheel until the time of the crash,” according to the complaint.
“As a result of the collision, the Subject Vehicle sustained major frontal damage, crushing Giovanni’s body,” it says. “Giovanni survived, at least momentarily, but subsequently died from the injuries he sustained in the collision.”
The complaint argues that Tesla’s Autopilot system is flawed and unable to distinguish emergency vehicles from regular traffic, even when their emergency lights are on. Instead, it alleges, the Autopilot saw “single frames in the vision system that were either very dark or very bright” and missed the fire trucks and police cruisers altogether, causing the crash that killed Giovanni and severely injured Caleb. Four firefighters suffered minor injuries, according to contemporaneous news reports.
The complaint lists page after page of other Tesla crashes involving Autopilot, and accuses Tesla, and Musk specifically, of neglecting to iron out existing bugs before releasing the feature to the public. It calls out dozens of statements, claims, and online posts by Musk himself as wholly misleading, made even as he knew Teslas were not actually capable of driving autonomously.
In one example, the complaint highlights a comment by Musk during a June 2014 shareholder meeting, in which he said, “I’m confident that — in less than a year — you’ll be able to go from highway onramp to highway exit without touching any controls.”
During a January 2016 conference call with reporters, Musk said Tesla’s Autopilot feature was “probably better” than a human driver, and that within two years, drivers would be able to summon their Teslas from afar and have them show up anywhere, according to the complaint.
Musk also “misleadingly suggest[ed]” in a statement posted to the Tesla blog that the major challenge in rolling out fully self-driving vehicles was simply a “lack of regulatory approval,” the complaint states. It says he appeared at a 2019 event in Palo Alto called “Autonomy Day,” and promised “over a million robo-taxis on the road,” without any steering wheels or pedals, by the end of 2019. In 2020, Musk claimed full autonomy would be released across the Tesla product range the following year.
But, the complaint alleges, Tesla knew all the while that its cars couldn’t live up to the hype, and “undertook a widespread campaign to conceal thousands of consumer reports about problems with [its] ‘Autopilot’ feature, including crashes, unintended braking, and unintended acceleration.” It accuses Tesla higher-ups of training front-line employees to “refrain from memorializing customer reports in writing.”
“When Tesla employees did respond to customer reports in writing, it was only to reassure customers that the ‘Autopilot’ feature was working as intended,” the complaint states. “In addition, Tesla… forced consumers to sign nondisclosure agreements to receive repairs under warranty,” which the complaint says is a violation of the California civil code.
All told, Tesla received “thousands of customer reports regarding problems with Tesla’s ‘Autopilot’ system between 2015 and 2022, including over 1,000 crashes; over 1,500 complaints about sudden, unintentional braking; and 2,400 complaints about sudden acceleration,” the complaint contends. At the same time, thousands of drivers have taken Tesla at its word that its cars are able to drive on their own, when “in fact [they are] incapable of safely handling a variety of routine roadway scenarios without driver input.”
“Predictably, this has led — and will continue to lead — to multiple collisions between Teslas and other vehicles or pedestrians, resulting in death or serious bodily injury,” the complaint says.
In a response filed in US District Court for the Northern District of California, Tesla argued that the “damages and injuries” suffered by Giovanni Mendoza and his brother, “if any, were caused by misuse or improper maintenance of the subject product in a manner not reasonably foreseeable to Tesla.”