The parents of a Tesla driver who was crushed to death in a horrifying accident have filed a lawsuit against the electric carmaker, blaming its CEO, Elon Musk, for trumpeting misleading claims about the car’s self-driving features.
Genesis Giovanni Mendoza-Martinez, 31, tragically lost his life on February 18, 2023, after his Model S rammed into a fire truck near San Francisco, according to a lawsuit filed by his parents, Eduardo and Maria.
Genesis was behind the wheel and suffered fatal injuries, while his brother Caleb survived the incident with non-life-threatening injuries. Four firefighters also sustained minor injuries in the collision.
“The time is coming due for Tesla to be held to account,” attorney Brett Schreiber, who is representing the Mendoza family, told Bored Panda.
Tesla and its CEO Elon Musk are being blamed in a lawsuit for the death of a 31-year-old man
Genesis’ family is suing Tesla and pointing the finger at Musk over misleading claims about the car’s self-driving technology. They believe the vehicles are not ready for the road, contrary to the company’s bold declarations.
Genesis’ parents said their son was under the impression that the car could drive itself and was using the ‘Autopilot’ mode when the crash took place.
“Not only was he aware that the technology itself was called ‘Autopilot,’ he saw, heard, and/or read many of Tesla or Musk’s deceptive claims on Twitter, Tesla’s official blog, or in the news media,” read the complaint.
“Giovanni believed those claims were true, and thus believed the ‘Autopilot’ feature with the ‘full self driving’ upgrade was safer than a human driver, and could be trusted to safely navigate public highways autonomously.”
Genesis Giovanni Mendoza-Martinez was killed in a crash involving his Tesla Model S and a fire truck near San Francisco on February 18, 2023
“Based on representations Giovanni heard made by Musk, Giovanni believed the vehicle was a safer driver than a human and relied on it to perceive and react to traffic in front of him,” the complaint added.
The fire truck involved in the crash had been dispatched to the freeway in response to an earlier accident. It had its lights on and was parked diagonally when Genesis rammed into it.
According to the lawsuit, Tesla’s Autopilot misinterpreted the fire trucks and police cruisers at the scene; they appeared as “single frames in the vision system that were either very dark or very bright,” rendering the technology incapable of reacting appropriately.
The family claimed this failure reflects flaws in the technology that Tesla allegedly knew about but failed to address.
The National Highway Traffic Safety Administration (NHTSA) has spent the last six years investigating 16 crashes in which Teslas operating on Autopilot collided with emergency vehicles. These crashes have resulted in at least 15 injuries and one death, according to CBS News.
The deceased crash victim’s parents filed a lawsuit against Tesla and its CEO, accusing them of touting misleading claims about the car’s self-driving capabilities
The accident that claimed Genesis’ life raised critical questions about the reliability of autonomous systems in real-world scenarios.
“In principle automatic controls will always do a better job than a human operator because they don’t get tired or distracted, have very fast reactions and have as many eyes and other sensors as you wish,” Dr. William told Bored Panda via email.
“With all technologies, standards have to be developed and products will be tested for compliance. This is mature in the aircraft industry, but at an early stage in the automotive industry. It is unfortunate, but inevitable, that accidents will happen, but it is important to learn from them,” added the professor from the University of British Columbia’s department of Electrical and Computer Engineering.
Attorney Brett Schreiber said Genesis’ death and the injuries sustained by the victims could have been prevented.
“Tesla knew that this generation of auto pilot technology could not decipher emergency vehicles’ flashing lights. Rather than taking the responsible step of actually recalling these vehicles Tesla simply pushed an over the air download that left thousands of vehicles vulnerable to the same defect,” he told Bored Panda. “That is how Tesla set the stage for Genesis Mendoza’s preventable death and the injuries to several innocent first responders.”
Tesla claimed that the crash might have been caused “in whole or in part” by the driver’s “own negligent acts and/or omissions”
“Like so many, Mr. Mendoza believed the misrepresentations, half truths and lies of Tesla about what its auto pilot technology could do,” he added. “Sadly, he suffered the ultimate price, his brother Caleb was seriously injured and their parents suffered a loss no one ever should. The time is coming due for Tesla to be held to account.”
Tesla, on the other hand, has maintained that its vehicles have “a reasonably safe design as measured by the appropriate test under the applicable state law.”
“[N]o additional warnings would have, or could have prevented the alleged incident, the injuries, losses and damages alleged,” the company said in response to the family’s lawsuit.
The company argued that the “damages” and “injuries” suffered by the two brothers, “if any, were caused by misuse or improper maintenance of the subject product in a manner not reasonably foreseeable to Tesla.”
The family claimed Musk’s statements about the Autopilot and self-driving features led Genesis to believe the car could drive itself safely
The Mendoza family’s lawsuit also accuses Tesla of knowing that its cars couldn’t live up to the hype created by Musk, who said in 2014: “I’m confident that — in less than a year — you’ll be able to go from highway onramp to highway exit without touching any controls.”
He also claimed in 2016 that the Autopilot feature was “probably better” than a human driver.
The family further accused Tesla of undertaking “a widespread campaign to conceal thousands of consumer reports about problems with [its] ‘Autopilot’ feature, including crashes, unintended braking, and unintended acceleration.”
The lawsuit stated that Tesla forced customers to sign nondisclosure agreements to receive repairs under warranty.
The company received “thousands of customer reports regarding problems with Tesla’s ‘Autopilot’ system between 2015 and 2022, including over 1,000 crashes; over 1,500 complaints about sudden, unintentional braking; and 2,400 complaints about sudden acceleration,” the complaint stated.
NHTSA has been investigating 16 similar crashes involving Teslas on Autopilot and emergency vehicles, resulting in 15 injuries and one death over six years
As the demand for electric vehicles increases across the globe, Dr. William asserted the importance of having drivers trained in emerging technologies.
“One problem with driver assistance technology is that the user is sometimes not trained in how to use it. I believe that even with old fashioned cruise control some people have assumed that steering as well as speed is controlled,” he told Bored Panda.
“Certainly systems that need some human input should try to check that the human is not doing something inappropriate like being asleep. Some cars do this by analysing the steering,” he added. “In trains the driver is typically required to respond to a stimulus every few minutes.”
“Some years ago I saw a presentation on automatic driving. A worst case example was given where a child ran after a ball from one side at the same time as someone lost control of a wheelchair on the other,” he went on to say. “It was inevitable that someone would die and the question was what the appropriate reaction should be. Certainly the automatic system would react faster than a human operator.”