The Street
Rob Lenihan

Exclusive video raises new concerns about Tesla's self-driving system

In September 2021, five Texas police officers filed a lawsuit against Tesla (TSLA).

The officers were suing over injuries they had sustained in an accident earlier that year when a 2019 Model X crashed into them at a traffic stop.


The driver was intoxicated, and the car reportedly was operating on Autopilot at the time of the crash.

The lawsuit charged that Tesla hadn’t done enough to address issues with the controversial system. 

“The officers want to hold Tesla accountable, and force Tesla to publicly acknowledge and immediately correct the known defects inherent in its Autopilot and collision avoidance systems, particularly as those impact the ongoing safety of our nation’s first responders,” the lawsuit states.

Tesla and CEO Elon Musk, the lawsuit said, "have repeatedly exaggerated the actual capabilities of Autopilot, resulting in the public, including first responders, and Tesla drivers being put in a significant risk of serious injury or death."

The automaker, the lawsuit said, "is engaging in systematic fraud to pump Tesla's share price and sell more cars, while hiding behind disclosures that tell the drivers that the system can't be relied upon."

The officers, citing multiple injuries and permanent disabilities, are seeking up to $20 million in damages.

Tesla denies the lawsuit's allegations and claims the fault lies with the driver.

Failed to recognize emergency vehicles

Exclusive dash cam footage of the Texas crash was used in an Aug. 9 Wall Street Journal video report indicating that the Tesla's Autopilot system failed to recognize the stopped emergency vehicles in time.

“The Tesla was completely unable to detect the existence of at least four vehicles, six people and a German shepherd fully stopped in the lane of traffic,” the lawsuit said.

The crash is one of 16 between Teslas and emergency vehicles being investigated by the National Highway Traffic Safety Administration.

The Journal said it had obtained eight of the crash reports included in the NHTSA investigation; at least six of the incidents occurred while emergency vehicle lights were flashing.

The dash cam footage from the crash, which happened in the Houston suburb of Splendora, shows the Tesla slamming into the back of an emergency vehicle. 

While the car's driver monitoring system appears to have worked as designed, the report said that was not enough "to sideline the impaired driver and prevent the collision."

Tesla's Autopilot system partially automates many driving tasks on highways, including steering, braking and lane changes.

Drivers using Autopilot are supposed to remain engaged so they can take control of the car at any time.

Federal investigators have said Tesla's marketing, including the name Autopilot, exaggerates its capabilities and encourages drivers to misuse the technology.

The Journal said experts in autonomous vehicle safety who reviewed the crash footage noted a difference between how the car's cameras see an ordinary vehicle and how they see an emergency vehicle.

Flashing lights create hazy image

The police cars' flashing lights created a hazy image that the car's software likely did not recognize as a stopped vehicle, the experts said.

Logs indicate that the car finally recognized something in its path just 2.5 seconds and 37 yards before the crash. Autopilot attempted to slow the car and ultimately disengaged, but at 54 mph it was too late.
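A rough calculation underscores the point: bringing a car traveling 54 mph to a stop within 37 yards would require sustained braking at close to 0.9 g, near the limit of what a passenger car can manage on dry pavement, and with no allowance for reaction time.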

The officers are also suing a local restaurant owner, claiming the Tesla’s owner was served too much alcohol prior to the incident.

Tesla CEO Elon Musk has been talking about Tesla's Full Self-Driving technology for years, calling himself "the boy who cried FSD" during the company's second quarter earnings call.

In 2016, the CEO said that Tesla's driver-assist feature -- Autopilot -- would be able to drive better than a human within two to three years. He also said that by 2018, it would be possible to remotely summon a Tesla across the country.

In 2019, he said that Tesla could have a fleet of a million robotaxis by the end of 2020 if the company pumped out hundreds of thousands of FSD cars.

The FSD that Musk keeps predicting is just around the corner would be a self-driving car that does not require any driver attention at all.

The lawsuit cites several of Musk’s tweets commenting on crashes involving Autopilot, including one that described one of Tesla's driver assistance systems as "not great."

