The Street
James Ochoa

Tesla employee exposed a dangerous safety flaw in Autopilot

Tesla's Autopilot driver-assistance system is one of the automaker's best-known technologies, promising drivers a hands-free driving experience in equipped vehicles.

However, Tesla's (TSLA) technology differs from most other autonomous driving systems, such as the radar- and LiDAR-based setups used by companies like Waymo. Tesla relies on a system of cameras that covers every angle around its vehicles, as well as machine learning software that makes on-the-fly decisions about how to respond to hazards like road signs and parked cars.

Related: Park next to a crime? Police say your Tesla may be a star witness

Additionally, Tesla employs a team of researchers and programmers who continuously analyze what the Autopilot cameras see and adapt the software to respond to various conditions its vehicles encounter. 

However, while Tesla owners and other road users might expect the engineers behind Autopilot to be intent on keeping the company's promise of safer roads, a new report suggests the opposite.

A driver rides hands-free in a Tesla Model S equipped with Autopilot hardware and software in New York on Sept. 19, 2016. 

Bloomberg/Getty Images

No road rules at Tesla

According to a recent report from Business Insider, 17 current and former workers on Tesla's data annotation team, based at offices in Buffalo, N.Y., Palo Alto, Calif., and Draper, Utah, revealed a host of critical and damning details about their jobs, which primarily revolve around reviewing 30-second clips recorded by the cameras that keep Autopilot functioning.

Tasked with interpreting that footage in light of the road rules of the many regions where Autopilot-equipped Teslas are sold, seven of those employees noted that the automaker at times took a more relaxed stance on the rules.

Some workers revealed that this stance went as far as being told not to teach Autopilot to follow certain traffic signs, like "No Turn On Red" or "No U-Turn," in an effort to make its systems drive the cars in a more "human-like" way.

"It's a driver-first mentality," a former Tesla employee told BI. "I think the idea is we want to train it to drive like a human would, not a robot that's just following the rules."


Additionally, much like Facebook content moderators, these workers viewed deeply disturbing footage day in and day out, not limited to Teslas getting into accidents and near misses. A few workers even disclosed to BI that a fellow employee shared, as a joke, a disturbing video of a Tesla hitting a young boy on a bicycle.

Similarly, Tesla employees told the publication that the company monitors the data annotation staff with surveillance cameras, as well as software that tracks their speed and keystrokes. In a given shift, they could spend five to seven and a half hours annotating videos.

"Sometimes it can get monotonous," an ex-Tesla employee said. "You could spend eight hours a day for months on end just labeling lane lines and curbs across thousands of videos."

Related: Tesla rival is not playing games with self-driving safety

A flawed 'safety system'

In late July, a report released by The Wall Street Journal found many foundational flaws and shortcomings in both Autopilot and Full Self-Driving (FSD).

The outlet combed through details of more than 200 crashes involving Teslas equipped with Autopilot and FSD and found that most of these flaws can be attributed directly to the systems' overreliance on cameras and machine learning software.

Though Tesla's data annotators told BI that they do their best to train the cameras to spot objects like road signs, stopped cars, trucks, and animals, there are still major gaps in what the software behind the cameras can recognize.

“The kind of things that tend to go wrong with these systems are things like it was not trained on the pictures of an over-turned double trailer – it just didn’t know what it was,” Phil Koopman, associate professor of electrical and computer engineering at Carnegie Mellon University, told the Journal.

“A person would have clearly said ‘something big is in the middle of the road,’ but the way machine learning works is it trains on a bunch of examples. If it encounters something it doesn’t have a bunch of examples for, it may have no idea what’s going on.”

The dangers that systems like Autopilot and FSD present could be mitigated by adding radar and LiDAR sensors; however, Tesla CEO Elon Musk isn't a fan.

He has repeatedly said that this type of technology is "unnecessary" and that installing LiDAR on its cars would be like fitting them with a "whole bunch of expensive appendices."

Tesla, Inc., which trades on the Nasdaq under the ticker TSLA, is up 4.58% today, trading at $226.17 at the time of writing.

Tesla did not immediately respond to a request for comment.

Related: Veteran fund manager sees world of pain coming for stocks
