06/13/2023 / By Arsenio Toledo
Teslas operating in Autopilot mode have been involved in no fewer than 736 crashes since 2019, including at least 17 fatal incidents (11 of them since May 2022) and five serious injuries. That figure represents more than 90 percent of all automation-related car crashes reported to regulators.
A recent analysis of National Highway Traffic Safety Administration (NHTSA) data found that the number of crashes attributed to Tesla vehicles using the company's driver-assistance technology has surged over the past four years, reflecting both the hazards of its increasingly widespread use and the growing presence of these vehicles on American roads. (Related: Over 1.1 million Tesla electric cars in China RECALLED over dangerous braking defect.)
Former NHTSA senior safety adviser Missy Cummings has attributed the surge to Tesla's decision to expand the use of its so-called Full Self-Driving technology, even though the technology is, at best, still in beta testing.
The NHTSA's data records 807 automation-related crashes, 736 of them involving Tesla vehicles. Subaru ranks a distant second with only 23 reported crashes since 2019. The gulf reflects the wider deployment of automation across Tesla's fleet, as well as the broader range of circumstances in which Tesla encourages drivers to engage its Full Self-Driving technology. More than 400,000 Tesla owners in the U.S. and Canada have access to the feature.
"Tesla is having more severe – and fatal – crashes than people in a normal data set," Cummings said. "The fact that … anybody and everybody can have [Full Self-Driving]… Is it reasonable to expect that might be leading to increased accident rates? Sure, absolutely."
Tesla did not immediately respond to requests for comment. But CEO Elon Musk has repeatedly defended his decision to push driver-assistance technology to more Tesla owners, claiming that the benefits outweigh any potential harms.
“At the point of which you believe that adding autonomy reduces injury and death, I think you have a moral obligation to deploy it even though you’re going to get sued and blamed by a lot of people,” said Musk. “Because the people whose lives you saved don’t know that their lives were saved. And the people who do occasionally die or get injured, they definitely know – or their state does.”
The NHTSA data does not capture another trove of evidence: a Tesla whistleblower leaked over 100 gigabytes of internal data containing thousands of customer complaints that raise serious concerns about the safety of Tesla's Full Self-Driving features.
The complaints, coming from the U.S., Europe and Asia, span from 2015 to March 2022. During this period, Tesla customers reported over 2,400 self-acceleration issues and 1,500 braking problems, including 139 reports of “unintentional emergency braking” and 383 reports of “phantom stops” from false collision warnings.
The Tesla files include details of how cars "suddenly brake or accelerate abruptly." While some drivers safely regained control of their vehicles, in many instances cars skidded into ditches, crashed into walls or even collided with oncoming vehicles.
The whistleblower data was leaked to the German business newspaper Handelsblatt, whose editor-in-chief, Sebastian Matthes, sent Tesla a list of questions about the leaked data and offered the company an opportunity to respond.
Instead of answering or rebutting, Matthes said Tesla “demanded that the data be deleted and spoke of data theft.”
This information makes Tesla’s technology even more untrustworthy and could potentially add fuel to the NHTSA’s ongoing investigation into Tesla’s use of driver-assistance tech.
For its part, Tesla claims it has done nothing wrong, since it regularly advises drivers to be ready to take control of their vehicles at a moment's notice even when driver-assistance features are engaged.
“There is a real concern that’s not limited to the technology itself but the interaction between the technology and the driver,” said Secretary of Transportation Pete Buttigieg, whose department oversees the NHTSA. “The question is not, ‘are they absolutely free of problems or 1,000 percent bulletproof?’ The question is, how can we be sure that they will lead to a better set of safety outcomes.”
Learn more about tech-heavy vehicles like electric cars and cars with self-driving options at RoboCars.news.
Watch this investigation looking into why Elon Musk and globalist Yuval Noah Harari fully agree on self-driving cars.
This video is from the Thrive Time Show channel on Brighteon.com.
COPYRIGHT © 2017 ROBOTICS.NEWS