Tesla says it is ‘morally obligated’ to continue improving Autopilot, reiterates safety claims
FILE PHOTO: The Tesla logo is seen at a dealership in Durango, northern Spain, October 30, 2023. REUTERS/Vincent West/File Photo
(Reuters) – U.S. automaker Tesla (NASDAQ:TSLA) Inc on Monday said it has a “moral obligation” to continue improving its Autopilot driver-assistance system and to make it available to more consumers, citing data it says shows stronger safety metrics when the system is engaged.
In response to a Washington Post investigation of serious crashes involving Autopilot on roads where the feature could not reliably operate, the company said its data showed the system was saving lives and preventing injuries.
The Post report said the newspaper had identified at least eight crashes between 2016 and 2023 in which Autopilot was activated in situations for which it was not designed, and said Tesla had taken few definitive steps to restrict its use by geography despite having the technical capability to do so.
Autopilot is “intended for use on controlled-access highways” with “a center divider, clear lane markings, and no cross traffic,” the Post said, adding that Tesla’s user manual advises drivers the technology can also falter on roads with hills or sharp curves.
The Post investigation “leverages instances of driver misuse of the Autopilot driver assist feature to suggest the system is the problem,” Tesla said in a post on the social media platform X, adding that Autopilot was about 10 times safer than the U.S. average and about five times safer than a Tesla without the technology enabled.
The company also reiterated that the driver remains responsible for controlling the vehicle at all times and is notified of this responsibility.
The Post said regulators such as the U.S. National Highway Traffic Safety Administration (NHTSA) had not adopted rules limiting the technology to the conditions it is designed for, despite opening investigations into the software after identifying more than a dozen crashes in which Tesla vehicles hit stationary emergency vehicles.
NHTSA did not immediately respond to a Reuters request for comment outside normal business hours. The agency told the Post it would be too complex and resource-intensive to verify that systems such as Autopilot were used only within the conditions for which they were designed, and that doing so might not fix the problem.
Last month, a Florida judge found “reasonable evidence” that Tesla Chief Executive Elon Musk and other managers knew the automaker’s vehicles had a defective Autopilot system but still allowed the cars to be driven unsafely.
The ruling came as a setback for Tesla after the company won two product liability trials in California this year over the Autopilot system.