A tweet from Elon Musk indicating that Tesla might allow some owners who are testing its "Full Self-Driving" system to disable an alert that reminds them to keep their hands on the steering wheel has drawn attention from US safety regulators.
The National Highway Traffic Safety Administration said it has asked Tesla for more information about the tweet. Last week, the agency said the issue is now part of a broader investigation into at least 14 Teslas that have crashed into emergency vehicles while using the Autopilot driver-assist system.
Since 2021, Tesla has been beta-testing "Full Self-Driving" with owners who haven't been trained on the system but are actively monitored by the company. Earlier this year, Tesla said 160,000 owners, roughly 15% of Teslas now on US roads, were participating. A wider distribution of the software was to be rolled out late in 2022.
Despite the name, Tesla still says on its website that the cars can't drive themselves. Teslas using "Full Self-Driving" can navigate roads themselves in many cases, but experts say the system can make mistakes.
"We're not saying it's quite ready to have no one behind the wheel," CEO Musk said in October.
On New Year's Eve, one of Musk's most ardent fans posted on Twitter that drivers with more than 10,000 miles of "Full Self-Driving" testing should have the option to turn off the "steering wheel nag," an alert that tells drivers to keep their hands on the wheel.
Musk replied: “Agreed, update coming in Jan.”
It's not clear from the tweets exactly what Tesla will do. But disabling a driver-monitoring system on any vehicle that automates speed and steering would pose a danger to other drivers on the road, said Jake Fisher, senior director of auto testing for Consumer Reports.
"Using FSD beta, you're kind of part of an experiment," Fisher said. "The problem is the other road users adjacent to you haven't signed up to be part of that experiment."
Tesla didn't respond to a message seeking comment about the tweet or its driver monitoring.
Auto safety advocates and government investigators have long criticized Tesla's monitoring system as inadequate. Three years ago, the National Transportation Safety Board listed poor monitoring as a contributing factor in a 2018 fatal Tesla crash in California. The board recommended a better system, but said Tesla has not responded.
Tesla's system measures torque on the steering wheel to try to make sure drivers are paying attention. Many Teslas have cameras that monitor a driver's gaze. But Fisher says those cameras aren't infrared like those of some competitors' driver-assistance systems, so they can't see at night or if a driver is wearing sunglasses.
Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University, argued that Tesla is contradicting itself in a way that might confuse drivers. "They're trying to make customers happy by letting them take their hands off the wheel, even while the (owners) manual says 'don't do that.'"
Indeed, Tesla's website says Autopilot and the more sophisticated "Full Self-Driving" system are intended for use by a "fully attentive driver who has their hands on the wheel and is prepared to take over at any moment." It says the systems are not fully autonomous.
NHTSA has noted in documents that numerous Tesla crashes have occurred in which drivers had their hands on the wheel but still weren't paying attention. The agency has said that Autopilot is being used in areas where its capabilities are limited and that many drivers aren't taking action to avoid crashes despite warnings from the vehicle.
Tesla's partially automated systems have been under investigation by NHTSA since June 2016, when a driver using Autopilot was killed after his Tesla went under a tractor-trailer crossing its path in Florida. The separate probe into Teslas that were using Autopilot when they crashed into emergency vehicles began in August 2021.
Including the Florida crash, NHTSA has sent investigators to 35 Tesla crashes in which automated systems are suspected of having been used. Nineteen people have died in those crashes.
Consumer Reports has tested Tesla's monitoring system, which changes often with online software updates. Initially, the system didn't warn a driver without hands on the wheel for three minutes. Recently, though, the warnings have come in as little as 15 seconds. Fisher said he isn't sure, though, how long a driver's hands could be off the wheel before the system would slow the car down or shut off completely.
In shutting off the "steering wheel nag," Fisher said, Tesla could be switching to the camera to monitor drivers, but that's unclear.
Despite implying through the names that Autopilot and "Full Self-Driving" can drive themselves, Fisher said, it's clear that Tesla expects owners to still be drivers. But the NTSB says human drivers can end up dropping their guard and relying too much on the systems while looking elsewhere or doing other tasks.
Those who use "Full Self-Driving," Fisher said, are likely to be more vigilant about taking control because the system makes mistakes.
"I wouldn't dream of taking my hands off the wheel using that system, simply because it can do things unexpectedly," he said.