By David Shepardson
(Reuters) - Democratic U.S. Senator Ed Markey asked Tesla Inc (O:TSLA) on Wednesday to disable its "Autopilot" driver-assistance system until it installs new safeguards to prevent drivers from evading system limits that could let them fall asleep.
"Tesla should disable Autopilot until it fixes the problem, Markey said at a Senate Commerce Committee hearing on advanced vehicle technologies.
Markey, who wrote to Tesla about the issue earlier this week, cited YouTube videos and press reports suggesting drivers could travel long distances without touching the steering wheel by using an object to defeat the system's requirement that they regularly touch the wheel, "even if they are literally asleep."
Markey cited a local news report that said a driver had fallen asleep behind the wheel as a Tesla drove 14 miles on Autopilot. Other unconfirmed videos on social media appear to show drivers sleeping behind the wheel of Tesla vehicles.
"That's not safe. Somebody is going to die because they can go to YouTube as a driver - find a way to (get around safety requirements)," Markey said. "We can't entrust the lives of our drivers and everyone else on the road to a water bottle."
Acting National Highway Traffic Safety Administration (NHTSA) chief James Owens told Markey at the hearing Wednesday the agency would be in touch with Tesla about the issue.
Tesla says drivers must keep their hands on the wheel at all times, but many owners say they can use the driver-assistance system to conduct other tasks behind the wheel.
Tesla did not immediately comment but said in September that since 2018, it has "made updates to our system, including adjusting the time intervals between hands-on warnings and the conditions under which they’re activated."
A series of crashes involving Autopilot has prompted U.S. investigations and criticism from the National Transportation Safety Board (NTSB).
In September, the NTSB said the Autopilot design was a key factor in a January 2018 crash of a Model S into a parked fire truck on a highway in California. The system’s design “permitted the driver to disengage from the driving task” in the 2018 crash and allowed him to remove his hands from the wheel for nearly all of the last 14 minutes of the trip, it said.
Tesla’s Autopilot was engaged during at least three fatal U.S. crashes, and two remain under investigation by NHTSA and NTSB.