Elon Musk’s Tesla Autopilot and safety claims aren’t ironclad
It's all in the details
Elon Musk has publicly denied police statements that evidence suggests Tesla’s Autopilot system was in use during a fiery, high-speed crash on a residential street in Spring, Tex., on Saturday night. The crash claimed the lives of two passengers, neither of whom was found in the driver’s seat.
| Ticker | Security | Last | Change | Change % |
|---|---|---|---|---|
| TSLA | TESLA INC. | 352.86 | +13.22 | +3.89% |
Musk tweeted on Tuesday that information recovered from the car backed up this assertion.
"Data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD," Musk wrote. FSD is short for Tesla’s optional Full Self-Driving feature, which has additional partially-automated driving capabilities
Musk’s claim has not been independently verified, but Harris County Constable Precinct 4 said it would serve a warrant to obtain the data.
"If he is tweeting that out, if he has already pulled the data, he hasn’t told us that," Harris County Constable Precinct 4 Mark Herman told Reuters. "We will eagerly wait for that data."
"We have witness statements from people that said they left to test drive the vehicle without a driver and to show the friend how it can drive itself," Herman said.
Musk also agreed with a Twitter commenter who said that Autopilot will not operate unless someone is in the driver’s seat and touching the steering wheel at least every 10 seconds, and that it won’t break the speed limit. He added that "standard Autopilot would require lane lines to turn on, which this street did not have."
The capabilities of Autopilot have evolved over the years, and it’s not yet known exactly what version the 2019 Model S that crashed was equipped with.
"Currently Autopilot is just adaptive cruise control and hands-on lane centering, the same basic functionality you get from many automaker’s driver assist systems," Guidehouse Insights E-Mobility principal analyst Sam Abuelsamid told FOX Business.
"All other features like Auto Lane Change, Navigate on Autopilot, Autopark, Summon, etc. are part of the Full Self-Driving package. However, several of these used to be part of what was branded as enhanced Autopilot."
Tesla owner Sergio Rodriguez responded to Musk’s statement by posting a now-viral video of his car engaging Autopilot on side streets with no lane markers and registering what appeared to be a speed limit that was too high for the stretch of road.
Fox News has previously tested Autopilot and determined that it was capable of operating without anyone touching the wheel for around a minute, at least, under certain circumstances. Tesla’s own documentation says only that the feature will disengage "if it does not detect your hands on the steering wheel for a period of time," without specifying how long that period is.
"The fundamental problem here is that Tesla does a poor job of driver monitoring. Unlike several other automakers, Tesla only uses a torque sensor in the steering wheel to try to detect when the driver is moving the wheel. This is a cheap but very imprecise method," Abuelsamid said.
A video posted to TikTok last September showed three people sitting in a Model S driving down a three-lane highway with an empty driver’s seat, although it is not known if it was rigged in some way to defeat the monitoring system.
| Ticker | Security | Last | Change | Change % |
|---|---|---|---|---|
| GM | GENERAL MOTORS CO. | 55.68 | +0.81 | +1.48% |
General Motors’ similar Super Cruise feature, which is advertised as hands-free, uses facial recognition technology to ensure that a driver is watching the road while it is in operation and recently ranked higher than Autopilot in a Consumer Reports test.
The Tesla Model 3 and Model Y are equipped with an in-cabin camera that Musk recently revealed can monitor a driver’s attention. He also said the company had rescinded access to the latest version of Full Self-Driving from several people it determined were misusing the feature.
"More problematic is that Elon Musk regularly does TV interviews and exhibits the same hands-off behavior so his fans do the same thing," Abuelsamid said.
"Musk frequently likes and retweets videos posted by Tesla drivers not paying attention while using Autopilot/Full Self-Driving. If the CEO of the company does this, many of his fans assume it must be ok to ignore the warnings," Abuelsamid said.
Musk did this himself during a 60 Minutes segment and often comments on videos posted by the Whole Mars Catalog Twitter account depicting long-distance hands-free trips using Full Self-Driving with someone seated in the driver’s position.
No laws specifically address the manner in which these types of systems should be deployed, so the hands-on, hands-off question is currently up to the automaker and its lawyers.
The Texas crash occurred the same day Musk tweeted that Teslas with Autopilot engaged have "nearly a 10 times lower chance of being in an accident than an average vehicle."
However, if Musk’s assertions about Autopilot’s functionality are accurate, the statistics aren’t directly comparable, because many accidents occur on roads where Autopilot cannot be used at all.
Abuelsamid said it is unknown if Tesla is counting instances where Autopilot disengaged seconds before an accident it couldn’t avoid.
"The problem with this claim is that there is no context or detail provided. Tesla is transparent only with the stats they want to share, but they actually share very little," Abuelsamid said.
Tesla’s safety report also indicates that its cars not using Autopilot or any active safety features are half as likely to get into an accident as the average car.
"I would not put any credence in these stats without a careful third-party evaluation of all the data."
The National Highway Traffic Safety Administration and National Transportation Safety Board have each sent teams to Texas to conduct their own investigations of the incident.