Tesla Autopilot mode is assisted driving: cybersecurity expert
Tesla drivers who engage the Autopilot feature should recognize that the system still requires an attentive driver, a cybersecurity expert told FOX Business.
“They require in the manual that people constantly have their hands on the wheel,” cybersecurity expert Leeza Garber said during an interview on “The Evening Edit” on Wednesday.
Elon Musk’s electric car company is under scrutiny after another crash involving one of its vehicles in Autopilot mode. A Model S crashed into a parked police cruiser Tuesday in Laguna Beach, California.
The driver said the Tesla sedan was running in Autopilot mode right before the impact.
Garber said the vehicle is designed with built-in technological features intended to make highway driving safer.
“In essence what they are doing is mapping out what kind of obstacles are on the road,” she said. “They are watching the speed limit, they are watching traffic.”
The Tesla Model S owner’s manual states that Autopilot “cannot detect all objects and may not brake/decelerate for stationary vehicles or objects,” especially when traveling over 50 mph.
Last week, Tesla settled a class-action lawsuit with buyers of the Model S and Model X who claimed the company’s assisted-driving Autopilot system was “completely inoperable,” according to the complaint.
“Even though they set aside this caveat, that it’s not autonomous driving, you have to hold on to the wheel, you have to pay attention and be attentive, at the same time, are they going to be held liable? This crash is not helping them,” Garber said.