
NTSB Says Limitations of Tesla's Autopilot Played a 'Major Role' in Fatal 2016 Crash


During a recent meeting of the U.S. National Transportation Safety Board, Tesla was taken to task for letting drivers misuse its vehicles' semi-autonomous features. Reuters reports that the automaker's software lacked key operational safeguards that could have prevented the crash by limiting where and how the vehicle's Autopilot feature could be engaged.

It's no secret that Tesla builds safe cars, but even the safest vehicles have limits in preventing injury in a crash. Such was the case in the much-chronicled May 7, 2016 collision between a tractor-trailer and a Tesla Model S in Williston, Florida. Joshua Brown, the 40-year-old Model S owner, was killed when the truck turned across the car's path and the Tesla struck the trailer at highway speed.

The NTSB's argument isn't that Tesla built an unsafe car, or that the Autopilot feature malfunctioned while the driver was behind the wheel, but rather that the car's software allowed the driver to engage Autopilot in an unsafe, unintended environment.

"System safeguards were lacking," said Robert Sumwalt, NTSB chairman. "Tesla allowed the driver to use the system outside of the environment for which it was designed and the system gave far too much leeway to the driver to divert his attention."

The NTSB also went on record to remind everyone that Tesla does not offer a fully self-driving car; rather, it offers a semi-autonomous driving mode that allows drivers to use software-based aids to perform certain tasks on the driver's behalf, like steering, accelerating, and braking. Driver attention and intervention are expected in this form of Level 2 autonomous driving, which indicated to the board that, by allowing the driver to ignore the road, Tesla's software lacked safeguards that could have prevented the accident.

Tesla has previously noted that it believes Autopilot significantly increases the safety of vehicles on the road, citing government study data to back up the claim. Tesla did, however, agree with the NTSB that Autopilot is not fully self-driving and should not be used as such.

The auto manufacturer noted that it would review the agency's recommendations for further action.

