Tesla Autopilot Crashes Shouldn't Scare You — Unless You're Elon Musk

Another Tesla has crashed while using Autopilot. The passengers survived, but the incident comes on the heels of the first fatal accident involving a Tesla driver using the same feature.

Jalopnik pointed out that a blind spot in the Autopilot system may have caused the fatal Model S accident, and Tesla confirmed that the failure was a visual one.

"Neither autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," a Tesla Motors news release said. 

Should you start to panic? The answer is still no. Self-driving cars are still proving safer and more efficient than human drivers. But the incidents do show that we may be relying on these autonomous features too heavily and too soon. Autopilot is not synonymous with full autonomy.

It's worth noting that Tesla said neither Autopilot nor the driver noticed the tractor trailer crossing the car's path. That's because Tesla isn't yet ready to give Autopilot full responsibility for the vehicle. The company wants Autopilot to relieve drivers of the "most tedious and potentially dangerous aspects of road travel," but the driver is still expected to assume responsibility for the vehicle.

"While truly driverless cars are still a few years away, Tesla Autopilot functions like the systems that airplane pilots use when conditions are clear," Tesla Motors said in a blog post. "The driver is still responsible for, and ultimately in control of, the car."

Should Tesla start to panic? Maybe. The potential problem for Tesla is the branding of the Autopilot feature. Stephen Nichols, an attorney in the Los Angeles office of the law firm Polsinelli, told the Los Angeles Times that because Tesla's branding suggests automation in a way its driverless vehicle competitors' does not, blame could shift from the driver to Tesla.

"You could say, 'Tesla, you're not doing what these other companies are doing, so you're being unreasonable,'" Nichols said, according to the Los Angeles Times.

As for the most recent, non-fatal Tesla crash involving Autopilot, Dale Vukovich of the Pennsylvania State Police said there isn't enough evidence pointing to an Autopilot malfunction, according to the Detroit Free Press. The stretch of the Pennsylvania Turnpike where the vehicle crashed leaves "little margin for driver error."

If the way Autopilot is advertised misled these drivers into believing the feature required less human involvement, or no human attention at all, the onus could be on Tesla.

"On the one hand, they're saying trust us, we can drive better than you would, but on the other hand, they are saying if something goes wrong, don't ask us to stand behind our product," Rosemary Shahan, president of the Consumers for Auto Reliability and Safety lobbying group, told the Los Angeles Times. "But if it's controlled by an algorithm, why should you be liable?"
