Solutions to first Tesla Autopilot death? Groups & experts offer suggestions

The first death of a Tesla driver using autopilot has led organizations and experts to call for solutions. A fellow at Stanford suggests creating dedicated infrastructure for self-driving cars. Consumer Watchdog is asking Tesla to stop beta-testing autopilot with live drivers. And an automotive safety organization pointed out that established automakers test features over millions of miles before deployment.

Give Them Their Own Lane

Dr. Jerry Kaplan, a fellow at the Center for Legal Informatics at Stanford University, suggests that self-driving cars should have their own lanes, equipped with V2V (vehicle-to-vehicle), V2I (vehicle-to-infrastructure), and V2X (vehicle-to-everything) features.

“Perhaps we could start by reserving high-occupancy-vehicle lanes or certain roads at specific times for automated vehicles. The increase in road-use efficiency might be enough to compensate for the reduced number of standard lanes. As the mix of vehicles shifted toward self-driving cars and trucks, more roadways could be switched over. Eventually, driving your own car might be something just for local streets or an activity for special recreational areas.”

Stop the Beta Software

Meanwhile, Consumer Watchdog called on Tesla Chairman and CEO Elon Musk to stop beta-testing software with human lives and to disable the cars’ “autopilot” feature until it is shown to be safe. The group said that if the feature is offered in the future, Tesla must pledge to be liable if something goes wrong with its self-driving technology.

Consumer Watchdog also expressed concern that Tesla was developing a pattern of blaming victims in crashes when the autopilot feature was engaged.

In a letter to Musk, signed by President Jamie Court, Executive Director Carmen Balber and Privacy Project Director John M. Simpson, the public interest group also criticized Tesla’s delay in revealing the fatal Florida accident.

“On the one hand you extoll the supposed virtues of autopilot, creating the impression that, once engaged, it is self-sufficient. Your customers are lulled into believing their car is safe to be left to do the driving itself. On the other hand you walk back any promise of safety, saying autopilot is still in Beta mode and drivers must pay attention all the time. The result of this disconnect between marketing hype and reality was the fatal crash in Florida, as well as other non-fatal crashes that have now come to light.”

One of the most troubling aspects of Tesla’s deployment of autopilot, Consumer Watchdog said, is the decision to make the feature available in so-called Beta mode, an admission that it is not ready for prime time.

Consumer Watchdog said that if autopilot is proven safe to deploy, Tesla must assume liability for any crashes that occur when the feature is engaged.

Consumer Watchdog’s letter concluded:

“Tesla is rushing self-driving technologies to the highways prematurely; however, as the crashes demonstrate, autopilot isn’t safe and you should disable it immediately. If autopilot can ultimately be shown to meet safety standards and is then redeployed, you must pledge to be liable if anything goes wrong when the self-driving system is engaged.”

Test Like the Best

“No manufacturer should ever put a beta system on the road and make consumers the test drivers,” Clarence Ditlow, executive director of the Center for Auto Safety, told USA Today. Other automakers put millions of miles on major safety systems before they are deemed ready for prime time, he said, including many hours at their own test tracks.

Go to Canada?

In Canada, there isn’t much demand for self-driving testing yet. As of Wednesday, not a single company had applied to take part in Ontario’s program to test autonomous vehicles on its streets, even though the program launched on January 1. A spokesman for Ontario’s Ministry of Transportation said that the program had generated significant interest. But that interest hasn’t translated into much (or any) action.