Autonomous Vehicles Not Safe, Says Watchdog

Autonomous vehicles are not safe to deploy on public roads, Consumer Watchdog told the U.S. Senate today. Basing its warning on an analysis of reports required from companies testing robot cars in California, the group called on senators to halt a bill that would allow robot cars on public roads.

The Senate is considering a robot car bill, the AV START Act, S. 1885, which was approved by the Commerce, Science, and Transportation Committee last year. Sen. Dianne Feinstein, D-CA, has placed a hold on the bill because she is concerned about the safety of robot cars and whether the technology is ready for public roads.

In an open letter, John M. Simpson, Privacy and Technology Project Director, and Sahiba Sindhu, Consumer Advocate, warned senators that the technology is not ready for safe deployment.

“It would be a great threat to the public for the Senate to authorize the deployment of robot cars without protections requiring certification of the vehicles when testing shows the state of the technology imperils the public if a human driver cannot take over the car,” they wrote.

The California reports revealed that the robot cars tested could not cope with some of the decisions human drivers make every day. Among the failures that required the human test driver to take control:

  • GPS signal failure
  • shorter-than-average yellow lights
  • rapid fluctuations in street traffic
  • sudden lane blockages
  • cars parked incorrectly nearby
  • hardware failures
  • software failures

“We need to verify that self-driving cars can actually drive themselves before we put them on public roads. What makes a car self-driving other than an opinion of a car manufacturer interested in selling their product? Legislation must protect the public by designating standards that guarantee that new vehicles on the road can meet their purported capabilities,” said Simpson and Sindhu in their letter to the Senate.

Twenty companies filed the only publicly available data about the state of robot car technology with the California Department of Motor Vehicles. The required “disengagement reports,” released last week, show that so-called self-driving cars cannot go more than 5,596 miles, even in the best case, without a human test driver taking over at the wheel. In most cases, the vehicles cannot travel more than a few hundred miles without human intervention, Consumer Watchdog noted.

Based on its analysis of the disengagement reports, the nonprofit, nonpartisan public interest group called on the Senate to halt the AV START Act:

“Consumer Watchdog calls on you to act to protect highway safety and halt the AV START Act, S. 1885, unless it is amended to require enforceable safety standards that apply specifically to autonomous technology. For now, given the state of the technology as indicated by developers themselves, any AV legislation should require a human driver behind a steering wheel capable of taking control.”

Consumer Watchdog called for “carefully crafted regulations, designated performance metrics, and a system of certification that guarantees the technology will not imperil the public if a human driver cannot take over the so-called ‘self-driving’ vehicle.”

Read Consumer Watchdog’s letter to the Senate here.

Twenty companies with permits to test robot cars in California were required to file “disengagement reports” covering 2017, listing miles driven in autonomous mode and the number of times the robot technology failed. The reports were released last week. Nine of those companies, including Waymo (a subsidiary of Google’s parent company) and GM Cruise, provided specific data showing the reasons their robot technology failed.

Read the 2017 disengagement reports here.

Waymo said that its robot car technology disengaged 63 times, or once every 5,596 miles, because of deficiencies in the technology, not “extraneous conditions” such as weather, road construction, or unexpected objects, as is often presumed. The most common reasons human test drivers had to take control were deficiencies in hardware, software, and perception, Waymo’s report said.

GM’s Cruise division, which claims it will put robot cars on the road for public use in 2019, logged the second-most miles of the companies required to report on their testing. Its cars drove a total of 131,675 miles and had 105 disengagements, or one every 1,254 miles.

GM Cruise’s report revealed that its robot cars cannot correctly predict the behavior of human drivers: 44 of the 105 disengagements (about 42%) were cases where GM Cruise’s technology failed while trying to respond to other drivers on the road.

All other companies that released specific data on the reasons for their disengagements, including Nissan and Drive.ai, a technology startup partnered with Lyft, reported experiences consistent with Waymo’s and GM Cruise’s. Nissan said it tested five vehicles, logged 5,007 miles, and had 24 disengagements. Drive.ai, meanwhile, had 151 disengagements in the 6,572 miles it logged.
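For readers who want to verify the rates cited above, the miles-per-disengagement figures follow directly from the raw numbers in the reports. A minimal sketch in Python, using only the figures quoted in this article (the 2017 California DMV reports themselves are the authoritative source):

    # Back-of-the-envelope check of the disengagement rates quoted above.
    # Figures are as quoted in this article; the 2017 California DMV
    # reports are the authoritative source.
    reports = {
        "GM Cruise": (131_675, 105),  # (miles in autonomous mode, disengagements)
        "Nissan": (5_007, 24),
        "Drive.ai": (6_572, 151),
    }

    for company, (miles, disengagements) in reports.items():
        rate = miles / disengagements
        print(f"{company}: one disengagement every {rate:,.0f} miles")

    # Output:
    # GM Cruise: one disengagement every 1,254 miles
    # Nissan: one disengagement every 209 miles
    # Drive.ai: one disengagement every 44 miles

The same arithmetic run in reverse suggests the scale of Waymo’s testing: its reported rate of one disengagement every 5,596 miles, multiplied by its 63 disengagements, implies roughly 352,000 autonomous miles driven in 2017.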

Consumer Watchdog’s letter said:

“The purported intention of S. 1885 is to improve highway safety through the deployment of Highly Automated Vehicle (HAV) technologies. Commerce Committee Chairman Senator John Thune claimed that ‘the safety…benefits of self-driving vehicles are too critical to delay.’ Yet, the facts show that these cars may impose more of a risk to the public than the safety private AV technology manufacturers have misleadingly guaranteed to the public.”