At a public workshop in Sacramento, Consumer Watchdog strongly endorsed the California Department of Motor Vehicles’ draft regulations for autonomous vehicles, which require that self-driving robot cars have a licensed driver behind the wheel capable of taking control, as well as a working steering wheel, gas pedal and brake.
“The need to require a licensed driver behind the wheel is obvious after a review of the results from seven companies that have been testing since September 2014: Robot cars are still not capable of dealing reliably with real-life situations,” said John M. Simpson, Consumer Watchdog’s Privacy Project director.
The organization called for transparency, stating that the DMV should require companies to disclose details of disengagements, simulations, accidents, and other failures of the autonomous technology, and to release any video or technical data linked to such incidents. Simpson has previously decried the fact that the companies self-report their problems, likening the arrangement to Volkswagen testing its own diesel cars.
Under the autonomous car testing regulations, the companies were required to file “disengagement reports” explaining when a human test driver had to take control. The reports show that the cars are not always capable of “seeing” pedestrians and cyclists, traffic lights, low-hanging branches, or the proximity of parked cars, suggesting too great a risk of serious accidents involving pedestrians and other cars. The cars also are not capable of reacting to reckless behavior quickly enough to avoid the consequences, the reports showed.
“The companies’ own evidence makes clear that a human driver able to take control of the vehicle is necessary to ensure the safety of both robot vehicles and other Californians on the road,” Simpson said at a DMV workshop on autonomous vehicle regulations.
“Google, which logged 424,331 ‘self-driving’ miles over the 15-month reporting period, said a human driver had to take over 341 times, an average of 22.7 times a month. The robot car technology failed 272 times and ceded control to the human driver; the driver felt compelled to intervene and take control 69 times,” Simpson said.
Other testing companies, driving far fewer autonomous miles than Google, also reported substantial numbers of disengagements: Bosch reported 625 over 934.4 miles; Nissan, 106 over 1,485 miles; Mercedes-Benz, 1,031 over 1,738 miles; Delphi, 405 over 16,662 miles; and Volkswagen, 260 over 10,416 miles. Tesla claimed it had none, but did not say how many miles it drove.
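To put these figures on a common scale, here is a minimal Python sketch that computes each company’s disengagement rate per 1,000 autonomous miles from the numbers quoted above. It is illustrative only: the data are hand-copied from the reports as cited in this article, and Tesla is omitted because it disclosed no mileage.

```python
# Disengagements per 1,000 autonomous miles, from the figures
# quoted above. Tesla is omitted because it reported no mileage.
reports = {
    "Google":        (341, 424_331),
    "Bosch":         (625, 934.4),
    "Nissan":        (106, 1_485),
    "Mercedes-Benz": (1_031, 1_738),
    "Delphi":        (405, 16_662),
    "Volkswagen":    (260, 10_416),
}

# Print companies from lowest to highest disengagement rate.
for company, (disengagements, miles) in sorted(
        reports.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    rate = 1_000 * disengagements / miles
    print(f"{company:<14}{rate:8.1f} per 1,000 miles")
```

Even the best-performing fleet, Google’s, disengaged roughly 0.8 times per 1,000 miles; Bosch’s rate works out to more than 600.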
It’s important to understand that these disengagements were prompted by real situations that drivers routinely encounter on the road, Simpson says. Among the reasons Bosch cited were failures to detect traffic lights and heavy pedestrian traffic.
Google’s robot technology quit 13 times because it couldn’t handle the weather conditions. Twenty-three times the driver took control because of reckless behavior by another driver, cyclist or pedestrian. The report said the robot car technology disengaged for a “perception discrepancy” 119 times; Google defines such a discrepancy as occurring when the car’s sensors don’t correctly perceive an object, for instance overhanging branches. The technology was disengaged 55 times for “an unwanted maneuver of the vehicle,” such as coming too close to a parked car. The human took over from Google’s robot car three times because of road construction.
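As a quick sanity check on the arithmetic, the causes named above can be tallied against the 341 total disengagements Simpson cited: they account for 213 events, with the remainder falling under other categories in Google’s report. The labels in this sketch are shorthand for this article’s descriptions, not Google’s official category names.

```python
# Tally the Google disengagement causes named above against the
# 341 total cited earlier. Labels are this article's shorthand,
# not Google's official category names.
causes = {
    "weather conditions":          13,
    "reckless behavior by others": 23,
    "perception discrepancy":     119,
    "unwanted maneuver":           55,
    "road construction":            3,
}

subtotal = sum(causes.values())  # 213
print(f"Named causes: {subtotal} of 341 "
      f"({341 - subtotal} under other categories)")
```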
Consumer Watchdog points out that the disengagement reports show many everyday, routine traffic situations that self-driving robot cars simply cannot handle without human intervention. Just as the draft regulations require, it is imperative that a human capable of taking control be behind the wheel.