July 4th is the deadliest day of the year on American roads, and unfortunately, just before America’s most patriotic holiday, news broke that a Tesla driver had died. Neither he nor Autopilot saw a white truck against a bright sky, and the car went under the truck, killing the driver.
According to Tesla, the company learned yesterday evening that NHTSA is opening a preliminary evaluation into the performance of Autopilot during a recent fatal crash involving a Model S. Tesla emphasized that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.
Joshua D. Brown, 40, of Canton, Ohio, died in the accident May 7 in Williston, Florida. Frank Baressi, 62, the driver of the truck and owner of Okemah Express LLC, said the Tesla driver was “playing Harry Potter on the TV screen” at the time of the crash and driving so quickly that “he went so fast through my trailer I didn’t see him.”
“It was still playing when he died and snapped a telephone pole a quarter mile down the road,” Baressi told The Associated Press in an interview from his home in Palm Harbor, Florida.
Tesla’s statement continued: “Following our standard practice, Tesla informed NHTSA about the incident immediately after it occurred. What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.”
The statement added: “It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot ‘is an assist feature that requires you to keep your hands on the steering wheel at all times,’ and that ‘you need to maintain control and responsibility for your vehicle’ while using it.”
NHTSA’s Office of Defects Investigation is handling the probe.
Brown loved his Tesla Model S and posted videos of himself in the car with Autopilot driving.
Consumer Watchdog sent out a cautionary note.
The news that a Tesla driving in Autopilot mode was involved in a fatal collision is evidence that federal regulators need to go slow as they write new guidelines for self-driving cars, said Consumer Watchdog.
The National Highway Traffic Safety Administration was expected to issue new guidelines for self-driving cars in July and Secretary of Transportation Anthony Foxx and NHTSA director Mark Rosekind have publicly pressed for the rapid deployment of the technology. NHTSA should conclude its investigation into the Tesla crash and publicly release those data and findings before moving forward with its guidance, said Consumer Watchdog.
“We hope this is a wake-up call to federal regulators that we still don’t know enough about the safety of self-driving cars to be rushing them to the road. The Administration should slow its rush to write guidelines until the causes in this crash are clear, and the manufacturers provide public evidence that self-driving cars are safe. If a car can’t tell the difference between a truck and the sky, the technology is in doubt,” said Carmen Balber, executive director with Consumer Watchdog.
Self-driving cars in California have shown a similar inability to handle many common road situations. Under California’s self-driving car testing requirements, companies were required to file “disengagement reports” explaining when a test driver had to take control. The reports show that the cars are not always capable of “seeing” pedestrians and cyclists, traffic lights, low-hanging branches, or the proximity of parked cars. The cars also are not capable of reacting to reckless behavior of others on the road quickly enough to avoid the consequences, the reports showed.
For example, over the 15-month reporting period a human driver was forced to take over a Google self-driving car 341 times, an average of 22.7 times a month. The cars’ technology failed 272 times and ceded control to the human driver, and the driver felt compelled to intervene and take control 69 times.
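For readers checking the math, the reported figures are internally consistent: the two categories of disengagement sum to the total, and the monthly average follows from the 15-month reporting window.

$$272 + 69 = 341, \qquad \frac{341}{15} \approx 22.7 \ \text{per month}$$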
Consumer Watchdog has called on NHTSA to hold a public rulemaking on self-driving cars, and to require the cars to have a steering wheel and pedals to allow a human driver to take over when the technology fails.
A commenter on one of his YouTube videos wrote, “RIP Joshua. At least you can be assured that due to this horrible tragedy, the data gathered will no doubt save many more lives. You’ve died a hero in my eyes.”