Tesla Full Self-Driving Can’t Tell If Driver Is Sleeping, a Teddy Bear or Invisible

The Dawn Project’s videos show that Tesla’s driver monitoring system fails to detect a sleeping driver, a teddy bear, or no one at all at the wheel

The Dawn Project has today released safety test videos showing that a self-driving Tesla’s driver monitoring system fails to detect when a driver texts, reads, watches movies or even falls asleep at the wheel. The car also does not recognize when a teddy bear, unicorn, or nothing at all is in the driver’s seat.

Tesla warns that its self-driving software “may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road”. It also warns that the software can cause the car to “suddenly swerve even when driving conditions appear normal and straight-forward”.

Regulators have allowed this defective software to be sold to millions of ordinary consumers only on the condition that a driver is in the car, paying attention to the road, with both hands on the steering wheel, ready to take over immediately.

Research has shown that the only way to ensure a driver is paying attention is to implement an effective driver monitoring system using cameras.

Tesla duped NHTSA and the California DMV into designating Full Self-Driving as a Level 2 Advanced Driver Assistance System, while simultaneously marketing it as a fully autonomous car.

Joshua Brown’s fatal self-driving collision with a tractor-trailer in June 2016 was attributed to driver inattention by the National Transportation Safety Board (NTSB). The NTSB said the truck should have been visible to Brown for at least seven seconds before impact. Brown “took no braking, steering or other actions to avoid the collision”, the NTSB report said. As a result, NHTSA required Tesla to add a driver monitoring system.

The latest tests were conducted on a real road in Santa Barbara, with a person in the passenger seat ready to take over. They were designed to establish whether Tesla’s driver monitoring system would detect inattention and issue a “pay attention to the road” warning in the following scenarios:

  • A driver watching a video on a laptop while at the wheel
  • A driver falling asleep at the wheel
  • A driver texting at the wheel
  • A driver reading a book at the wheel
  • A teddy bear at the wheel
  • A unicorn at the wheel
  • An inflatable bottle of champagne at the wheel
  • No one in the driver’s seat

The Dawn Project invited John Bernal, a former Tesla Autopilot employee who covers electric vehicle news on his YouTube channel, AIAddict, to conduct and observe the tests involving non-human objects in the driver’s seat. The tests were run in two separate Tesla Model 3s, which had not been modified in any way.

These tests follow a video recently released by The Dawn Project showing a self-driving Tesla driver looking out of the window for five minutes while eating a meal, as well as rummaging in the back seats for five minutes, all without receiving any warnings from the driver monitoring system.

Dan O’Dowd, Founder of The Dawn Project, commented: “Tesla’s driver monitoring system is ineffective and unfit for purpose. NHTSA forced Tesla to introduce a driver monitoring system to ensure the driver is paying attention to the road. However, Tesla duped the regulator by implementing an ineffective driver monitoring system.

“Did Tesla knowingly ship a defective driver monitoring system that fails to detect driver inattention?

“This ineffective driver monitoring system is in over 4 million Tesla vehicles made in the last five years. We tested it on two cars and achieved identical results. Pedestrians, cyclists and drivers have no way of knowing whether the person ‘supervising’ an ineffective self-driving Tesla is actually paying attention to the road, or is asleep at the wheel.

“In order to avoid the stringent regulatory approval process applied to Level 4 autonomous vehicles, Tesla duped NHTSA and the DMV into regulating Autopilot and Full Self-Driving as Level 2 Advanced Driver Assistance Systems.

“Our videos show that Teslas will drive autonomously with no one sitting in the driver’s seat and the steering wheel moving back and forth, just like in a Waymo or Cruise Level 4 autonomous robotaxi. Teslas are Level 4 autonomous vehicles just like the names “Autopilot” and “Full Self-Driving” promise, as had always been Tesla’s intention. Elon Musk recently boasted that people “only fully understand when they’re in [the] driver’s seat, but aren’t driving at all.”

“Tesla has reported 840 accidents and 23 fatalities to NHTSA. Tesla has made over $4 billion of additional revenue from its self-driving software.

“A number of fatalities occurred when a self-driving Tesla failed to recognize a large tractor-trailer crossing the highway in front of it; the collision sheared off the Tesla’s roof and killed the driver.

“This raises the question: why didn’t the driver brake? A tractor-trailer is usually very easy for a person to see. However, in many of the accidents, it appears that the driver took no action whatsoever to avoid the fatal collision. The only reason a driver would not brake before hitting such a large object is that they were not paying attention.

“If the driver wasn’t paying attention, why weren’t they warned by Tesla’s driver monitoring system, and why didn’t the system disengage self-driving mode as it is supposed to?”