Autonomous & Self-Driving Vehicle News: Motional, Lyft, Ford/Argo AI, StradVision & Foretellix

In autonomous and self-driving vehicle news are Motional, Lyft, Ford/Argo AI, StradVision and Foretellix.

Motional & Lyft Back in Business in Vegas

Motional, a global driverless technology leader, and Lyft announced the resumption of their pioneering self-driving mobility service in Las Vegas. The publicly available autonomous fleet — the world’s longest-standing, having provided more than 100,000 paid rides — consists of Motional robotaxis operating on the Lyft network.

The fleet was paused earlier this year due to COVID-19. Now, with the addition of enhanced protective measures, the service is again operational.

The fleet’s passengers will ride in the first-ever Motional-branded robotaxis, following the unveiling of the company’s new brand in August.

The new protective measures follow CDC, World Health Organization, and government guidelines, and include a partition between the front and rear seats; vehicle operator PPE; and vehicle sanitization at the start of each shift, the end of each day, and between rides.

Ford Intros 4th Gen Argo AI

Launching a self-driving service is complex. Many different pieces need to come together to create a trusted, scalable self-driving service that provides value to customers and the cities it operates in. Ford is taking a thoughtful approach to bringing all these pieces together to help shape the future of self-driving vehicles. One important part of this service is the vehicle itself, which will allow Ford to stand up its self-driving business.

Beginning to roll out this month, Ford and Argo AI’s fourth-generation self-driving test vehicles are built on the Escape Hybrid platform and feature the latest advancements in sensing and computing technology. The Escape Hybrid is also the architecture and platform Ford has chosen to bring its autonomous vehicle service online.

What This Means for Ford’s Self-Driving Service: The systems Ford is incorporating into its newest test vehicles are “launch-intent,” meaning they include the components Ford believes will be needed to support commercialization. With a well-defined architecture and platform in the Escape Hybrid, the team can continuously test and refine performance over the coming years to better prepare for launch, and everything Ford learns from these vehicles can be channeled directly into its self-driving service as soon as it starts serving customers.

Here is a glimpse of the engineering advancements you’ll see on the new vehicles.

1. New Advanced Sensing Technology: All-new long-range LiDAR with higher-resolution, 128-beam sensing helps provide a 360-degree field of view.

New near-field cameras and short-range LiDAR sensors have been added: one set looks ahead and to the sides of the vehicle, while another is incorporated into a rear-facing sensing suite that keeps track of what’s going on behind the vehicle.

Combined, these sensors improve the detection of fixed and moving objects on all sides of the Escape, providing a blind-spot curtain that detects things like a passing car or a bicyclist in a nearby bike lane. A toy sketch of how overlapping fields of view add up to full coverage follows.
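As an illustration only, the short Python sketch below shows how overlapping sensor fields of view can be checked for combined 360-degree coverage. The sensor names and mounting arcs are invented for this example; they are not Ford or Argo AI specifications.

```python
# Toy sketch: checking combined azimuth coverage of overlapping sensor fields of view.
# All sensor names and arcs are hypothetical, not Ford/Argo AI specifications.

SENSORS = {
    "long_range_lidar": [(0, 360)],              # roof unit with a full 360-degree sweep
    "front_near_camera": [(315, 360), (0, 45)],  # forward arc, wrapping through 0 degrees
    "left_short_lidar": [(45, 135)],
    "rear_suite": [(135, 225)],
    "right_short_lidar": [(225, 315)],
}

def covering_sensors(bearing_deg: float) -> list:
    """Return the sensors whose field of view contains the given bearing (degrees)."""
    bearing = bearing_deg % 360
    hits = []
    for name, arcs in SENSORS.items():
        if any(lo <= bearing < hi for lo, hi in arcs):
            hits.append(name)
    return hits

# A bicyclist in a bike lane off the left rear quarter of the vehicle is seen
# by both the roof LiDAR and the left short-range unit:
print(covering_sensors(120))  # ['long_range_lidar', 'left_short_lidar']
```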

2. Battery Power That Really Delivers: Powering these sensors, as well as the state-of-the-art computing system, is the increased electrification capability of the Escape Hybrid.

The Escape Hybrid platform allows for improved integration of the self-driving system and includes an underfloor, liquid-cooled battery design.

Ford modified the Escape Hybrid’s high-voltage battery with additional battery cells, which helps supply the total power required by the self-driving system while reducing gasoline consumption.

3. Attention to Detail Sensor Cleaning: Over the last year, Ford’s team has refined the sensor cleaning system it developed based on on-road testing with the previous-generation test vehicles. Keeping the sensors clean of rain, dirt, debris and even insects is essential to ensuring the vehicles can better “see” the world around them in a variety of challenging conditions.

The team has developed hidden, forced-air cleaning chambers that surround the camera lenses and LiDAR sensors to ensure their field of view is clear while providing 360-degree cleaning coverage.

Ford increased the number of spray nozzles and coverage areas for improved liquid cleaning, and raised the spray pressure to speed up cleaning.

Ford also extended these new cleaning designs to the added near-field cameras and LiDAR sensors.

With these enhancements and improved hydrophobic coatings, Ford’s latest test vehicles are much better equipped to keep their sensors clear in challenging conditions.

Why This Matters: How This Lays the Foundation for Ford’s Business:

As Ford has said before, the vehicle is just one part of bringing together the future of self-driving services.

With the fourth-generation test vehicle, Ford has everything it needs from a vehicle to stand up its self-driving service.

In parallel, Ford continues to build, test and strengthen its fleet operations strategy, take a smart approach to finalizing its go-to-market strategies for moving people and moving goods, build its customer experience, and work with Argo AI toward the goal of developing an industry-leading self-driving system.

Ford’s blog post states: “We’re confident that we’re on the path to launching a safe, reliable and affordable service. And, we look forward to telling you more about how this service will ultimately help make people’s lives better.”

What’s Next: As they become ready for deployment, Ford will gradually integrate these fourth-generation vehicles into its multi-city testing efforts alongside its Fusion Hybrids in Austin, Detroit, Miami, Palo Alto, Pittsburgh and Washington, D.C.

StradVision Partners with Testworks

StradVision, a leading innovator in AI-based camera perception software for Advanced Driver Assistance Systems (ADAS) and autonomous vehicles, is collaborating with the South Korean social enterprise Testworks to improve the data processing efficiency and accuracy of its road safety technology, such as its pioneering SVNet deep learning-based software.

One of SVNet’s key features is quickly and accurately identifying hazardous and potentially hazardous road conditions using its deep-learning algorithm. Training SVNet’s deep neural network requires annotated data, and StradVision uses its own Auto Labeling Tool to automatically detect and label 94% of objects at eight times the speed of a human labeler. The remaining 6% requires human intervention, which is where the partnership between StradVision and Testworks comes in; a simplified sketch of this kind of triage appears below.
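StradVision has not published how the Auto Labeling Tool decides which objects need human review, but a common pattern is to auto-accept detections above a confidence threshold and queue the rest for human labelers. The Python sketch below illustrates that pattern with invented names and random stand-in data; it is not StradVision’s implementation.

```python
import random
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Detection:
    """One auto-labeled object: class name, model confidence, and bounding box."""
    label: str
    confidence: float
    box: Tuple[float, float, float, float]  # x, y, width, height in pixels

def auto_label(num_objects: int) -> List[Detection]:
    """Stand-in for an automatic labeler; a real pipeline runs a trained detector."""
    classes = ["car", "pedestrian", "cyclist", "traffic_sign"]
    return [
        Detection(
            label=random.choice(classes),
            confidence=random.random(),
            box=(random.uniform(0, 1920), random.uniform(0, 1080), 64.0, 64.0),
        )
        for _ in range(num_objects)
    ]

def triage(detections: List[Detection], threshold: float = 0.5):
    """Split detections into auto-accepted labels and a human-review queue.

    In production the threshold would be tuned so that the vast majority of
    objects (StradVision cites 94%) clear the bar automatically.
    """
    accepted = [d for d in detections if d.confidence >= threshold]
    review_queue = [d for d in detections if d.confidence < threshold]
    return accepted, review_queue

if __name__ == "__main__":
    detections = auto_label(1000)
    accepted, review_queue = triage(detections)
    print(f"auto-labeled: {len(accepted)} of {len(detections)}")
    print(f"queued for human labelers: {len(review_queue)}")
```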

“I was looking for a company to entrust with manually labeling StradVision’s detail-sensitive data, and Testworks immediately came to mind because of their stellar reputation with data processing. We support Testworks’ social mission and the quality of their work is world-class, so it was an easy decision for us to develop this partnership,” said StradVision CEO Junhwan Kim.

Empowering developmentally disabled individuals
Testworks, a top data processing company in South Korea, handles data labeling for StradVision while also providing inclusive employment opportunities for developmentally disabled individuals, as well as for people who may have difficulty finding work after a hiatus, such as retirees and mothers returning to the workplace.

StradVision and Testworks began “beta testing” their collaboration in 2017, shortly after Testworks ended a successful Autism@Work internship program.

“When I received the phone call from StradVision, I got goosebumps at the great timing,” said Testworks CEO Yoon Seok-won, who realized he could find full-time employment for some of Autism@Work’s successful interns. “It was an exciting opportunity to expand Testworks’ business from software testing to the new area of data processing for AI learning.”

Learning from experience
Testworks refined its operational procedures through trial and error as challenges arose over the course of the “beta” period. It introduced layers of data review and added a project manager to the team who could help communicate with the full-time developmentally disabled employees and ensure the integrity of the data processed for StradVision.

“Autistic individuals tend to have unique sensitivity that allows them to notice small things that non-autistic people would simply overlook, including minute errors in StradVision’s data,” said Yoon.

He found that the project manager, a mother who returned to the workplace, was particularly patient with her colleagues and ensured their work ran smoothly – which was immensely helpful in setting up a positive and accepting work environment.

Combined with the team’s attention to detail, this eventually improved the overall data quality to StradVision’s satisfaction.

Accelerating AV technological advancement
Testworks’ team now processes data from StradVision, correcting any labeling errors and adding important information – ensuring that StradVision’s AI-based software learns from its mistakes and improves over time.

The project allows developmentally disabled individuals to make a significant contribution to the advancement of AV technology by pushing AI-based software to self-correct after the initial stage of human labeling, significantly speeding up the improvement of advanced deep-learning algorithms.

Foretellix’s ALKS Verification

Foretellix announced its new ALKS verification package, the world’s first commercial solution for the newly announced Automated Lane Keeping Systems (ALKS) regulation. The new package includes support for Mobileye’s Responsibility-Sensitive Safety (RSS) model, following a successful joint demonstration of an ALKS safety regulation and compliance flow. The new verification package will help automakers address the regulatory and certification requirements of ALKS automated driving systems.

Foretellix recently released its ADAS & Highway solution, which provides an out-of-the-box, comprehensive verification package covering everything from Level 2 driver-assist technologies to Level 4 highway-focused, fully autonomous systems. With the introduction of the new package, the solution now includes the first commercial implementation of the UNECE’s new ALKS regulation, the world’s first Level 3 regulation.

In a joint demonstration with Mobileye, the ALKS verification package generated the required regulation scenarios and parameters and provided measurable metrics on the behavior of the RSS-controlled vehicle. The package used Mobileye’s RSS model to verify that the tested vehicle does not initiate dangerous situations and responds properly to dangerous situations initiated by other vehicles. These scenarios were varied over a wide range of parameter values, demonstrating that both safety and regulatory requirements are met; a simplified sketch of such a parameter sweep follows.
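Foretellix has not published Foretify’s internals, but the flow described above can be sketched in a few lines of Python: generate variations of a scenario over a parameter grid and check each case against RSS’s minimum safe longitudinal distance. The cut-in scenario, parameter ranges, and dynamic assumptions below are invented for illustration; this is a minimal sketch, not the commercial verification package.

```python
import itertools

def rss_min_gap(v_rear: float, v_front: float, rho: float = 0.5,
                a_accel: float = 3.0, b_min: float = 4.0, b_max: float = 8.0) -> float:
    """RSS minimum safe longitudinal gap (m) between a rear and a front vehicle.

    v_rear, v_front: speeds in m/s; rho: the rear vehicle's response time (s);
    a_accel: its worst-case acceleration during that time (m/s^2);
    b_min: the minimum braking it is guaranteed to apply (m/s^2);
    b_max: the hardest braking the front vehicle might apply (m/s^2).
    """
    v_resp = v_rear + rho * a_accel
    gap = (v_rear * rho + 0.5 * a_accel * rho ** 2
           + v_resp ** 2 / (2 * b_min) - v_front ** 2 / (2 * b_max))
    return max(0.0, gap)

# Sweep a cut-in scenario: another vehicle merges ahead of the ALKS-controlled
# ego vehicle at various speeds and gaps. All parameter values are illustrative.
ego_speeds = [14.0, 17.0, 20.0]          # m/s
cut_in_speeds = [8.0, 11.0, 14.0]        # m/s
initial_gaps = [10.0, 20.0, 30.0, 40.0]  # m at the moment of cut-in

results = []
for v_ego, v_cut, gap in itertools.product(ego_speeds, cut_in_speeds, initial_gaps):
    safe_at_start = gap >= rss_min_gap(v_ego, v_cut)
    results.append((v_ego, v_cut, gap, safe_at_start))

unsafe = [r for r in results if not r[3]]
print(f"{len(results)} scenario variants generated; {len(unsafe)} begin RSS-unsafe")
print("in those cases the ego must brake to restore the RSS gap, e.g.:")
for v_ego, v_cut, gap, _ in unsafe[:3]:
    print(f"  ego {v_ego} m/s, cut-in {v_cut} m/s, gap {gap} m "
          f"(RSS requires >= {rss_min_gap(v_ego, v_cut):.1f} m)")
```

A real verification run would then measure whether the vehicle under test restores the safe gap within its required response, and aggregate pass/fail coverage across the full parameter space.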

“We are excited to show how our Coverage Driven verification approach, combined with Mobileye’s RSS, helps bring the industry closer to the goal of measurable safety,” said Gil Amid, Chief Regulatory Affairs Officer and Foretellix co-founder.

“Regulators will require proof of a vehicle’s ability to avoid reasonably foreseeable and preventable collisions,” said Jack Weast, Vice President of AV standards at Mobileye. “The Foretellix Foretify approach using RSS gives OEMs a way to demonstrate compliance with the most advanced automated driving regulation in the world.”