Autonomous and Self-Driving Vehicle News: Tesla, FedEx, Nuro, Valeo, Navya, TSR, Humanising Autonomy, TuSimple, Pony.ai, Sense Photonics, Hprobe, Plus, Ridecell, Ford & Argo AI

In autonomous and self-driving vehicle news are Tesla, FedEx, Nuro, Valeo, Navya, TSR, Humanising Autonomy, TuSimple, Pony.ai, Sense Photonics, Hprobe, Plus, Ridecell, Ford and Argo AI.

NHTSA Investigates Tesla

NHTSA is investigating 30 Tesla collisions, including 10 fatalities, as of this March. Most of the accidents fall under NHTSA’s Special Crash Investigations program and involve the use of Autopilot.

FedEx Partners with Nuro

FedEx Corp. and Nuro announced a multi-year, multi-phase agreement to test Nuro’s next-generation autonomous delivery vehicle within FedEx operations. The collaboration between FedEx and Nuro launched in April with a pilot program across the Houston area. This pilot marks Nuro’s expansion into parcel logistics and allows FedEx the opportunity to explore various use cases for on-road autonomous vehicle logistics, including multi-stop and appointment-based deliveries. The Nuro pilot is the latest addition to the FedEx portfolio of autonomous same-day and specialty delivery devices.

The exponential growth of e-commerce has accelerated the demand for reliable, autonomous solutions throughout all stages of the supply chain. FedEx believes that continued innovation and automation will improve safety, efficiency, and productivity for the company’s more than 570,000 team members as they continue to move the world forward.

Nuro has been developing and testing its self-driving technology for nearly five years, including on-road deployment in multiple cities and industry-first regulatory approvals. The company has established partnerships with leaders in grocery, restaurant, and pharmacy verticals. This collaboration is a major step for Nuro in entering parcel logistics.

Valeo and Navya

Valeo and Navya have decided to step up their technological and industrial collaboration in the field of autonomous shuttles.

The Navya shuttles (180 units sold as of December 31, 2020), which are operated worldwide, are already equipped with Valeo technologies. The aim is to ramp up the research and development program to build Level 4 autonomous driving systems that can be brought to market within the next three years.

Valeo will provide Navya with the sensors and associated algorithms that enable the vehicle to closely perceive its surroundings, and Navya will share the technical and functional data collected during field trials. At the end of this phase, Valeo will manufacture and supply the selected components so that they can be integrated into the autonomous driving solutions marketed by Navya.

From a technological perspective, the collaboration will focus in particular on the following components: cameras, artificial intelligence software and electronic control units (ECUs).

In terms of driving assistance (ADAS), Valeo has the most extensive portfolio of technologies on the market, all of which are manufactured on a large scale. They include ultrasonic sensors, cameras, radars and the first 3D LiDAR to enter series production and meet the demanding specifications of the automotive industry.

Valeo also provides the brain of the technology – the control unit – which combines and processes the data collected. The control unit maps out a detailed 360° image of the vehicle’s surroundings and uses algorithms to detect objects and provide safety functions.
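As a rough illustration, the corroboration step such a control unit performs can be sketched as a bearing-based clustering of detections. The sensor names, values, and greedy merging below are illustrative assumptions only, not Valeo's actual data formats or algorithms:

```python
# Hypothetical detections: (sensor, range in meters, bearing in degrees
# relative to the vehicle's heading). Sensor names and values are
# illustrative only.
detections = [
    ("front_camera", 42.0, 2.0),
    ("front_lidar", 41.5, 2.5),
    ("rear_radar", 15.0, 178.0),
]

def fuse(detections, bearing_tolerance_deg=5.0):
    """Greedily cluster detections whose bearings nearly coincide,
    averaging range and bearing -- a stand-in for real sensor fusion."""
    clusters = []
    for sensor, rng, brg in sorted(detections, key=lambda d: d[2]):
        for cluster in clusters:
            if abs(cluster["bearing"] - brg) < bearing_tolerance_deg:
                n = len(cluster["sensors"])
                cluster["range"] = (cluster["range"] * n + rng) / (n + 1)
                cluster["bearing"] = (cluster["bearing"] * n + brg) / (n + 1)
                cluster["sensors"].append(sensor)
                break
        else:
            clusters.append({"range": rng, "bearing": brg, "sensors": [sensor]})
    return clusters

objects = fuse(detections)
# The camera and lidar returns near 2 degrees merge into one object;
# the rear radar return stays separate, leaving two fused objects.
```

A production control unit would instead run probabilistic tracking over time, but the core idea of corroborating one sensor's detection with another's is the same.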

This technological, industrial and commercial partnership builds on the longstanding cooperation agreements between the two companies, particularly those concerning the sensors used on the Autonom Shuttle. The applications resulting from the first phase of development are expected in the third quarter of 2022.

TSR and Humanising Autonomy

Together for Safer Roads (TSR) is proud to identify innovative solutions to this problem with its member organizations and partners Anheuser-Busch InBev, Republic Services, and New York City Department of Citywide Administrative Services, by announcing the winner of the inaugural Truck of the Future program: Humanising Autonomy.

Humanising Autonomy will use its computer vision and behavior AI technology to add a nuanced understanding of VRU (vulnerable road user) behavior to fleets. The software will help improve fleet driver visibility and reaction times, making roads safer for vulnerable users. This announcement is especially meaningful following the UN Global Road Safety Week, as the Truck of the Future program is intended to improve fleet safety through cutting-edge technology and, ultimately, save lives.

Waymo Gets $2.5 B Investment

Waymo reported its latest investment round of $2.5 billion, with participation from Alphabet, Andreessen Horowitz, AutoNation, Canada Pension Plan Investment Board, Fidelity Management & Research Company, Magna International, Mubadala Investment Company, Perry Creek Capital, Silver Lake, funds and accounts advised by T. Rowe Price Associates, Inc., Temasek, and Tiger Global. The company will use this latest round of investment to continue advancing the Waymo Driver and to grow its team.

TuSimple Opens New Facility in Dallas-Fort Worth

TuSimple opened a new facility in Dallas-Fort Worth to support the continued expansion of the TuSimple Autonomous Freight Network. The new facility will extend the company’s autonomous operations eastward, and allow for autonomous operations in the Texas Triangle, which includes Dallas, Houston, San Antonio and Austin. The new purpose-built facility not only expands the company’s footprint, but will help meet the growing demand of shippers, carriers and fleets for access to safe, reliable and low-cost autonomous capacity.

The new facility serves as another milestone for TuSimple, placing the company six months ahead of schedule in completing the first of a three-phase plan to build a nationwide autonomous freight network by 2024. Today, TuSimple offers service between Phoenix, Tucson, El Paso, Dallas, Austin, San Antonio and Houston.

Pony.ai Regular Driverless Testing

Silicon Valley-based Pony.ai, a leading autonomous driving company, today announced that it has begun regular and daily fully driverless testing on public roads in Fremont and Milpitas, CA. Along with the recent launch of fully driverless testing in Guangzhou, China, the company has become the first to get fully driverless automated vehicles on public roads in three cities across the world’s two most dynamic mobility markets.

The kick-off is authorized by the previously announced driverless permit from the California Department of Motor Vehicles for a fleet of six driverless vehicles, covering a total operational area of more than 100 square kilometers. Countless technology iterations and numerous driverless readiness evaluations executed by a top-notch team have reinforced this significant milestone. As U.S. cities reopen in phases, Pony.ai looks forward to resuming its Robotaxi service to the public in Irvine, CA this summer, and plans to roll out the fully driverless service to the public in 2022.

“Going completely driverless is key to achieving full autonomy and an indispensable catalyst to realizing our ambitious vision,” said James Peng, CEO and Founder of Pony.ai. “As we continue to grow and scale, we extended our community responsibility from contactless delivery services throughout the pandemic in California last year to fight against the new COVID-19 outbreak in Guangzhou.”

Pony.ai has joined forces with the City of Fremont for over a year to combat COVID-19, including meal kit delivery service to vulnerable communities. Additionally, the company partnered with Yamibuy in Southern California to bring an autonomous and contactless last-mile delivery service to Irvine residents.

“In Guangzhou, a fleet of 14 driverless vehicles transports medical equipment, life supplies, and frontline medical workers to local communities day and night. Pony.ai always holds social responsibility in our hearts and puts community at the center of everything we do,” Peng added.

Sense Photonics Develops Wider Field of View

Sense Photonics, the world’s leading automotive flash lidar solutions provider, announced it has developed a system that delivers high-resolution long-range capability and mid-range capability with a wider field of view (FoV) simultaneously, in a single shot, using a single sensor.

Sense’s new MultiRange™ capability allows a vehicle to detect the road profile, road debris, and lane markings at long distances and also detect traffic in adjacent lanes without requiring multiple sensor heads.

“A comprehensive lidar solution in vehicles must be able to detect objects up to 200 meters directly ahead, while also being aware of cars changing lanes within 50 meters in front of the vehicle,” explained Hod Finkelstein, Sense’s CTO. “Legacy lidar systems can either deliver long range capabilities with fine angular resolution or mid-range imaging with a broad field-of-view, thus requiring OEMs to integrate multiple systems in the vehicle for complete coverage. These legacy solutions increase the overall system cost, complexity, and power consumption. With our unique flash architecture, we can deliver multiple fields of view with stunning clarity, efficiently and without data gaps — with a single sensor, a capability we call MultiRange,” stated Finkelstein.

Sense achieves MultiRange performance through the combination of its proprietary VCSEL laser array, which houses tens of thousands of lasers on a single substrate, and its proprietary CMOS SPAD silicon receiver, which can detect single-photon-level detail with every return. Similar to a car’s headlights, Sense’s emitter technology uses a diffused beam to simultaneously illuminate the entire FoV. Global shutter acquisition allows for a consistently high-resolution point cloud across the FoV without motion artifacts.
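The range-versus-field-of-view tension described above comes down to simple geometry. The object sizes and distances below are illustrative assumptions, not Sense specifications:

```python
import math

def angular_size_deg(width_m, distance_m):
    """Angle subtended by an object of a given width at a given distance."""
    return math.degrees(2 * math.atan(width_m / (2 * distance_m)))

# A 0.2 m piece of road debris at 200 m subtends only ~0.057 degrees,
# so long-range detection demands very fine angular resolution.
debris = angular_size_deg(0.2, 200)

# A vehicle one lane over (~3.7 m lane width) merging 50 m ahead sits
# about 4.2 degrees off-axis, so mid-range coverage instead demands a
# wide field of view.
merge_angle = math.degrees(math.atan(3.7 / 50))
```

Covering both regimes with one fixed-FoV sensor would waste pixels in one case or miss objects in the other, which is the gap a multi-range, single-sensor design aims to close.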

MultiRange capability is currently being tested by leading automotive OEMs and autonomous vehicle solutions providers. “Automotive companies expect high-performance systems but at a price that will scale,” stated Sense CEO Shauna McIntyre. “By delivering ultra-high resolution point clouds across multiple fields of view with a single sensor, we allow companies a comprehensive solution for both short and long-range sensor needs at a fraction of the cost,” explained McIntyre.

Hprobe New ATE

Hprobe, a provider of semiconductor Automated Test Equipment (ATE) for magnetic devices, announced the demonstration of a new 3D magnetic generator design resulting in magnetic field accuracy of less than 5 µT (5 microteslas) for wafer-level probing of 3D angular magnetic sensors. Operating with automated wafer probing stations and external electrical testers, this major breakthrough offers significant throughput performance for testing advanced 3D magnetometers. The system presents high flexibility and compatibility with existing end-user platforms and high-volume manufacturing requirements. It is the latest evolution in Hprobe’s unique patented 3D magnetic field generator technology for single and multi-site testing at wafer level, under a magnetic field.

Advanced magnetic sensors are used for various automotive, consumer and industrial applications to extract positioning, angular, strength and direction information. They sense physical parameters using a magnetic field and transmit electrical responses for further processing. To validate these sensors for end applications, they must be tested under extremely demanding and accurate 3D magnetic fields.

Hprobe’s new 3D magnetic generator design can be integrated into the company’s existing equipment, which consists of a test head with a robotized 3D magnetic generator. It also includes a field calibration and monitoring system. It is built for interfacing with a currently available electrical tester or provided with a full tester as a turnkey solution. It comes with dedicated software for customers to implement their own tests and to generate custom 3D magnetic fields.

Luminar Intros Blade

At its inaugural Studio Day in New York City, Luminar Technologies, Inc (Nasdaq: LAZR) introduced Blade, its vision for the future of design and integration of autonomous technology across robo-taxis, trucking and consumer cars. Luminar also showcased the first consumer vehicle fully integrated with Luminar’s Iris lidar, which is on track for series production with Luminar’s OEM partners, starting in late 2022. The company is kicking off a global customer roadshow this week to demonstrate the performance, capabilities, and design integration of Iris.

Blade: The Autonomous Design Imperative

When it comes to automotive, harmony of form, function and technology is a paramount ideal for consumers and car makers. Blade is a first-ever concept and a powerful design expression of autonomous technology seamlessly integrated into cars, trucks and robo-taxis. It creates a foundation for a new vehicle architecture that automakers can incorporate into vehicle development programs from the onset. The Blade concepts unveiled today for robo-taxi and trucking represent a creative collaboration between Luminar and NewDealDesign, led by acclaimed technology designer Gadi Amit.

“To create the best car design and user experience, autonomous technology must be engineered and designed hand-in-hand from the ground up,” said Jason Wojack, Senior Vice President of Product Development at Luminar who honed his design-meets-engineering sensibility as the chief architect of the Motorola Droid RAZR, which revolutionized the slimness of phones. “Focusing on form and function at not just the lidar-level but the vehicle-level has enabled Luminar to spearhead the design integration of autonomous technology, which is among the fastest design breakthroughs in automotive history.”

Luminar’s Iris is the first autonomous technology designed to marry form and function: it combines performance, auto-grade robustness, scalability, and automotive aesthetic seamlessly. Iris was designed from the beginning to be cleanly integrated into the vehicle roofline, displacing the roof-rack style conglomerates historically seen on autonomous development vehicles and leapfrogging bolt-on products in development.

Luminar unveiled two Blade blueprints, which speak to the unique design and use case requirements for robo-taxi and trucking. Both concepts integrate the sensing technology into the roofline of the vehicle, creating an autonomous “blade.”

Luminar’s Blade Robo-taxi design imagines:

  • A sleek, roomy and inspiring car design for autonomous operation on highways and urban environments
  • Located for best performance, the golden ‘Blade’ runs across the crown of the vehicle, incorporating 4 Luminar lidars for 360-degree coverage
  • Built for consumers as well as ridesharing operations as lines between applications blur

Luminar’s Blade trucking design imagines:

  • A compact and seamless autonomous design integration
  • A three-lidar configuration for long-range sensing in all directions
  • Capability to retrofit onto existing class 8 trucks

Luminar remains focused on being the autonomous technology provider to the automotive industry. The new Blade designs give Luminar partners a reference for incorporating Luminar’s technology into their future vehicles and demonstrate the company’s commitment to delivering not only leading-edge technology but also beautiful design integration.

Plus Partners with Good Machine

Plus (formerly Plus.ai), a global provider of self-driving truck technology, announced that they are partnering with Schmidt Futures-supported Good Machine venture studio on a pilot program to extend the sustainability impact that each company is working towards. Through this pilot, Plus’s automated trucks will help move equipment used for Good Machine’s sustainability efforts to address wildfires, food insecurity, illicit wildlife poaching, and illegal fishing. The partnership kicks off immediately with Plus’s autonomous truck hauling equipment from Winnemucca, Nevada to South San Francisco, California to be used for a wildfire detection project in California. Additional hauls will take place over the next year of the pilot program.

A key benefit of using Plus’s autonomous driving technology to haul goods on a semi-truck is a reduction in fuel consumption of 10% compared to the most efficient driver, which results in an equivalent decrease in carbon emissions. While Plus’s technology is poised to have a dramatic impact on the $4 trillion global truck freight market, each truckload is an opportunity to reduce the burden of trucking on the environment.
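A back-of-the-envelope calculation shows what a 10% fuel saving means per haul. The trip length and truck fuel economy below are assumptions for illustration; the CO2 factor is the standard EPA estimate for diesel:

```python
# Assumed haul length and class-8 fuel economy, for illustration only;
# ~10.2 kg CO2 per gallon of diesel is the standard EPA estimate.
haul_miles = 500
mpg = 6.5
co2_per_gallon_kg = 10.2

baseline_gallons = haul_miles / mpg
saved_gallons = 0.10 * baseline_gallons           # the 10% reduction above
saved_co2_kg = saved_gallons * co2_per_gallon_kg
# roughly 7.7 gallons of diesel and ~78 kg of CO2 avoided per truckload
```

Multiplied across a fleet running many hauls per day, even single-digit percentage savings compound into meaningful emissions reductions.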

“Sustainability is part of the core mission for both Plus and Good Machine. By joining forces, we are creating a win-win-win for our companies and the environment. Plus is delighted to serve as an enabler of Good Machine’s inspiring sustainability efforts by using our fuel-efficient autonomous trucks to transport the equipment and supplies needed to launch these projects,” said Shawn Kerrigan, COO and Co-founder at Plus.

Good Machine has a broad portfolio of projects addressing globally devastating issues caused by climate change and marine pollution. It includes ReefGen, an underwater, dexterous, planting robot that is reviving marine ecosystems and coral tourism around the world. Fresure is a shipping container outfitted with solar panel energy to keep perishable foods cold during handling and storage, which reduces post-harvest losses and increases the available food supply to address food insecurity. The wildfire detection project aims to use stratospheric balloon technology to detect fires early and report them to relevant authorities to help reduce catastrophic damage. Good Machine and its portfolio companies work with a number of partners, including Johns Hopkins University, Minderoo Foundation, National Science Foundation, the Nature Conservancy, Schmidt Futures, Wildlife Conservation Society, WorldFish, and others.

Investment in Ridecell

Woven Capital, L.P. announced that it has made an investment in Ridecell Inc., a leading platform powering digital transformations and IoT automation for fleet-driven businesses. Woven Capital is an $800 million global investment fund that supports innovative, growth-stage companies in mobility, automation, artificial intelligence, data and analytics, connectivity, and smart cities. It is the investment arm of the Woven Planet Group, a Toyota subsidiary which is dedicated to building the safest mobility in the world. Along with the investment, Ridecell and the Woven Planet Group will explore collaborative opportunities in mobility service operations.

Ridecell’s automation and mobility platform allows businesses to create a unified view of their vehicle fleets from deployed telematics, fleet management, and internal systems and use the consolidated insights to automate business and operational workflows. The platform furthers digital transformation and turns data overload into automated and efficient business operations.

Ford & Argo AI Issues Report

Back in 2018, Ford issued a comprehensive report to the U.S. Department of Transportation (DOT) that outlined the way we approach self-driving vehicle development. A lot has changed in the years since, but our purpose has remained rock solid: to use autonomous technology to help make people’s lives better by providing a safe, trusted and affordable mobility solution.

Since we released that first safety report, we have continued bringing together all the complex pieces needed to launch a self-driving service. In addition to working with Argo AI to advance the development of a robust Automated Driving System to guide our vehicles on the road, we’ve continued to research and develop an exceptional customer experience, our fleet management capabilities, our behind-the-scenes transportation-as-a-service software and more.

To capture these developments and maintain our goal of transparency, we’ve recently updated our voluntary safety self-assessment, A Matter of Trust 2.0, and shared it with the U.S. DOT. Here’s a sampling of the advancements we’ve made in the last two and a half years, which you can read about in detail in the report.

New launch markets: In addition to Miami, Ford announced plans to launch its self-driving service in Washington, D.C. and Austin, Texas. In all three of these cities, we’ve established robust testing and business operations, including terminals and command centers that will allow us to manage our fleet of vehicles as they transport people or deliver goods.

The great Escape: Our newest self-driving test vehicles are built on the Escape Hybrid platform, taking advantage of increased electrification capabilities and featuring the latest in sensing and computing technology. The Escape — with a modified exterior and interior designed to deliver a great customer experience — is also the type of vehicle we will initially launch our service with.

Expanded testing and pilots: Alongside testing in Miami, Austin and Washington, D.C., Argo AI continues to test the Automated Driving System in Detroit, MI; Pittsburgh, PA; and Palo Alto, CA — what may be the largest, most-diverse active urban-testing footprint of any self-driving vehicle developer in the U.S. We have also begun integrating our self-driving test vehicles directly into our business pilots, giving us real-world insights into what is required to run an efficient self-driving business.

Continued Collaboration with Cities: Every city has unique transportation needs, and we remain committed to continuing to work closely with the cities where we operate. We want to be a part of the city’s transportation system and provide a service that helps make people’s lives better. An example of our collaboration is our Ford-designed smart infrastructure in Miami. We worked closely with officials at the city, county and state level to begin researching complex intersections. While our self-driving vehicles will be fully capable of safely navigating the streets on their own, we are looking at how we can provide self-driving vehicles with even more information before they approach busy or tricky intersections, giving them additional context about the activity ahead.

COVID impact: The global pandemic has had an effect on nearly everything we do, and our self-driving business is no exception. Now more than ever, people are looking for a safe and sanitary environment to interact with, and we’re working hard to help ensure the right processes are in place to meet that expectation whether customers are hailing a ride or receiving a delivery.