Autonomous & Self-Driving Vehicle News: Velodyne, Sensor Fusion, Foresight, NVIDIA, Luminar & Daimler Trucks

In autonomous and self-driving vehicle news are Velodyne, sensor fusion, Foresight, NVIDIA, Luminar and Daimler Trucks.

Velodyne Intros Velarray H800

Velodyne Lidar, Inc. debuted the latest innovation in its broad array of lidar sensors, the Velarray H800. The solid-state Velarray H800 is architected for automotive-grade performance and built using Velodyne’s breakthrough proprietary micro-lidar array (MLA) architecture. With combined long-range perception and a broad field of view, the sensor is designed for safe navigation and collision avoidance in advanced driver assistance systems (ADAS) and autonomous mobility applications. The Velarray H800’s compact, embeddable form factor is designed to fit neatly behind the windshield of a truck, bus or car, or to be mounted seamlessly on the vehicle exterior. The Velarray H800 will be available at high-volume production levels with a target price of less than $500 to drive broad adoption in consumer and commercial vehicle markets.

The Velarray H800 boasts outstanding range, field of view and resolution to support advancements in autonomy and ADAS, from Level 2 to Level 5. This spans the entirety of ADAS features, from Lane Keep Assist and Automated Emergency Braking all the way to the top levels of automated driving. The compact, low-cost sensor can be paired with Velodyne’s Vella software suite, enabling the full spectrum of safety features.

Sensor fusion is an essential prerequisite for self-driving cars, and one of the most critical areas in the autonomous vehicle (AV) domain. It has now become clear that a common agreement or standard can help reduce liability risks and the risk of misdirected development. OEMs and Tier 1 and Tier 2 suppliers could also benefit from more efficient collaboration. This is what renowned industry experts discussed at the latest virtual Chapter Event by The Autonomous and co-host BASELABS.
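At its simplest, sensor fusion means combining independent, noisy measurements of the same quantity so the fused estimate is more reliable than any single sensor. A minimal illustrative sketch (not any vendor's implementation) is inverse-variance weighting of two Gaussian range measurements, e.g. from a hypothetical lidar and radar:

```python
# Minimal sensor fusion sketch: fuse two independent Gaussian
# measurements of the same range by inverse-variance weighting.
# Sensor names and noise values are purely illustrative.

def fuse_measurements(z1: float, var1: float, z2: float, var2: float):
    """Fuse two noisy measurements; returns (estimate, variance).

    The fused variance is always smaller than either input variance,
    which is the basic payoff of fusing sensors.
    """
    w1 = 1.0 / var1  # weight = inverse variance (more precise -> heavier)
    w2 = 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Example: lidar reads 50.2 m (var 0.01), radar reads 50.8 m (var 0.25).
# The fused estimate lands between the two, pulled toward the lidar.
estimate, variance = fuse_measurements(50.2, 0.01, 50.8, 0.25)
```

Production fusion stacks (e.g. multi-object trackers) build on this same principle, typically via Kalman filtering over full object states rather than a single scalar.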

“Sensor fusion flaws can jeopardize the safety of a self-driving vehicle’s entire system,” said Chairman Ricky Hudi after The Autonomous’ Chapter Event. “The state of sensor data fusion does not yet provide a reliable basis for safe AVs – more has to be done. Industry-wide collaboration could help lift advanced driver assistance systems, and later autonomous driving, to the necessary safety level.”

“We must make automated vehicles safe under all conditions,” says Robin Schubert, Co-founder and Managing Director of BASELABS. “Therefore, we support The Autonomous: we believe in cooperation in the industry, want to exchange best practices and experiences, and work together to design reference solutions that can later be adopted into standards. We see a clear need for an agreement on how we can prove that sensor data can be reliable.”

Standardization of data fusion was a key discussion point among the experts who also took questions from the audience.

Cornelius Bürkle, Research Scientist at Intel Corporation, was convinced that safety should not be part of competition. From his perspective, industry and organizations should agree on which models in system architectures are needed and what degree of safety they should cover. This could help decide on the trade-off between the availability of sensor data and safety.

Carlo van Driesten, Systems Architect for Virtual Test & Validation at BMW, explained during the discussion that standardization should not prevail over innovation, but that open standards can inspire and support it. In his talk, he presented the ASAM Open Simulation Interface (OSI), published by BMW as an open source project on GitHub, as a best-practice example. The challenge now is to further develop the ASAM OpenSCENARIO standard.

According to Alexander Scheel, Sensor Fusion Engineer for Automated Driving at Bosch, there is consensus within individual research communities on KPIs, for example, but little or no exchange across disciplines such as classical tracking and the growing machine-learning community. Bert Auerbach, Vice President at FDTech, mentioned that in the area of testing, requirements are often not fully known to suppliers; he suggested that the Operational Design Domain (ODD) for the model must be defined specifically. Ronny Cohen, CEO of LeddarTech, said that because there is no SAE Level 3/4 system yet, it might be too early to talk about standards for those levels, but the industry is getting closer and is probably near to standards for SAE Level 2 systems.

Marcus Obst, Head of Business Development at BASELABS, presented possible benefits for OEMs and Tier 1 and Tier 2 suppliers of a standardized data fusion architecture.

Standards could make supplier offers comparable for OEMs and help formalize the bidding process. Suppliers, in turn, would need far less tailored engineering of existing components for input and output interfaces. A data fusion architecture standardized once under ISO 26262 and ISO/PAS 21448 would eliminate the need for repeated standardization efforts by multiple stakeholders. Already applicable to SAE Level 2 and 2+ systems, this could be a first step to build on for higher SAE levels.
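To make the interface argument concrete: the engineering savings come from suppliers and OEMs agreeing on the shape of the messages flowing into and out of the fusion layer. The sketch below shows what such an object-level detection interface might look like; all field names are hypothetical illustrations, not drawn from ISO 26262, ISO/PAS 21448, or any published standard:

```python
# Hypothetical sketch of a standardized object-level detection message
# exchanged between sensor suppliers and a fusion stack. Field names
# are illustrative only -- no actual standard defines this schema.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ObjectDetection:
    timestamp_us: int            # measurement time, microseconds
    sensor_id: str               # which sensor produced the detection
    position_m: Tuple[float, float]   # (x, y) in the vehicle frame, metres
    velocity_mps: Tuple[float, float] # (vx, vy), metres per second
    covariance: List[float]      # flattened 4x4 state covariance
    existence_prob: float        # probability the object is real

def validate(det: ObjectDetection) -> bool:
    """A supplier-agnostic sanity check on an incoming detection.

    With an agreed schema, every stakeholder can run the same checks
    instead of writing adapter code per supplier.
    """
    return 0.0 <= det.existence_prob <= 1.0 and len(det.covariance) == 16
```

The point of the sketch is the workflow, not the fields: once the message layout is fixed, validation, logging, and safety argumentation can be shared across OEMs and suppliers.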

Read more about the further discussion on data fusion at this Chapter Event on The Autonomous website.

Foresight Autonomous Integrates QuadSight with NVIDIA Jetson Xavier

Foresight Autonomous Holdings Ltd., an innovator in automotive vision systems, announced that it has completed integration of its QuadSight® software on the NVIDIA® Jetson AGX Xavier™ platform, making it suitable for shuttles, agriculture, heavy equipment machines and more.

Moreover, Foresight is now part of NVIDIA Inception, an acceleration platform that offers go-to-market support, expertise, and technology assistance to artificial intelligence (AI) and data science startups transforming industries using NVIDIA graphics processing unit (GPU) accelerated solutions. Foresight also continues its software integration efforts with the NVIDIA DRIVE platform to help accelerate development of autonomous vehicles (AVs).

The NVIDIA Jetson AGX Xavier platform enables Foresight to run its stereoscopic obstacle detection software, composed of both a visible-light and an infrared channel, designed to provide accurate information about any object in harsh weather and lighting conditions. The NVIDIA platform also allows Foresight to perform fusion between the visible-light and infrared channels, generating an accurate depth map while reducing false alerts. Jetson AGX Xavier offers outstanding AI capabilities with big workstation performance in a small form factor (32 trillion operations per second at 30 watts), making it ideal for autonomous machines such as industrial and heavy equipment vehicles.
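One intuition for why fusing a visible-light channel with an infrared channel reduces false alerts is cross-channel confirmation: an obstacle is only reported when both channels agree. The sketch below illustrates that idea in simplified form; it is a hypothetical illustration, not Foresight's QuadSight algorithm, and all thresholds are invented for the example:

```python
# Illustrative cross-channel confirmation, NOT Foresight's actual
# QuadSight algorithm. Requiring agreement between a visible-light
# channel and an infrared channel suppresses single-channel false
# positives (e.g. glare in the visible channel at night).

def confirm_obstacle(vis_conf: float, ir_conf: float,
                     vis_range_m: float, ir_range_m: float,
                     conf_thresh: float = 0.5,
                     max_range_gap_m: float = 2.0) -> bool:
    """Raise an alert only when both channels detect the obstacle
    confidently AND agree on its range."""
    both_confident = vis_conf >= conf_thresh and ir_conf >= conf_thresh
    ranges_agree = abs(vis_range_m - ir_range_m) <= max_range_gap_m
    return both_confident and ranges_agree
```

A real system would perform this agreement check per pixel or per region of the fused depth map rather than per scalar detection, but the false-alert-suppression logic is the same.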

Foresight also uses NVIDIA DRIVE, an open and scalable AV hardware and software platform that spans from the cloud to the car and back. This end-to-end software-defined AI solution allows Foresight to further develop its technology for passenger vehicles and offer enhanced perception, localization and 3D mapping capabilities that are key for safe autonomous and highly automated driving.

Luminar Tech Partners with Daimler Truck AG

Luminar Technologies, Inc., the global leader in automotive lidar hardware and software technology, and Daimler Truck AG, the world’s largest commercial vehicle manufacturer, announced a strategic partnership to enable highly automated trucking, starting on highways. Experts at Daimler Trucks, its U.S. subsidiary Daimler Trucks North America (DTNA), and Torc Robotics, part of Daimler Trucks’ Autonomous Technology Group, are collaborating with Luminar’s experts in pursuit of a common goal: bringing series-produced highly automated trucks (SAE Level 4) to roads globally. The teams will work closely together to enhance lidar sensing, perception and system-level performance for Daimler trucks moving at highway speeds. To strengthen the partnership, Daimler Trucks has acquired a minority stake in Luminar.

The autonomous trucks are expected to yield dramatic improvements in the efficiency and safety of logistics, with an initial focus on long-haul highway routes. This constrained application of autonomy allows the technology to be commercially deployed in series production on a nearer-term time frame than urban autonomous driving.