Autonomous and Self-Driving Vehicle News: Robotics, Timelines & Reports

This week, like every week, brings more autonomous and self-driving vehicle news, with many new announcements of products, timelines, and reports. Companies featured include Ouster, Outsight, NVIDIA, Brodmann17, Horizon Robotics, Aptiv, NACTO, TuSimple, and PTOLEMUS Consulting Group.

Ouster Works it Out with NVIDIA

Ouster, a leading provider of high-resolution lidar sensors for autonomous vehicles, robotics, and mapping, is working with NVIDIA to provide lidar sensors for use in Level 3 to Level 5 autonomous driving systems being developed by major global OEMs targeting production in 2022. The lidar perception system, based on the high-resolution Ouster OS2 lidar, runs on the NVIDIA DRIVE AGX platform. NVIDIA’s full-stack software delivers 360-degree sensor processing, mapping, and path planning.

In anticipation of advanced series vehicle production in 2022, NVIDIA will use Ouster lidar sensors for development as it works with OEMs to bring safe, reliable autonomous vehicles to market. Ouster lidar sensors satisfy multiple use cases across commercial autonomous vehicle applications utilizing the NVIDIA DRIVE platform, including public transportation, freight, refuse and recycling collection, construction, mining, and more.

Brodmann17’s Deep Learning Training Platform

Brodmann17, a leading Tier 2 automotive software supplier, announced the launch of the world’s first automated deep learning training platform, specifically designed for automotive-grade ADAS/AD solutions. The platform seamlessly trains and deploys deep learning neural network models for ADAS/AD solutions, automating a process that cannot be done at scale manually. The platform protects against human error and lowers associated risks, while reducing time to market and costs. Additionally, the platform solves a major industry pain point by enabling automakers and Tier 1 suppliers to collaborate with AI companies in the process of developing neural networks.

Brodmann17 developed this comprehensive platform to optimize the entire neural network training process: the data selection, parameter tuning, neural network architecture search (NAS) and deployment to target embedded processors for runtime benchmark and error analysis. By optimizing the process as a whole, as opposed to optimizing each stage separately, a much better neural network can be achieved to meet customers’ given requirements. The platform was initially developed as an internal neural network production line to scale Brodmann17’s operations and improve the highly complex process of training and deploying neural networks. The platform will now be available for use by select Brodmann17 customers.
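The end-to-end optimization Brodmann17 describes can be sketched as a single pipeline that scores data selection, hyperparameter tuning, and architecture search jointly, then checks the winner against an embedded latency budget. Everything below is an illustrative assumption — the stage names, scoring proxy, and latency figures are invented, not the actual platform:

```python
# Hypothetical sketch of an automated training pipeline of the kind described:
# data selection -> joint parameter/architecture search -> on-target benchmark.
from dataclasses import dataclass


@dataclass
class Candidate:
    arch: str
    params: dict
    score: float = 0.0


def select_data(raw_frames: list) -> list:
    """Keep only 'hard' frames (even ids stand in for informative samples)."""
    return [f for f in raw_frames if f % 2 == 0]


def tune_and_search(data: list, archs: list) -> Candidate:
    """Score (architecture, learning-rate) pairs together, not stage by stage."""
    best = None
    for arch in archs:
        for lr in (1e-2, 1e-3):
            # Toy proxy score: more data helps, smaller nets win on embedded targets.
            score = len(data) * lr * (2.0 if arch == "tiny" else 1.0)
            cand = Candidate(arch, {"lr": lr}, score)
            if best is None or cand.score > best.score:
                best = cand
    return best


def benchmark_on_target(cand: Candidate, latency_budget_ms: float) -> bool:
    """Stand-in for deploying to the embedded processor and timing inference."""
    latency_ms = {"tiny": 8.0, "large": 30.0}[cand.arch]
    return latency_ms <= latency_budget_ms


def run_pipeline(raw_frames, archs, latency_budget_ms=16.0):
    data = select_data(raw_frames)
    cand = tune_and_search(data, archs)
    return cand, benchmark_on_target(cand, latency_budget_ms)


cand, fits = run_pipeline(list(range(100)), ["tiny", "large"])
print(cand.arch, fits)  # → tiny True
```

The point of the sketch is the shape of the optimization: because the final score and the deployment check share one loop, a network that tunes well in isolation but misses the runtime budget never wins.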

Deployed on different clouds, the automated platform enables OEMs, Tier 1 suppliers and any other Brodmann17 customer to upload their data and run the training process without exposing training videos to the outside. In doing so, the platform protects customers’ private data.

Horizon Robotics 2nd Gen AI

Horizon Robotics, a leader in edge AI computing, unveiled its second-generation automotive AI processor, Horizon Journey™ 2, to the international public. Founded in 2015, Horizon Robotics has enabled Tier 1s and OEMs across North America, Europe and Asia to develop advanced deep-learning solutions that power the future of the automotive industry.

Journey™ 2 is one of the first AEC-Q100 compliant deep learning compute options available to Tier 1s and OEMs looking to enable next-generation ADAS capabilities and intelligent cockpit experiences with limited power consumption and excellent performance and efficiency.

The processor features Horizon Robotics’ proprietary, highly efficient programmable deep learning computation cores (BPU), a dual-core Arm Cortex-A53, and a dedicated image signal processor. Thanks to the dedicated BPU cores, which occupy 60% of the silicon footprint, Journey™ 2 delivers 4 TOPS of deep learning performance at a typical power consumption of 2 W. It is built on TSMC’s stable 28nm HPC+ process.

Journey™ 2 can process 4K video input at 30 frames per second and handle parsing of more than 23 semantic categories, 2D and 3D detection of hundreds of objects, distance and speed estimation, and other key perception features to meet Euro NCAP 2022 requirements. Horizon Robotics also offers customers highly optimized, production-ready perception software for ADAS and autonomous driving as an option for turnkey perception solutions based on Journey™ 2.
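To put those figures in perspective, a quick back-of-the-envelope budget can be derived from the numbers quoted above (4 TOPS peak, 4K input at 30 fps, roughly 2 W typical). The derived values are our own arithmetic, not vendor specifications:

```python
# Compute budget implied by the quoted Journey 2 figures (illustrative arithmetic only).
peak_ops_per_s = 4e12          # 4 TOPS peak deep learning throughput
fps = 30                       # quoted 4K frame rate
pixels_4k = 3840 * 2160        # pixels per 4K frame
typical_power_w = 2.0          # quoted typical power draw

ops_per_frame = peak_ops_per_s / fps        # compute available per frame
ops_per_pixel = ops_per_frame / pixels_4k   # upper bound if every pixel is processed
tops_per_watt = 4.0 / typical_power_w       # efficiency at typical power

print(f"{ops_per_frame / 1e9:.0f} G-ops/frame, "
      f"{ops_per_pixel:.0f} ops/pixel, {tops_per_watt:.1f} TOPS/W")
```

Roughly 133 G-ops per frame and 2 TOPS/W — the kind of per-frame budget that makes multi-category semantic parsing at 4K plausible within a 2 W envelope.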

Along with Journey™ 2, Horizon Robotics is also launching the Horizon OpenExplorer AI Toolkit, which allows customers to deploy custom deep learning networks on Journey™ 2 to leverage its powerful compute performance. The OpenExplorer AI Toolkit also includes design examples and best practices to significantly improve time-to-market for customers.

Journey™ 2 has already attracted many lead customers. SK Telecom, South Korea’s leading telecommunications company, is currently equipping tens of thousands of vehicles with Journey™ 2 to enable crowd-sourced HD map updates as well as ADAS functions, paving the way toward safer and more automated driving. Several leading automotive OEM and Tier 1 customers around the world have also awarded design wins to Journey™ 2, with production vehicles equipped with the processor expected to begin mass production as early as the second quarter of 2020.

Aptiv and the Four Phases of Autonomous Driving

Aptiv wrote on its blog that it has come a long way down the road toward fully autonomous vehicles. Take, for example, its partnership with Lyft in Las Vegas: its autonomous cars successfully completed 70,000 rides in their first year of service, racking up a nearly perfect 4.95 user rating along the way.

As this momentum builds — both within Aptiv and the automotive industry as a whole — the company sees a four-phased road map leading to a world of fully autonomous vehicles, zero emissions, and, ultimately, zero traffic fatalities. A closer look at each phase in the timeline underscores how Aptiv is uniquely positioned to help its customers not just reach each milestone successfully, but move on to the next one far faster than they may think possible.

For Aptiv, getting a solution that works 95% of the time is a relatively small part of the effort; making sure you have a technical foundation that enables you to solve the last 5% is where the real value lies at this phase.

Let’s jump ahead to 2022, when small but viable autonomous driving programs not requiring a safety driver make their debut. Of course, these programs will take place in limited geo-fenced areas and under the strictest of operating conditions, but within these carefully defined areas, we will see fully autonomous, driverless vehicles helping people get from one place to another, safely and with minimal impact on the environment.

The pace of innovation in autonomous vehicles is driving us toward much broader adoption as early as 2025. Operational Design Domains (ODDs), which identify the specific operating conditions in which fully autonomous functionality is designed to operate properly, will expand greatly in this phase.

Don’t blink. 2030 marks the year that both software and hardware will be robust enough to handle the corner cases that, before then, made viable personal vehicle application impossible. That’s right, in a little more than a decade, we will see OEM vehicles coming off the assembly line fully configured for conditional automated driving.

NACTO Policies for AVs

(Toronto) The National Association of City Transportation Officials (NACTO) today released the second edition of the Blueprint for Autonomous Urbanism, focusing on the near-term policies and decisions that are necessary for autonomous technologies to improve transportation outcomes, rather than lead to an overall increase in driving, greenhouse gas emissions, and diminished public space in cities. Developed with a steering committee from NACTO’s 81 member cities and transit agencies, and based on the sweeping vision for city streets of the future in the first edition of the Blueprint, the second edition details the concrete steps that will need to be taken to ensure an equitable, people-first city.

Key policy areas include:

  • Transit: Cities should commit to prioritizing high-quality on-street transit, and use technologies available today, such as computer-aided dispatch and automatic vehicle location systems, to improve efficiency and create services that attract riders. Future automated technologies, already in use in many rail systems, can enable transit agencies to expand service, serving more people for the same operating cost.
  • Pricing: Autonomous vehicles could make driving easier and cheaper than it is today. Increased mobility will provide multiple benefits but, absent policy mechanisms and incentives to encourage people to use the most efficient modes, traffic and pollution, already at crippling levels in many cities, will continue to increase. Congestion pricing today, paired with new technologies that could allow governments to gauge traffic in real time and accurately price travel demand, is crucial to influencing travel behavior.
  • Data: As an increasing amount of data is collected in the public realm, ensuring that data is appropriately collected and protected by cities and companies alike is an increasingly urgent concern. With a coordinated approach to enhancing asset data, using open data specifications, and updating data management policies, public agencies can ensure access to crucial planning data while strengthening privacy protections for individuals.
  • Urban Freight: Cities must develop sophisticated urban freight policies that group deliveries to reduce the number of freight trips and increase efficiency and safety. While autonomous technology could make long-distance freight movement much more efficient, these technologies could also be detrimental within cities, flooding streets, sidewalks, and airspace with bots and drones. Downsizing vehicles, managing the curb, and keeping human workers for the last 50 feet of deliveries will be essential to managing the increasing volumes of goods movement within cities.

The second edition of the Blueprint for Autonomous Urbanism also builds on the street designs envisioned in the first edition, outlining designs, policies, and tools to enable safe, frequent crossings, more sustainable and efficient use of the street, and a more vibrant pedestrian realm. It also includes more detailed sections on the actions individual city departments, from IT to fleet services to parking authorities to employment and administrative services, should take to prepare as automated technology becomes more mature. Traditional and emerging areas of authority, across various levels of governments, are explored, as are the effects of AVs on employment and labor in cities.

Development of the Blueprint for Autonomous Urbanism was supported by Bloomberg Philanthropies and ClimateWorks Foundation. It is available as a free download at nacto.org/blueprint.

TuSimple Funded

TuSimple, a global autonomous driving company, announced today that it has secured commitments for an additional $120 million in Series D funding. These funds are part of an extended Series D round that was oversubscribed and reached a total of $215 million. The round includes previously announced investor UPS, the global logistics leader. New participants in the round include CDH Investments and leading automotive tier-1 supplier Mando Corporation.

Outsight 3D Semantic Camera

Outsight, a new entity formed from Dibotics — a pioneer in smart machine perception and developer of real-time processing solutions for 3D data — launched its new technology at AutoSens, the leading conference on vehicle automation and self-driving cars. Outsight revealed its 3D Semantic Camera, which takes world perception and comprehension to new levels for autonomous driving and other industries.

Outsight’s 3D Semantic Camera will not only bring full situation awareness and new levels of safety and reliability to currently human-controlled machines such as Level 1–3 ADAS (Advanced Driver Assistance Systems), construction and mining equipment, and helicopters, but will also accelerate the emergence of fully automated smart machines such as Level 4–5 self-driving cars, robots, drones, and autonomous flying taxis.

The technology is the first of its kind intended to provide full situation awareness in a single device. It is a mass-producible, all-in-one solution with the ability to simultaneously perceive and comprehend the environment from hundreds of meters away, including the key chemical composition of objects (skin, cotton, ice, snow, plastic, metal, wood…).

This is partly made possible through the development of a low-powered, long-range, eye-safe broadband laser that allows material composition to be identified through active hyperspectral analysis. Combined with its 3D SLAM on Chip® capability (Simultaneous Localization and Mapping), Outsight’s technology can unveil the full reality of the world in real time.

Outsight’s 3D Semantic Camera provides actionable information and object classification through its onboard SoC (system on a chip) without relying on machine learning, which reduces power consumption and bandwidth requirements. This approach eliminates the need for massive training data sets, and guesswork is replaced by actually measuring the objects. Being able to determine the material of an object adds a new level of confidence about what the camera is actually seeing.
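Classifying by measurement rather than by learning can be illustrated with a simple spectral-matching sketch: a measured reflectance vector is compared against a small library of reference material signatures and assigned to the nearest one. The wavelength bands and reflectance values below are invented for illustration — Outsight’s actual spectral bands and matching method are not public here:

```python
# Toy material classification by spectral signature matching (no learned model).
import math

# Hypothetical reflectance values at three near-infrared bands, per material.
SIGNATURES = {
    "skin":  (0.45, 0.30, 0.20),
    "ice":   (0.10, 0.05, 0.02),
    "metal": (0.85, 0.80, 0.78),
}


def classify(measured):
    """Return the material whose reference signature is nearest (Euclidean)."""
    return min(SIGNATURES, key=lambda m: math.dist(SIGNATURES[m], measured))


print(classify((0.12, 0.06, 0.03)))  # → ice
```

No training data is involved: adding a new material means adding one measured signature to the table, which is part of why such an approach can run cheaply on an embedded SoC.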

It’s able to not only see and measure, but comprehend the world, as it provides the position, the size and the full velocity of all moving objects in its surroundings, providing valuable information for path planning and decision making.

The 3D Semantic Camera can provide important information regarding road conditions and can, for example, identify black ice and other hazardous road conditions. This feature is vital for safety in ADAS systems for example. The system can also quickly identify pedestrians and bicyclists through its material identification capabilities.

Outsight has already started joint development programs with key OEMs and Tier1 providers in Automotive, Aeronautics and Security-Surveillance markets and will progressively open the technology to other partners in Q1-2020.

PTOLEMUS Report: Sensor “Cocktail” with Lidar to Cost Over $10,000

A new study by PTOLEMUS Consulting Group found that most OEMs will use a “cocktail” of technologies, with LiDAR being key, to achieve full vehicle automation. This is instead of relying entirely on radars and cameras for vision, which is an approach championed by companies such as Tesla.

PTOLEMUS predicts that, despite significant decreases, the cost for OEMs to achieve safe operation of full autonomy (SAE level 4) will still exceed $10,000 per vehicle in 2022, making it near-impossible to launch fully autonomous private cars below the $100,000 price tag, and prohibiting mass roll-out.

PTOLEMUS expects robotaxis to be launched in 2021, as their 24/7 driverless operation capability will allow ride hailing operators such as Waymo to recoup the investment. Then, the mass market will see the introduction of L4 tech such as ‘Automated Valet Parking’ and ‘Highway Drive’ in premium models.

Frederic Bruneteau, PTOLEMUS’ Managing Director, stated: “Two years ago, most OEMs were adamant that sensors and AI would suffice. But high profile accidents have pushed the safety imperative, requiring extra layers of redundancy. We predict that a ‘good enough’ approach to automation will never be authorised by regulators worldwide.”

The Autonomous Vehicle Technology & Supplier’s Global Study is the world’s first cross-technology, cross-supplier assessment of the AV industry. It leverages research of 80 tech companies and interviews with more than 20 critical vendors and answers such questions as:

  • Which technologies will be used by OEMs?
  • Who is the world’s foremost autonomous technology supplier?
  • How will AV technologies evolve?
  • What will be the cost evolution of key sensors, maps and components?
  • When and how will self-driving cars be available?

Read all autonomous vehicle news.

SUBSCRIBE

You are welcome to subscribe to receive emails with the latest Autonomous Self-Driving Driverless and Auto-Piloted Car News. You can also get weekly news summaries or midnight express daily news summaries.