Autonomous and Self-Driving Vehicle News: dRISK, AImotive, Nextchip, Local Motors, Velodyne, Ansys, Continental & Mobileye

In autonomous and self-driving vehicle news are dRISK, AImotive, Nextchip, Local Motors, Velodyne, Ansys, Continental and Mobileye.

dRISK Detects High-Risk Events for Autonomous Vehicles

dRISK, a London- and Pasadena-based startup that has until now operated in stealth, announced its launch and that it has – for the first time commercially – employed its edge-case retraining tool to achieve a 6x improvement in the time to detect high-risk events for autonomous vehicles (AVs).

Whereas semi-autonomous and autonomous vehicles currently do not always detect high-risk events in time to react to them (oncoming cars peeking into the lane from behind other vehicles, vehicles running red lights while concealed by other cars), dRISK’s tools for retraining AVs to recognize edge cases represent a dramatic step toward retraining autonomous vehicles to outperform humans at even the trickiest driving scenarios. The results were formally presented at NVIDIA’s GTC conference on April 12, 2021.

dRISK’s mission is to help make AVs dramatically safer as soon as possible. The new patented knowledge graph technology (analogous to Google’s knowledge graph of the internet, but in dRISK’s case a knowledge graph of real-world events) solves a number of problems that have plagued AV developers so far: encoding massively high-dimensional data from all the relevant data sources into a tractable form, and then offering the full spectrum of edge cases so that AVs can be retrained not just on what has already happened but on what will happen in the future.

dRISK delivers simulated and real-plus-simulated edge cases in semi-randomized, impossible-to-game training and test sequences, to achieve superior testing and retraining results for customers on real-life data. Unlike traditional training and development techniques, in which AVs are trained to recognize primarily entire vehicles and pedestrians under advantageous lighting conditions, with dRISK’s edge cases AVs are trained to recognize just the predictors of high-risk events (e.g. the headlights of an oncoming car peeking into the lane in low visibility). AV systems trained this way can recognize high-risk events sooner, without a significant decrease in performance on low-risk events.
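A rough illustration of the retraining idea described above is to oversample rare, high-risk predictors so a model sees them far more often than routine frames. The sketch below is purely illustrative: the function, risk scores and threshold are invented here, and dRISK’s actual pipeline is not public.

```python
import random

def make_training_batch(samples, batch_size, edge_weight=4.0):
    """Weighted sampling: edge cases are drawn more often than routine frames.

    Each sample is a dict with a 'risk' score in [0, 1]; anything above the
    threshold is treated as an edge case. Names, scores and the weighting
    scheme are illustrative, not dRISK's actual method.
    """
    weights = [edge_weight if s["risk"] > 0.7 else 1.0 for s in samples]
    return random.choices(samples, weights=weights, k=batch_size)

# Toy dataset: two routine frames and one edge case.
data = [
    {"id": "routine-1", "risk": 0.1},
    {"id": "routine-2", "risk": 0.2},
    {"id": "edge-oncoming-headlights", "risk": 0.9},
]
batch = make_training_batch(data, batch_size=8)
```

With a weight of 4.0, the single edge case is drawn about twice as often as both routine frames combined, biasing training toward the predictors of high-risk events.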

dRISK’s customers include AV developers, major transport authorities and one of the world’s largest insurers, all of whom have an interest in mitigating AV risk and improving AV performance. dRISK has so far delivered semi-customized solutions on an individualized basis, but intends to release a version of its AV retraining product directly on the web later this spring to open up these capabilities to the wider AV community.

dRISK Inc and wholly-owned subsidiary dRISK.ai Limited, along with its partners in the UK-based D-RISK consortium, Imperial College London’s Transport Systems Laboratory, Claytex and DG Cities, received £3M in funding from the UK’s Centre for Connected and Autonomous Vehicles to develop the ultimate driver’s test for the self-driving car.

In addition to its government funding, dRISK raised seed funding in a closed round led by Okapi Ventures, with Netsu Equity Ltd, Poetic Partners, SaaS Ventures and Mount Wilson Ventures. dRISK holds 4 patents on its technology, with two additional patents pending. dRISK’s CEO is Chess Stetson, Ph.D. (Computation and Neural Systems, Caltech) and its team boasts talent from Stanford, Berkeley, Harvard, Oxford, Cambridge, Imperial College London and NASA. See more here: dRISK’s “Risk Aware Perception”.

AImotive & Nextchip Demo Neural Network Efficiency

AImotive, the world-leading supplier of scalable automated driving technologies, and Nextchip Co., Ltd., a dedicated automotive vision technology company, announced that AImotive has successfully demonstrated automotive NN (neural network) vision applications executing at up to 98% efficiency on the aiWare3P™ NPU (Neural Processing Unit) used on Nextchip’s latest Apache5 IEP (Imaging Edge Processor). Also featuring an advanced ISP supporting imaging sensors of up to 5.7-megapixel resolution, a quad-core Arm A53 CPU and a small package size of only 9 mm by 9 mm, Nextchip’s Apache5 is designed for demanding automotive edge vision applications to full AEC-Q100 Grade 2.

Thanks to the rapid bring-up of all key functions on Apache5, complemented by aiWare Studio’s unique offline NN optimization tools, AImotive and Nextchip were able to demonstrate to lead customers the Apache5 IEP executing demanding automotive AI applications on the aiWare NPU within weeks of receiving first samples. These demonstrations confirm that Apache5’s 1.6 TOPS aiWare3P NPU can deliver up to 98% sustained real-time efficiency for a wide range of NN workloads. Only minimal optimization effort was required, and it was completed before the first devices arrived.
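The quoted figures imply straightforward arithmetic on sustained throughput, worked through below. The 40% sustained-efficiency figure for a typical NPU is a hypothetical comparison point chosen here for illustration, not a measured value; it is the kind of gap the 2x-3x comparison quoted below refers to.

```python
# Sustained throughput implied by the quoted figures (illustrative arithmetic,
# not a benchmark): a 1.6 TOPS NPU running at 98% sustained efficiency.
rated_tops = 1.6
efficiency = 0.98
effective_tops = rated_tops * efficiency        # TOPS actually delivered

# Hypothetical comparison: an NPU that only sustains 40% of its rated TOPS
# on real CNN workloads (the 40% figure is an assumption for illustration).
typical_effective = rated_tops * 0.40

print(f"aiWare3P effective: {effective_tops:.3f} TOPS")
print(f"typical effective:  {typical_effective:.2f} TOPS")
```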

“We are excited that thanks to the close collaboration between Nextchip and AImotive, we have been able to demonstrate Apache5 executing compelling automotive AI applications with exceptional efficiency within weeks of receiving first samples” said Young Jun Yoo, CMO at Nextchip.

“With Apache5 we have demonstrated that we can deliver 2x to 3x higher CNN performance for the same claimed TOPS of other NPUs” said Márton Fehér, senior vice-president hardware engineering for AImotive. “Furthermore, our aiWare Studio SDK enabled our aiDrive team to bring up multiple well-optimized NNs within days of receiving Apache5 silicon”.

Local Motors Moves Packages at Miramar

Local Motors, a leader in the development, manufacturing, and deployment of autonomous shuttles, recently began the first-ever project moving packages around Marine Corps Air Station (MCAS) Miramar using an autonomous electric shuttle, Olli.

The project is in collaboration with US Ignite, an accelerator of smart communities, and NavalX SoCal Tech Bridge as a part of the 5G Living Lab at MCAS Miramar. Funded through the Office of Naval Research and Naval Information Warfare Center (NIWC) Pacific, this $4 million initiative supports multiple pilot projects aimed at improving military base operations through technology innovation.

MCAS Miramar personnel will be directly engaged in shuttle operations to gain autonomous vehicle experience and evaluate the benefits of autonomous technology in potential base and deployment applications. Data from the Olli shuttles will be transferred via 5G network and analyzed beyond the duration of the 90-day pilot program.

“MCAS Miramar’s deployment of the Olli demonstrates the dual-use opportunity of autonomous electric vehicles on installations to transport both people and goods. It highlights the importance of early engagement with companies and how fostering a collaborative effort provides efficient dual-use technologies,” said Lieutenant Colonel Brandon Newell, NavalX SoCal Tech Bridge Director.

Local Motors CEO Jay Rogers said, “Local Motors is proud to develop and deploy made-in-America vehicles, and it is an honor to serve our military. Our Direct Digital Manufacturing (“DDM”) clock-speed allows us to accelerate collaboration, deployment, and testing of dual-use technologies. The results of this project will be critical to improving base operations and future on and off-base deployments.”

US Ignite is supporting the launch and project management of this pilot. “It’s critical to be able to field test new technologies and evaluate them under real-world conditions,” said Eric Werner, Director of Autonomous Vehicle Programs at US Ignite. “As the military continues to upgrade and optimize base operations, pilot programs like this one will provide crucial insights on technology performance and the potential for automated vehicles to improve services for base personnel. We are pleased to be working with Local Motors on this project, and to be forging ahead with such an innovative program at MCAS Miramar.”

Velodyne Collaborates with Ansys

Velodyne is collaborating with Ansys to integrate an encrypted ‘black box’ physics-based lidar sensor model into Ansys® VRXPERIENCE, a next-gen, real-time interactive driving simulator that models, evaluates and validates lidar designs within a highly realistic virtual environment. This end-to-end capability empowers engineers to rapidly model countless edge case driving scenarios across millions of miles and substantially reduce physical tests. As OEMs integrate Velodyne’s lidar into their ADAS portfolio, VRXPERIENCE will reduce development costs by enhancing lidar placement within AVs and validating AV performance.
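To make the idea of a physics-based lidar model concrete, here is a toy single-beam range calculation against a flat wall facing the sensor. It is only a geometric sketch under stated assumptions (no noise, reflectivity or beam divergence); the actual VRXPERIENCE sensor model is encrypted and far more sophisticated.

```python
import math

def lidar_return(beam_azimuth_deg, wall_distance_m, max_range_m=200.0):
    """Range a single lidar beam would report off a flat wall perpendicular
    to the sensor's forward axis. Toy geometry only: no noise, reflectivity
    or beam divergence; not Velodyne's or Ansys' actual model.
    """
    theta = math.radians(beam_azimuth_deg)
    if abs(math.cos(theta)) < 1e-9:              # beam parallel to the wall
        return None
    r = wall_distance_m / math.cos(theta)        # range along the beam
    return r if 0 < r <= max_range_m else None   # None = no return

# Sweep a few beams across a wall 10 m ahead of the sensor.
ranges = [lidar_return(az, 10.0) for az in (-30, 0, 30)]
```

A simulator evaluates a model like this (plus noise, materials and occlusion) for hundreds of thousands of beams per frame, which is why running it in a virtual environment can stand in for so many physical test miles.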

“Ansys VRXPERIENCE supports faster development and deployment of ADAS solutions using Velodyne’s lidar by providing a fully immersive environment to test and improve hazard identification capabilities,” said Anand Gopalan, CEO at Velodyne Lidar. “Velodyne’s focus on safety aligns with Ansys strengths in enabling informed design decisions. Our collaboration helps engineers virtually run their ADAS applications in challenging roadway conditions so they can build solutions that achieve safe navigation and collision avoidance.”

“As part of Ansys’ AV ecosystem, Velodyne is helping to define the landscape of safe autonomous driving. Velodyne’s leading-edge automotive lidar greatly increases the safety and reliability of ADAS, powering highly intelligent AVs that improve decision making across many complex edge case scenarios,” said Prith Banerjee, chief technology officer at Ansys. “Using VRXPERIENCE, OEMs will validate the lidar’s software stack and have full access to a validated sensor model, while preserving Velodyne’s IP. This will enable Velodyne to rapidly and cost-effectively design trailblazing lidar sensors and significantly speed delivery to market.”

On April 20th and 21st, Velodyne will present “How Lidar Sensors, Software and Simulation Advance Autonomous Applications” at Simulation World 2021. The presentation will be available live and on demand. To register, please visit: https://www.simulationworld.com/.

In Japan, Panasonic Delivers Medical Supplies

In Japan, there are not enough delivery workers and lifestyles are changing, giving way to new challenges. The Fujisawa Sustainable Smart Town, located in Fujisawa City, Kanagawa Prefecture, is drawing on its collective power to address these challenges. A new initiative that just began is a cutting-edge field experiment using compact delivery robots that will deliver products to people’s homes. Let’s hear what the people involved in this project have to say.

An ever-evolving town perfect for testing state-of-the-art technologies

The Fujisawa Sustainable Smart Town (FSST) was established in 2014. Panasonic played a central role in bringing to life this approximately 19-hectare town, built on its old factory site in Fujisawa City, Kanagawa Prefecture, where 2,000 people currently live.

FSST aspires to create a “100-Year Community.” To ensure its sustainable development, the town established its own management company, the “Fujisawa SST Management Company.” Working with the next-generation self-governing body, the “Fujisawa SST Committee,” which subscribes to this shared concept, the management company is helping to bring about resident-led town building. Moreover, 18 organizations that support the concept, including the city of Fujisawa, have formed a consortium, the “Fujisawa SST Council,” to help invigorate the city.

The Fujisawa SST Council is made up of partners from a diverse range of fields including education, medicine, logistics, energy, homes, real-estate, and media. Since the grand opening in 2014, FSST has worked with various partners to run multiple PoCs (Proof of Concept) in this living, ever-changing environment. Panasonic’s Takeshi Arakawa, who serves as the President of the Fujisawa SST Management Company and Chairman of the Fujisawa SST Council stated, “We have conducted field experiments for new technologies, new businesses, and marketing initiatives almost every year. FSST residents are very understanding of and cooperative towards these leading-edge initiatives.”

One of these field experiments that has been running since 2020 is the delivery service for homes using Panasonic’s autonomous delivery robot. For this service, Panasonic is using the autonomous-driving robots it has developed over the years and the know-how it has gained through the ride share service it has been operating within its premises.

During phase 1, from November to December 2020, the robot was tested on public roads. Panasonic analyzed the data gained from this experiment and made improvements, and from March 5, 2021, a new experiment to deliver medication to patients’ homes from the pharmacy in town has begun. Mr. Arakawa explained the reason why they decided to focus on the delivery robot.

“Fujisawa City is part of the ‘Robot Town Sagami’ (special zone for robotics industry in the Sagami area) specified by the Kanagawa Prefectural Government, and so the local government is very keen to find applications for robots. Then COVID-19 happened, and people now want to avoid face-to-face interaction and contact when exchanging goods. Moreover, the demand for EC and delivery has grown rapidly. Having said that, the delivery industry is faced with chronic labor shortage. In order to resolve the issues we are facing in the new normal, we decided to accept the autonomous delivery robot trial.”

The rapid implementation of delivery services using autonomous delivery robots is also highlighted in the national government’s growth strategy action plan, and Panasonic is also taking part in the Ministry of Economy, Trade and Industry sponsored public-private council for the promotion of deliveries by autonomous robots. The demonstration experiment in question is also being supported by NEDO (New Energy and Industrial Technology Development Organization), so it has received the support of society as well. Such favorable conditions together with FSST’s open-mindedness to new challenges made it an optimal testing ground for this experiment.

When Panasonic contacted the Fujisawa SST Committee about the autonomous delivery robot field experiment, many replied, “Yes, please!”

Mr. Arakawa added, “The community has been open to various field experiments, whether they involved robots or not. We make it a point not to force these experiments on the community. We believe it is important to discuss the challenges that the town is facing together. The Fujisawa SST Management Company is a hub and FSST a platform that matches the needs of its residents to those of companies.”

It will be the first time in Japan that an autonomous delivery robot travels on public roads in a city. The robot is quite compact and comes equipped with a box for storing products for delivery. It is programmed to avoid obstacles using sensors and cameras, but if autonomous operation becomes difficult, the robot can be remotely controlled from the operation center. Panasonic has placed safety and security first and foremost.
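The handover logic described above – autonomous by default, remote operation from the center when autonomy becomes difficult – can be sketched as a toy decision rule. All signal names and thresholds below are invented for illustration; Panasonic has not published its actual criteria.

```python
def choose_control_mode(obstacle_confidence, localization_ok):
    """Toy decision rule for switching between autonomous driving and remote
    operation, loosely mirroring the fallback described in the article.
    The threshold and signal names are invented, not Panasonic's design.
    """
    if localization_ok and obstacle_confidence >= 0.8:
        return "autonomous"
    return "remote_control"  # hand over to the operation center

# Example: perception is confident and localization is healthy.
mode = choose_control_mode(obstacle_confidence=0.9, localization_ok=True)
```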

Mr. Arakawa went on to explain, “Robots were not really a familiar presence in the past, but seeing them in town, people have begun to recognize them as a friendly presence. When people become interested, they will begin to think about what it would be to live alongside a robot and offer feedback. I think that it is also playing a role in activating the community.”

Changing the way things are transported in the community

During phase 2 of the field experiment, the robots will cover a larger area, and we will see how the service will be accepted by the public. As part of those efforts, AIN HOLDINGS INC., a member of the Fujisawa SST Council, has offered its wholehearted support to allow autonomous delivery robots to deliver medication from the “AIN Pharmacy Fujisawa SST Shop.” This also includes medication for which online medication counseling has already been provided. So, it is a value-added service that is appropriate for lifestyles of the new generation.

Suni Kim, from the Corporate Planning Section of AIN HOLDINGS, explained the company’s position. “AIN HOLDINGS was the first company in Japan to field test online medication counseling in a national strategic zone. And we were among the first after the legal reform in 2020 to offer online medication counseling in all of our pharmacies nationwide. We have also been implementing automation in our dispensing operations from before. In other words, actively adopting cutting-edge technologies is part of our culture.” Because of their track record and culture, things went very smoothly when Panasonic proposed this demonstration experiment to AIN through the council.

According to Ms. Kim, there are patients who seek online medication counseling because of COVID-19, or simply because it is more convenient, but it had been difficult to figure out how people could receive their medication after such counseling sessions. “To make the entire process contactless – from online examination and online medication counseling to the receipt of drugs – the drugs usually have to be delivered, which meant a time lag of at least one day. We wanted to offer multiple delivery options, so in FSST, people can choose delivery lockers.” Another option people can now choose is delivery by an autonomous delivery robot.

“Not only can patients enjoy contactless delivery, getting their medication delivered by a robot adds an extra layer of excitement. During this field experiment, we want to gain feedback about various needs and insight about the relaxation of healthcare regulations going forward,” added Ms. Kim.

Masaki Yamauchi from the Innovation Promotion Sector, Panasonic Corporation noted, “The logistics industry is equipped with well-developed systems that connect a vast area peer-to-peer, but there is no delivery infrastructure that can respond to needs in a small area on demand. This is the first step to creating such an infrastructure.” To do so, it is important to conduct demonstration experiments and gain experience in the field.

“As history has shown, inefficient systems have been overwritten by digital transformation to become more convenient. Nowadays people want to limit contact with others as much as possible. So, they are more receptive to robots delivering products, and society is ready to recognize autonomous delivery as something of value. At FSST, residents and companies work together in unison, so it is the perfect place to paint a picture about what the future could look like,” stated Mr. Yamauchi.

Many people focus on the negative impact of COVID-19, but both parties have found a silver lining amid the crisis. Ms. Kim stated, “We would like to move forward while thinking about what value we can provide amid this difficult situation.” And according to Mr. Yamauchi, “This will help transform the delivery of goods in a region. I am proud to know that Panasonic will be able to contribute to bringing about change to society.” The hope is that the field experiment will prove that this kind of community-centric micro mobility has significant potential.

Realizing autonomous delivery in less than a year

Motoki Hirose of the Manufacturing Innovation Division, Panasonic Corporation has worked on the development of this autonomous delivery robot. He repurposed the technologies applied to the robotic mobility device “PiiMO,” which he helped develop and which is already available on the market.

“PiiMO is a robot-type electric wheelchair developed for use in large indoor facilities such as airports and shopping malls. The first wheelchair is operated via remote control by staff, while the wheelchairs that follow it run autonomously by detecting the movements of the wheelchair in front using laser sensors. We combined this technology with the technology developed by the division in charge of the outdoor autonomous ride share service to create this autonomous delivery robot,” explained Mr. Hirose.

Panasonic started back in early FY2021. In under a year, they developed an outdoor autonomous delivery robot that can travel along public roads. “The environment changed rapidly due to COVID-19, so we anticipated that the demand for contactless delivery would increase. The national government has also become more proactive about the development of delivery robots, so our engineers joined forces and decided to take on this challenge,” said Mr. Hirose.

Although the basic autonomous mobility technology was ready, they faced one challenge after another because it was the first time for a robot to move about autonomously on public roads. How will sensing work to ensure safe mobility in an environment where vehicles, bicycles, and pedestrians can appear out of nowhere at any second? How can they switch seamlessly between the remote control system and the autonomous mobility system? How can the robot continue to move around stably on bumpy roads, and maneuver in an environment with various external, disruptive factors such as sunlight and natural objects? Every time they ran into an unexpected issue, they analyzed the data and resolved it by applying “agile methodology.” According to Mr. Hirose, “Once the demonstration experiment began, people working in the field and managers also began proactively offering their input and recommendations. It has been quite a good dynamic.”

“Being involved in manufacturing, I keenly sense that technology evolves because there are people who use it and places where it can be used. The field experiment that began in March will provide direct feedback from customers. Being able to receive direct feedback is very valuable and it provides insight that can be used to make improvements. Residents of FSST have deep trust for Panasonic, so it is an optimal environment for us,” added Mr. Hirose.

When you look at the robot straight on, you will notice that it has round eyes and eyebrows. Eyebrows change depending on whether it is turning right or left, giving the robot a unique expression. When it happens upon a pedestrian, you will hear the kind voice of the operator say through the speakers, “After you.” This is because Panasonic wanted to create a robot that blends in with the town. Some residents even wave to the robot when it passes by.

“Even if it may be remote, it is important that we can intervene and control the robot. During the testing on public roads in phase 1, children came up to the robot and began speaking to the operator in the control center. Since then, every time they see the robot, they wave to it with a big smile, shouting, ‘Good luck!’ We realized that rather than simply responding to issues as they arise, creating a warm, interactive rapport will build a better relationship between robots and people than was ever possible before,” concluded Mr. Hirose.

Mr. Arakawa of the Fujisawa SST Management Company feels the same.

“As things become more automated, people may think that this dilutes the relationship between people, but it’s actually the opposite. Offline interaction becomes more valuable as things shift online. Of course, there are things that can be resolved by technology, and things that can only be resolved by a real community. I believe that in this town companies and residents will be able to talk to each other and build a future together with their own hands.”

FSST has continued to be open-minded, and their residents have updated the town periodically. These updates also help people grow and contribute to sustainability. Mr. Arakawa mentioned that he wanted FSST to become a “highly resilient town,” but it seems as though the residents of this town are equipped with an innate ability to adapt to change. A world where humankind and robots live in harmony is about to begin, here at FSST.

Mobileye & Udelv Partner on Autonomous Delivery

“Our deal with Udelv is significant for its size, scope and rapid deployment timeline, demonstrating our ability to deliver Mobileye Drive™ for commercial use now and in volume,” said Prof. Amnon Shashua, Mobileye president and CEO. “COVID-19 has accelerated demand for autonomous goods delivery, and we are delighted to partner with Udelv to address this demand in the near term.”

Daniel Laury, CEO and co-founder of Udelv, said: “Mobileye is the only company providing a full-stack self-driving system with commercial viability and scale today. The readiness of Mobileye Drive™, along with its vast map coverage of North America, Europe and Asia, will allow us to ramp up the production and deployment of Udelv Transporters and rapidly offer the service at scale to our expanding list of customers.”

Last-mile delivery is the most expensive aspect of distribution, accounting for 53% of the overall cost of goods. At the same time, consumers are buying more and more goods online which is expected to raise urban last-mile delivery volume by 75 to 80% by 2030 and require 36% more delivery vehicles. And a shortage of drivers is making it difficult for companies to keep pace. It is a service model that is ripe for improvement.
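The cited figures translate into simple fleet arithmetic. In the sketch below, only the growth percentages come from the article; the baseline fleet and delivery volumes are made-up examples to show the scaling.

```python
# Figures quoted in the article:
last_mile_cost_share = 0.53           # last mile as a share of the cost of goods
volume_growth_by_2030 = (0.75, 0.80)  # projected urban last-mile volume growth
vehicle_growth_by_2030 = 0.36         # projected extra delivery vehicles needed

# Hypothetical baseline: 10,000 urban delivery vehicles today.
baseline_fleet = 10_000
fleet_2030 = round(baseline_fleet * (1 + vehicle_growth_by_2030))

# Hypothetical baseline: 1,000,000 urban deliveries per day today.
baseline_volume = 1_000_000
volume_2030 = tuple(round(baseline_volume * (1 + g)) for g in volume_growth_by_2030)
```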

Udelv’s customers expect Transporters to dramatically improve the efficiency of last- and middle-mile delivery services for everything from baked goods and auto parts to groceries and medical supplies.

Donlen, one of America’s largest commercial fleet management companies at the forefront of fleet management innovation and technology, today placed the first pre-order for 1,000 Transporters. This pre-order is believed to be the largest to date for an autonomous delivery vehicle.

“We are thrilled to be the first customer for the Udelv Transporter,” said Tom Callahan, president of Donlen. “The combination of Udelv’s zero-emissions Transporter and automated delivery management system with Mobileye Drive™ will enable sweeping delivery cost reductions, make our roads safer, and lower carbon emissions across America.”

Mobileye Drive comprises EyeQ™ system-on-chip-based level 4 (L4) compute, sensors and software, the company’s proprietary Road Experience Management™ AV mapping solution and Responsibility-Sensitive Safety-based autonomous driving policy. Udelv will perform the integration with its Delivery Management System, with Mobileye providing technical oversight. Mobileye will also provide over-the-air software support.
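The Responsibility-Sensitive Safety model mentioned above defines, among other things, a minimum safe longitudinal following distance. The sketch below implements the published RSS formula; the response-time and acceleration parameters are illustrative defaults, not Mobileye's calibrated values.

```python
def rss_safe_longitudinal_distance(v_rear, v_front, rho=1.0,
                                   a_accel_max=3.0, b_brake_min=4.0,
                                   b_brake_max=8.0):
    """Minimum safe following distance (m) per the published RSS model.

    v_rear, v_front: speeds (m/s) of the rear (ego) and front vehicles.
    rho: response time (s); accel/brake limits in m/s^2. Parameter values
    here are illustrative assumptions, not Mobileye's calibration.
    """
    v_after_response = v_rear + rho * a_accel_max
    d = (v_rear * rho                              # travel during response time
         + 0.5 * a_accel_max * rho ** 2            # worst-case acceleration
         + v_after_response ** 2 / (2 * b_brake_min)   # ego's braking distance
         - v_front ** 2 / (2 * b_brake_max))       # front car's braking distance
    return max(d, 0.0)

# Ego at 20 m/s (72 km/h) behind a vehicle at 15 m/s:
d_min = rss_safe_longitudinal_distance(20.0, 15.0)
```

A driving policy built on RSS keeps the planned gap at or above this distance, so that even a worst-case brake by the front vehicle cannot cause a rear-end collision attributable to the ego vehicle.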

Mobileye-driven Transporters will be capable of L4 self-driving, point-to-point operation. Udelv’s proprietary tele-operations system will allow for the maneuvering of the vehicles at the edges of the mission, in parking lots, loading zones, apartment complexes and private roads.

Celebrated for creating the world’s first custom-made autonomous delivery vehicle (ADV), which completed the first autonomous delivery in early 2018, Udelv has quietly performed extensive deployment trials with customers across various industries.

As one of Udelv’s early customers, Mike Odell, president and CEO of XL Parts and Marubeni Automotive Aftermarket Holdings, said: “We placed our trust in Udelv’s technology two years ago and are thrilled to witness the progress this company has made in such a short period of time. XL Parts remains committed to expanding its partnership with Udelv and to being one of the first clients for the Transporters.”

The deal with Udelv advances Mobileye’s global mobility-as-a-service ambitions, validating the company’s technology and business approach. Mobileye plans to deploy autonomous shuttles with Transdev ATS and Lohr Group beginning in Europe. Mobileye also plans to begin operating an autonomous ride-hailing service in Israel in early 2022.

Interview with BMW Group Automated Driving

Dr. Nicolai Martin is in charge of a stimulating and highly complex area of innovation at the BMW Group, namely Automated Driving Development. It is his job to deal with questions such as: what degree of automation is useful? What does the future of automation really hold for the customer? And, most importantly: what route is BMW taking to automated driving? This interview offers intriguing insights, highlights the latest developments and outlines ideas for the future.

1. Dr. Martin, you are an engineer at heart. What is your driving force?

I enjoy getting to the bottom of things and understanding them. Competitive sport – windsurfing, all the way up to Olympic level – was a genuine career option, but in the end industrial engineering and automotive engineering came out on top.

I love the sort of challenges and problems that you have to really get to grips with and that are complex and relevant to society at the same time. What I ultimately want to do is find long-lasting solutions to relevant, real-life problems and drive progress. I am fortunate enough to be working with a great team on one of the most relevant assignments for the future of automotive mobility: automated driving.

On the one hand, we actively promote the development of innovative technologies and conduct research that is virtually of academic standard. But we also carefully consider which of the potential applications we’re actually going to implement in order to offer customers worldwide genuine added value. At the end of the day, our overriding aim is to delight our customers. It’s a fascinating balance that isn’t always easy to get right.

2. What added value is offered by a car with automated driving capability?

Automation basically enhances comfort and safety, as the system drives the car consistently, whereas we as humans tend not to. Our customers appreciate this assistance, which eases their workload in some cases. For example, we have observed that drivers in Europe who have our Driving Assistant Professional system in their car are already driving with the longitudinal guidance function activated for approx. 50 per cent of the time. The figure for lateral guidance is lower, but it is still used for 30 per cent of the driving time at present (calculated from 120 million kilometres / approx. 31.35 million miles driven by our customers). That’s a lot. The conclusion we draw from this is that we have created a function that truly offers added value. In general, our systems are extremely well received worldwide and have also been shown to make driving safer already today.
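The usage figures above translate into rough distances. The sketch below treats the quoted time shares as distance shares, which is only an approximation for illustration; the 120 million kilometres and the percentages come from the interview.

```python
# Figures quoted in the interview:
total_km = 120_000_000          # approx. customer kilometres observed
longitudinal_share = 0.50       # longitudinal guidance active ~50% of the time
lateral_share = 0.30            # lateral guidance active ~30% of the time

# Rough distances, treating time shares as distance shares (an approximation):
longitudinal_km = round(total_km * longitudinal_share)
lateral_km = round(total_km * lateral_share)
```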

3. What role will automated driving play in future in the context of personal mobility?

A vehicle’s intelligence will become increasingly important in future. Automation of the task of driving has an instrumental role to play here. What started out with more minor features, such as automatic control of the lighting functions, has today already progressed to assisted longitudinal and lateral vehicle control. When driving from Munich to Tuscany for a holiday, for example, not only does the system take care of switching the headlights on and off when passing through the numerous tunnels en route, it also keeps the vehicle within the speed limits and at a safe distance from vehicles ahead. If we go further and add all the possibilities offered by connectivity and the driver’s semantic knowledge, the car will turn more and more into an intelligent companion or even friend that helps and excites the driver. We are expanding on this aspect step by step.

4. How far has BMW progressed with automated driving?

We already have around 40 driver assistance functions in our vehicles that are rated among the best on the market. These encompass everything from the High Beam Assistant and the rear-view camera to intelligent cruise control with longitudinal and lateral guidance, complete with traffic light recognition. On the active safety side, our driver assistance systems help us to achieve the highest 5 star NCAP rating, while the top-spec Driving Assistant Professional has already won awards. In short, we are continuing to enhance our Level 2 functions.

These are almost universally available and are already of assistance or use to customers today when driving and parking. Their characteristics or availability may vary depending on local legislation. In the USA and China, for instance, we offer a “hands-off” option, which lets the driver take their hands off the steering wheel (up to 60 km/h / 37 mph), although they must continue to monitor the driving situation and remain responsible for the car. This function is deactivated (after several warning signals) if the driver is no longer paying attention.

5. And when will BMW begin offering Level 3 to customers?

As well as refining our driver assistance functions (Level 2), we are also working hard to ensure our vehicles are capable of highly automated driving – i.e. Level 3 – and are making very good progress here. Our automated assistance functions already perform a significant part of the driving and parking task in many situations. But the driver must still monitor the vehicle’s surroundings and is always fully responsible for how the vehicle is being driven. We have already introduced driverless parking in the form of the Remote Control Parking function. However, here too the driver has to monitor their vehicle and the area around it via smartphone or the vehicle key, and they are still responsible for the vehicle. This means Level 4 functionality is within reach when it comes to parking, i.e. the vehicle can look for a space and park itself in a car park, for example. We will not offer Level 3 functionality (where responsibility passes from human to machine) in our vehicles until it is absolutely safe and offers added value. The system must react safely in extreme situations – the “corner cases” as they are known. This is what we are striving to achieve.
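The answer leans on the SAE driving-automation levels (Level 2 monitoring duty, the Level 3 handover of responsibility, Level 4 parking). For orientation, here is a simplified sketch of that taxonomy; it is a loose summary for readers, not a normative restatement of SAE J3016 or of BMW’s classification.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Simplified summary of the SAE J3016 driving-automation levels
    referenced in the interview. Informal, for orientation only."""
    L0_NO_AUTOMATION = 0  # driver does everything
    L1_ASSISTANCE = 1     # longitudinal OR lateral support
    L2_PARTIAL = 2        # both axes assisted; driver must monitor at all times
    L3_CONDITIONAL = 3    # system drives in defined conditions; driver is fallback
    L4_HIGH = 4           # no driver fallback needed within the operating domain
    L5_FULL = 5           # no driver needed anywhere

def driver_must_monitor(level: SAELevel) -> bool:
    # Up to Level 2 the human supervises continuously; from Level 3 upward,
    # responsibility passes to the machine while the system is engaged --
    # the shift the interviewee says BMW will only offer once absolutely safe.
    return level <= SAELevel.L2_PARTIAL
```

The dividing line in this sketch, between `L2_PARTIAL` and `L3_CONDITIONAL`, is exactly where the interview locates the legal and safety hurdle.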

6. Does your approach differ from that of your competitors? In other words, is there a “BMW route” to automated driving?

We are developing automated driving with a clear objective: to offer our customers greater safety and comfort. The BMW Group primarily sees technology as an enabler for using automated driving and parking functions to create a positive and emotionally engaging experience for our customers. That is a clear priority for us. At the same time, technology must never take all the decision-making away from the driver. Finding the right balance between safety for everyone and added value for the individual is important to us.

In my view, BMW will in future embody the ideal blend between the world of automated driving, or “Ease” as we call it, and the pleasure of driving yourself, known simply as “Boost”. Every customer should be able to decide for themselves whether they want to take the wheel and enjoy some dynamic driving pleasure or prefer to hand over the driving task in certain stressful or joyless driving situations – such as traffic jams, stop-start traffic or parking – and use that time for something else. In the process, a BMW will provide its driver with optimum assistance and backup at all times. I became very familiar with the two sides of the coin during one of my first jobs at the company in active driving dynamics development, when I had the chance to explore and experience their respective limits. So, it is clear to me that a BMW will consist of both dimensions in the future.

7. What can we expect from the all-electric BMW iX in terms of automated functions?

The BMW iX is the first model from the BMW Group to offer automated driving and parking functions based on a new technology toolkit. This toolkit will enable continuous improvement and expansion of the driver assistance functions and, in the medium term, highly automated driving (Level 3). We will continue with the rollout of the toolkit and deploy it, for example, in the next-generation BMW 7 Series and BMW 5 Series models.

In the BMW iX we are also creating real added value for customers by grouping individual automated assistance functions together intelligently, according to the relevant driving situations. The new BMW Operating System 8 makes our driver assistance functions even more user-friendly. At the same time, we have reduced controls to the essentials, ensuring that the driver can activate the optimal degree of assistance quickly. Our focus here is on holistic, intelligent automation, a simplified presentation of system status and intuitive operation. This simplification can be clearly seen in the reduced number of buttons on the multifunction steering wheel, for example.

The truth is that reality – you might say complexity – has caught up with many ambitions for higher levels of automation, or rather for their market readiness and availability. Just because a vehicle is Level 3-capable certainly doesn’t mean that Level 3 automated driving is either permitted or possible everywhere. In fact, it is currently only allowed on a very small number of roads under very specific conditions. We are therefore actively deliberating over when the right time would be, as we only want to offer relevant and, most importantly, completely safe functions. This means we have also scrutinised our ambitions with a critical eye and are now continuing our work intently on the basis of a revised roadmap.

8. One critical issue affecting automated driving is the willingness of drivers to use the functions. How do you ensure this acceptance?

Acceptance varies greatly from region to region. We endeavour to gain customers’ trust by adapting vehicle behaviour so they recognise themselves in it. Safety always takes top priority here, so we make the system more defensive in nature, meaning that customers will never be caught off guard by overly aggressive driving. A coherent display and operating concept is another important factor here: how do we show the customer in a clearly visible way, for example, that the sensor has detected the vehicle ahead and the function is active? In future, it might even help to deliberately make one or two of the sensors stand out instead of tending to conceal them as we do today. But that is just one of many different aspects. As I said at the start, here we are working on one of the most exciting areas in the automotive industry.