There was more news from Mobile World Congress, this time from BMW, SEAT and Geely.
BMW is showing Natural Interaction. The new system combines the most advanced voice command technology available with expanded gesture control and gaze recognition to enable genuine multimodal operation for the first time. The first BMW Natural Interaction functions will be available in the BMW iNEXT from 2021.
Just like in interpersonal dialogue, BMW Natural Interaction allows the driver to use their voice, gestures and gaze at the same time, in various combinations, to interact with their vehicle. The preferred mode of operation can be selected intuitively, according to the situation and context. The vehicle reliably detects voice commands, gestures and the direction of gaze, combines them and executes the desired operation. This free, multimodal interaction is made possible by speech recognition, optimised sensor technology and context-sensitive analysis of gestures. Through precise detection of hand and finger movements, gesture direction – in addition to gesture type – is also registered for the first time across an extended interaction space that encompasses the driver’s entire operating environment. Spoken instructions are registered and processed using Natural Language Understanding. An intelligent learning algorithm, which is constantly being refined, combines and interprets this information so that the vehicle can respond accordingly. The result is a multimodal interactive experience geared towards the driver’s wishes.
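BMW has not published implementation details, but the kind of fusion it describes (picking an action from speech and resolving a vague "this" or "that" from a simultaneous gesture or gaze) can be sketched roughly in a few lines of Python. The intent labels, target names, confidence thresholds and priority rule below are illustrative assumptions, not BMW's design.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical modality outputs; BMW's actual data model is not public.
@dataclass
class SpeechIntent:
    action: str                      # e.g. "open", "explain"
    explicit_target: Optional[str]   # e.g. "sunroof", or None for "open this"

@dataclass
class PointingResult:
    target: Optional[str]            # e.g. "passenger_window", resolved from the gesture vector
    confidence: float

@dataclass
class GazeResult:
    target: Optional[str]            # e.g. "control_display"
    confidence: float

def fuse(speech: SpeechIntent, pointing: PointingResult, gaze: GazeResult) -> Optional[Tuple[str, str]]:
    """Combine modalities: an explicitly spoken target wins, otherwise the
    deictic reference ("this", "that") is resolved from gesture, then gaze."""
    if speech.explicit_target:
        return (speech.action, speech.explicit_target)
    if pointing.target and pointing.confidence > 0.6:
        return (speech.action, pointing.target)
    if gaze.target and gaze.confidence > 0.6:
        return (speech.action, gaze.target)
    return None  # nothing resolved confidently; ask the driver to clarify

# Example: saying "Open that" while pointing at the passenger window.
command = fuse(SpeechIntent("open", None),
               PointingResult("passenger_window", 0.85),
               GazeResult("control_display", 0.7))
print(command)  # ('open', 'passenger_window')
```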
By combining different modalities, vehicle functions can be initiated in different ways. The driver decides how they want to interact, based on their own personal preferences, habits or the current situation. So, when the driver is engaged in conversation, they would probably choose gesture and gaze control; when their eyes are on the road, it is better to rely on speech and gestures. In this way, for example, car windows or the sunroof can be opened or closed, air vents adjusted or a selection made on the Control Display. If the driver wants to learn more about vehicle functions, they can also point to buttons and ask what they do.
With enhanced gesture recognition and the car’s high level of connectivity, the interaction space is no longer confined to the interior. For the first time, occupants will be able to interact with their direct surroundings, such as buildings or parking spaces. Even complex queries can be answered quickly and easily by pointing a finger and issuing a voice command. “What’s this building? How long is that business open? What is this restaurant called? Can I park here and what does it cost?”
The advances in recognition and evaluation of voice commands, gestures and gaze required for natural driver-vehicle interaction are delivered by improved sensor and analysis technologies. Using an infrared light signal, the gesture camera can now capture hand and finger movements in three dimensions throughout the driver’s entire operating environment and determine a precise directional vector. For example, pointing a forefinger at the Control Display and saying a command is sufficient to initiate the desired operation without touching the screen. The high-definition camera integrated into the instrument cluster also registers head and eye direction. The built-in camera technology evaluates the images and uses them to calculate the required vector data, which is then processed in the vehicle. To interpret voice instructions quickly and reliably alongside gestures, the information the driver conveys to the vehicle across these modalities is combined and evaluated with the help of artificial intelligence. The algorithm responsible for interpreting the data in-car is continuously optimised and refined using machine learning and the evaluation of different operating scenarios.
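As a rough illustration of how a directional vector from a gesture camera could be matched to an in-cabin control, the sketch below casts the pointing ray against a handful of known target positions and picks the one with the smallest angular offset. The target names, coordinates and angle threshold are invented for the example; BMW's actual sensor pipeline is not publicly documented.

```python
import math

# Hypothetical cabin targets with 3D positions (metres, vehicle coordinate frame).
TARGETS = {
    "control_display": (0.45, -0.35, 0.9),
    "sunroof_switch":  (0.10,  0.00, 1.3),
    "passenger_vent":  (0.70, -0.55, 0.8),
}

def resolve_pointing(origin, direction, max_angle_deg=10.0):
    """Return the target whose direction from the fingertip is closest to the
    pointing vector, provided it falls within a small angular cone."""
    norm = math.sqrt(sum(d * d for d in direction))
    direction = tuple(d / norm for d in direction)
    best, best_angle = None, max_angle_deg
    for name, pos in TARGETS.items():
        to_target = tuple(p - o for p, o in zip(pos, origin))
        dist = math.sqrt(sum(t * t for t in to_target))
        cosang = sum(d * t for d, t in zip(direction, to_target)) / dist
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cosang))))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

# Fingertip roughly at the steering wheel, pointing towards the centre console.
print(resolve_pointing(origin=(0.0, -0.4, 0.9), direction=(0.9, 0.1, 0.0)))  # control_display
```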
Beyond the car: environmental interaction through connectivity.
Thanks to intelligent networking, the operating range of BMW Natural Interaction extends beyond the vehicle interior. For example, the driver can point a finger at objects in their field of vision and give related voice commands, such as asking for information about opening hours or customer ratings, or reserving a table at a restaurant. Drawing on the vehicle’s deep connectivity, extensive environmental data and artificial intelligence, BMW Natural Interaction turns the vehicle into a well-informed, helpful passenger. By connecting digital services, it will be possible to expand the scope of interaction in the future. For example, when the driver spots a parking space, they will easily be able to find out whether they are allowed to park there and what it costs, and then reserve and pay for it directly without ever pushing a button.
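A simplified sketch of the "point at a building and ask" idea: the pointing direction is turned into a compass bearing from the vehicle's GPS position and matched against nearby points of interest. The POI names, coordinates and tolerance cone are assumptions made purely for illustration, not part of BMW's announcement.

```python
import math

# Hypothetical nearby points of interest: name -> (latitude, longitude).
POIS = {
    "Museo Picasso":      (41.3852, 2.1812),
    "Restaurant Can Sol": (41.3859, 2.1835),
    "Parking Garage B7":  (41.3844, 2.1820),
}

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def poi_in_pointing_direction(vehicle_lat, vehicle_lon, pointing_bearing, tolerance_deg=15.0):
    """Return the POI whose bearing from the vehicle lies closest to the
    direction the occupant is pointing, within a tolerance cone."""
    best, best_diff = None, tolerance_deg
    for name, (lat, lon) in POIS.items():
        diff = abs((bearing_deg(vehicle_lat, vehicle_lon, lat, lon) - pointing_bearing + 180) % 360 - 180)
        if diff < best_diff:
            best, best_diff = name, diff
    return best

# Occupant points roughly east-northeast and asks "What is this restaurant called?"
print(poi_in_pointing_direction(41.3851, 2.1810, pointing_bearing=70.0))  # Restaurant Can Sol
```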
BMW Natural Interaction paves the way for the next stage of natural operation in the vehicle and beyond. The free combination of voice instructions, gestures and gaze creates multimodal interaction, based on interpersonal communication. The first BMW Natural Interaction functions will be available in the BMW iNEXT as early as 2021.
Qualcomm and Geely
Geely announced that it plans to launch the first domestically mass-produced 5G and C-V2X-enabled vehicles in 2021. The new vehicles will be launched in collaboration with Gosuncn Group and Qualcomm Technologies, Inc., a subsidiary of Qualcomm Incorporated, which will supply Geely, through Gosuncn, with 5G and C-V2X products based on the Qualcomm® Snapdragon™ Automotive 5G Platform.
SEAT revealed the Minimó. The all-electric quadricycle has been developed to help meet the challenges of city driving, whether that’s stricter emissions laws on which vehicles can enter urban areas, the growing fatigue of traffic jams or the lack of parking spaces.
The Minimó brings together the benefits of the smaller dimensions of a motorcycle with the safety and comfort of a larger passenger vehicle, minimizing the many pain points of travelling around our cities.
The vehicle is designed to be narrow and agile, making it easier to navigate urban areas, yet enclosed and practical, increasing safety and its ability to serve a wider range of drivers and their needs.
An all-electric powertrain with zero emissions means that entering city centres with even the most stringent emissions legislation is no problem. Add to this the ability to swap battery packs to refuel the vehicle quickly and efficiently, significantly reducing charging time for private customers and the operating costs of an urban electric car-sharing service.
Meeting the needs of an urban society means giving people the freedom to travel around without limits. The Minimó gives customers over 100 km of range on a single charge of its energy-dense battery pack. But because of its inventive design, there’s no need to wait hours for the vehicle to recharge once all the energy has been used.
Packaged under the floor of the vehicle in an accessible frame, the battery can be swapped with a fresh pack in a matter of seconds and with minimal fuss, allowing the vehicle to continue its journey far quicker than other vehicles on the road.
The vehicle offers connectivity solutions to provide a seamless digital experience for the user (private or shared), based on digital key and wireless Android Auto™ technology. A central digital display behind the steering wheel combines the functions of an instrument cluster with digital content from the user’s smartphone, while meeting the necessary driving-safety requirements. With the Google Assistant on Android Auto, users can keep their eyes on the road and hands on the wheel while using their voice to stay connected, easily get answers, manage tasks and control media.
And the platform is ready to open the door to future developments. The Minimó is prepared for future Level 4 autonomous driving technologies, which would allow the vehicle to pick up the user on request, solving one of car sharing’s main user pain points.
SEAT, a member of the Volkswagen Group and one of Europe’s biggest car manufacturers, together with IBM, announced the development of a new solution designed to transform driving in cities.
Announced the day after SEAT presented its Minimó concept car, which promises to revolutionize urban mobility, ‘Mobility Advisor’ uses IBM Watson AI to help urban citizens make informed decisions about their daily transportation options, including cars, scooters, bikes and public transport.
Currently under development and designed to run as a mobile app on 4G/5G networks, ‘Mobility Advisor’ uses IBM Watson Assistant to provide users with a conversational interface to plan and optimise routes and suggest the most suitable transportation options.
With IBM Watson Machine Learning, ‘Mobility Advisor’ can learn a user’s preferences and make personalized recommendations for how best to complete a journey. Connected to the IBM Cloud, it dynamically adapts to changing conditions by taking into account weather forecasts, traffic reports and things happening in the city that day. It incorporates the user’s appointments and historical data about previous choices in order to suggest the best modes of transportation each time – even if that means leaving the car behind, walking, or using one of SEAT’s e-Kick scooters for part of the journey.
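SEAT and IBM describe the recommendation logic only at a high level. The sketch below illustrates the general idea of context-weighted ranking by scoring a few transport modes against weather, congestion, trip length and learned user preferences; the modes, factors and weights are invented for the example and are not IBM Watson's actual model.

```python
from dataclasses import dataclass

@dataclass
class Context:
    raining: bool
    congestion: float     # 0 (free-flowing) .. 1 (gridlock)
    distance_km: float

# Hypothetical learned preference weights per mode (higher = preferred).
USER_PREFERENCES = {"car": 0.6, "e_kick_scooter": 0.9, "bike": 0.7,
                    "public_transport": 0.5, "walk": 0.4}

def score(mode: str, ctx: Context) -> float:
    """Higher is better. Penalise open-air modes in rain, the car in heavy
    traffic, and slow modes over longer distances."""
    s = USER_PREFERENCES[mode]
    if ctx.raining and mode in ("e_kick_scooter", "bike", "walk"):
        s -= 0.5
    if mode == "car":
        s -= ctx.congestion * 0.8
    if mode in ("walk", "e_kick_scooter") and ctx.distance_km > 3:
        s -= 0.4
    return s

def recommend(ctx: Context) -> str:
    """Pick the highest-scoring mode for the current context."""
    return max(USER_PREFERENCES, key=lambda mode: score(mode, ctx))

# A rainy rush hour and a 6 km trip across town.
print(recommend(Context(raining=True, congestion=0.8, distance_km=6.0)))  # public_transport
```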
“With the roll-out of 5G networks in cities in the coming years, the possibilities for transforming the driver experience are limitless,” said Juan Ramon Gutierrez Villar, Industry Solutions Leader, IBM Global Markets. “At IBM, we are working with telecommunications companies and innovative manufacturers like SEAT to provide the open technologies which they need to deliver on this vision and create highly contextualized and personalized user experiences that work at lightning speed across multiple clouds and IT platforms.”
SEAT’s R&D Mobility team will continue working with IBM technicians on the evolution of the Mobility Advisor proof of concept and its different potential applications. Together with XMOBA, SEAT’s independent company that tests new solutions contributing to better mobility, SEAT will analyse the future integration of Mobility Advisor with Justmoove, the collaborative mobility solutions platform which the company already offers to its customers. On IBM’s side, the project is being implemented by Viewnext, an IBM subsidiary in Spain.