Connected Car News: Green Hills Software, FedEx, FourKites, TankU, IAR, GM, UVeye, Mercedes-Benz, ZYNC & ROHM

In connected car news are Green Hills Software, FedEx, FourKites, TankU, IAR, GM, UVeye, Mercedes-Benz, ZYNC and ROHM.

Green Hills Software Updates µ-velOSity

Green Hills Software, the worldwide leader in embedded safety and security, announced important updates to its µ-velOSity™ real-time operating system (RTOS) for the growing use of new microcontrollers in vehicle electronics. Even with the consolidation of functions in zonal and domain controllers, the required number of microcontroller cores to support the surge of new safety and real-time applications remains large and growing in new designs from OEMs and Tier 1s. To support these new real-time processors, µ-velOSity has been updated with new RTOS features for new processors, along with optional capabilities that can be tailored for specific customer requirements, including:

  • ISO 26262 ASIL certified (SEooC)
  • Cybersecurity – adopting ISO 21434
  • Memory Protection Unit (MPU) support
  • Supported 32-bit architectures found on NXP®, STMicroelectronics, TI and other processors:
    • Arm® Cortex®-M4, M7 and Arm Cortex-R5
    • Arm Cortex-M23 and Arm Cortex-R52
    • RISC-V
  • Support for additional architectures including future Armv8-M, Armv8-R and others

These new capabilities join the µ-velOSity RTOS’ existing features and capabilities including:

  • Maximum execution speed and minimal boot time
  • Tiny memory footprint
  • Simple native API
  • Open architecture supports domain-specific microcontroller accelerators
  • Support for applications written in the latest revisions of C and C++

In the car, today’s microcontrollers need an RTOS that is purpose-built for the task. Cybersecurity, functional safety, cost and scalability are front of mind for OEMs for ECU nodes like e-fuses, battery management, zonal controllers, low-end radar and more, which in many cases have ultra-small memory footprint requirements. On the one hand, the RTOS must enable applications to fit in the limited internal-only memory of microcontrollers. On the other, the RTOS must be able to provide application layer enablement for the new domain-specific hardware features of modern microcontrollers and expose a unified application interface across different core architectures and silicon solutions. And finally, the software solution built on the RTOS needs to be certified to the highest levels of safety and security. The Green Hills µ-velOSity RTOS supports all these requirements.
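To make that programming model concrete, the sketch below shows the kind of statically allocated, heap-free task setup a small-footprint automotive RTOS enables. The API names (rtos_task_create and friends) are hypothetical placeholders for illustration only, not the actual µ-velOSity API.

```c
#include <stdint.h>

/* Hypothetical small-RTOS API -- illustrative only, not the µ-velOSity API. */
typedef void (*task_entry_t)(void);
extern int  rtos_task_create(task_entry_t entry, uint8_t *stack,
                             uint32_t stack_size, uint8_t priority);
extern void rtos_task_sleep_ms(uint32_t ms);
extern void rtos_start_scheduler(void);

/* Statically allocated stack keeps everything in on-chip SRAM --
 * no heap, which suits the internal-only memory of small MCUs.   */
static uint8_t bms_stack[512];

/* Periodic battery-monitoring task (placeholder body). */
static void bms_task(void)
{
    for (;;) {
        /* read cell voltages, update state of charge, raise faults ... */
        rtos_task_sleep_ms(10);
    }
}

int main(void)
{
    rtos_task_create(bms_task, bms_stack, sizeof(bms_stack), /*priority=*/3);
    rtos_start_scheduler();   /* hands control to the scheduler; never returns */
    return 0;
}
```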

µ-velOSity is also a perfect complement to the traditional use of AUTOSAR Classic because it can cover the class of applications, memory footprint, performance, and features not well suited for AUTOSAR Classic.

µ-velOSity is perfect for use in:

  • Smart e-fuses
  • Zonal and Domain Controllers
  • Battery Management Systems (BMS)
  • Communication Modules
  • Radar
  • Safety Islands / Safety Checker
  • Traction, Braking, Steering Systems
  • And many more

“The automotive industry is adopting increasingly capable microcontrollers that deliver the safety and performance capabilities it requires, and Arm technology is well-placed to meet this demand,” said Tom Conway, senior director product management, Automotive and IoT Line of Business, Arm. “The Green Hills µ-velOSity real-time operating system is a great addition to the ecosystem, making it easier and more cost-effective for developers to deliver robust and competitive systems based on Arm.”

“Green Hills is pleased to be seeing substantial adoption of its µ-velOSity RTOS in the next generation of vehicle architectures and automotive microcontrollers,” said Dan Mender, VP of business development, Green Hills Software. “With its aggressive cost profile, safety and security certifications and extremely small memory footprint, global OEMs and Tier 1s are achieving on-time deployments with measurable cost savings when using µ-velOSity.”

Availability
The Green Hills µ-velOSity RTOS and support for current processors are available today. An early-access version is being demonstrated in the Green Hills Software booth at Embedded World, Hall 4, Booth 325.

FedEx & FourKites Partner for Supply Chains

FedEx Corp. (NYSE: FDX) and supply chain visibility platform FourKites® announced a strategic alliance that will provide businesses with new, more robust real-time visibility capabilities to help solve their most pervasive supply chain challenges, become more efficient, and unlock new growth opportunities. Events like the COVID-19 pandemic, ongoing geopolitical issues, port congestion, and other global disruptions have revealed the complexities in keeping interconnected supply chains around the world up and running. FedEx and FourKites are collaborating to make supply chains smarter by bringing comprehensive and highly granular visibility into multi-modal and multi-carrier operations with the deep network and rich insights of their combined networks.

FourKites’ real-time visibility platform has integrated intelligence at every point of the supply chain, and currently supports 2.5 million shipments a day, connecting the supply chains of 50% of Fortune 500 companies. Through this collaboration, FourKites will be using its machine learning and artificial intelligence capabilities with data insights from the FedEx network – which reaches more than 220 countries and territories, linking more than 99 percent of the world’s GDP, through more than 16.5 million shipments daily – to create a new end-to-end supply chain intelligence platform called FourKites X. FourKites X will provide tools and insights to help large shippers and logistics providers mitigate the impacts of sustained challenges via a suite of offerings ranging from dynamic planning and pre-shipment, to enhanced visibility and proactive alerts, to supply chain optimization insights.

To support this alliance and the launch of FourKites X, FedEx has made a strategic investment in FourKites.

“This is an exciting collaboration between two industry leaders and innovators coming together to unlock new opportunities for our customers,” said Sriram Krishnasamy, CEO, FedEx Dataworks. “If the last two years have taught us anything, it’s that companies need to work together in order to work smarter and faster. Our collaboration with FourKites creates a data ecosystem that will deliver a new level of predictability and visibility to help businesses build smarter supply chains in today’s unpredictable and complex business environment.”

“We are excited to announce this groundbreaking platform and strategic collaboration with FedEx,” said Mathew Elenjickal, FourKites Founder & CEO. “Our organizations share an unwavering commitment to customer success through strategic innovation. Together, we are working to pave the future of global supply chains, built on a foundation of data and machine learning to deliver new value to those global supply chains.”

FourKites X – A Complete Picture of Global Supply Chain Operations

FedEx Dataworks, a business unit within FedEx focused on making supply chains smarter through a powerful data science approach and machine learning capabilities, will support FourKites’ development of the new platform, FourKites X, which will help give customers deeper actionable insights, more accurate ETAs, and more intelligent supply chains to reduce supply chain volatility and improve top-line growth.

The platform is being designed to help large shippers and logistics providers identify areas of opportunity, such as:

  • How chief supply chain officers can improve planning to address empty shelves and manufacturing slowdowns
  • How companies can use data to turn supply chain crises into chances to improve the customer experience
  • How chief financial officers can optimize supply chains for growth and efficiency

“When it comes to supply chain data, more is always better,” said Steve Banker, Vice President, Supply Chain Services, at ARC Advisory Group. “The collaboration between FourKites and FedEx is exceptional in both the volume of data that it will aggregate, and in the degree to which it could improve predicted times of arrival, planning, and more. FourKites X can be a big step forward for this market.”

Customers of FourKites X will be able to integrate the platform with existing systems, and they will be able to receive new capabilities in a modular way with support from data engineers, data scientists, and user experience experts. The FourKites X solution will include pre-built applications and an intuitive dashboard, which will allow customers to unlock things like pre-shipment weather advisories and supply chain insights.

TankU Joins NVIDIA Metropolis

TankU, an innovative AI company that is reshaping the world of automotive services, announced it has joined NVIDIA Metropolis — a partner program, application framework, and set of developer tools that bring to market a new generation of vision AI applications that make the world’s most important spaces and operations safer and more efficient.

TankU provides an optimized platform for harnessing the power of computer vision and artificial intelligence. Utilizing NVIDIA TensorRT and DeepStream SDK combined with the NVIDIA Jetson edge AI platform powering the latest generation of its smart video processors, TankU’s ground-breaking technology delivers a wide range of invaluable insights for service providers — automating and customizing services such as refueling, EV charging, car washing, collection of deliveries and more.
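For context on how such a vision pipeline is typically assembled, the minimal sketch below builds a DeepStream-style GStreamer pipeline in C, in which the nvinfer element runs a TensorRT-optimized detector on a Jetson camera feed. The element chain and the detector_config.txt path are generic assumptions drawn from standard DeepStream usage, not TankU’s production software.

```c
#include <gst/gst.h>

/* Minimal DeepStream-style pipeline sketch: camera -> batcher -> TensorRT
 * inference (nvinfer) -> on-screen display. The config path is a placeholder. */
int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    GError *err = NULL;
    GstElement *pipeline = gst_parse_launch(
        "nvarguscamerasrc ! nvvideoconvert ! "
        "mux.sink_0 nvstreammux name=mux batch-size=1 width=1280 height=720 ! "
        "nvinfer config-file-path=detector_config.txt ! "
        "nvvideoconvert ! nvdsosd ! nveglglessink",
        &err);
    if (!pipeline) {
        g_printerr("Pipeline creation failed: %s\n", err->message);
        g_error_free(err);
        return -1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    /* Run until an error or end-of-stream message arrives on the bus. */
    GstBus *bus = gst_element_get_bus(pipeline);
    GstMessage *msg = gst_bus_timed_pop_filtered(
        bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
    if (msg)
        gst_message_unref(msg);

    gst_object_unref(bus);
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}
```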

NVIDIA Metropolis makes it easier and more cost-effective for enterprises, governments and integration partners to use world-class AI-enabled solutions to address critical operational efficiency and safety problems. The NVIDIA Metropolis ecosystem contains a large and growing breadth of partners who are investing in the most advanced AI techniques and the most efficient deployment platforms, and who take an enterprise-class approach to their solutions. Partners have the opportunity to gain early access to NVIDIA platform updates to further enhance and accelerate their AI application development efforts. Further, the program offers partners the opportunity to collaborate with industry-leading experts and other AI-driven organizations.

“In our ongoing endeavor to transform the customer interaction with vehicle services, we are always seeking new and innovative experiences. TankU is proud of its technological collaboration with NVIDIA — the global leader in AI technology platforms — and looking forward to expanding our work by joining the NVIDIA Metropolis program,” said Dan Valdhorn, founder and CEO of TankU.

About TankU

TankU utilizes the existing security cameras, various data sources and the power of AI to provide a personalized user experience and operational excellence.

Vehicle service providers used to have a personal relationship with their clients – a relationship that has been lost. Unlike the attendant of the past, the fuel dispenser does not know whether the driver is in a good mood or angry, or whether they are on their way to work or to the supermarket. TankU’s artificial intelligence solutions make these interactions more personal and are commercially deployed with several customers worldwide.

IAR Systems Extensions

IAR Systems®, the world leader in software and services for embedded development, presented Visual Studio Code extensions for IAR Systems embedded software development solutions. Now available in the Visual Studio Code Marketplace, these extensions enable developers to work in Visual Studio Code while taking advantage of the powerful capabilities of IAR Systems’ software solutions specialized for embedded systems.

For many years, IAR Systems and Azure RTOS have delivered the highest level of product integration, including the Azure RTOS ThreadX kernel integration in the IAR Embedded Workbench debugger. This state-of-the-art debugger integration with Microsoft’s Embedded Tools extension includes the ability to view all Azure RTOS ThreadX objects, set thread-specific breakpoints, view suspended threads’ call stacks, and view the unique execution profile and performance monitoring features in Azure RTOS ThreadX. The new IAR Systems extensions bring the same high level of integration to the Visual Studio Code community. In addition, this gives IoT developers a seamless development experience from prototyping to delivered product, enabling a fully GitHub-based automated development workflow.
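As an illustration of the kind of kernel object this debugger integration exposes, the following minimal sketch creates a single Azure RTOS ThreadX thread using the public ThreadX API; the thread name, stack size and priority are arbitrary example values.

```c
#include "tx_api.h"

#define DEMO_STACK_SIZE 1024

static TX_THREAD sensor_thread;            /* ThreadX thread control block  */
static UCHAR     sensor_stack[DEMO_STACK_SIZE];

/* Thread entry: placeholder periodic work. */
static void sensor_thread_entry(ULONG input)
{
    (void)input;
    while (1) {
        /* sample sensors, push results to a queue, etc. */
        tx_thread_sleep(10);               /* sleep 10 timer ticks          */
    }
}

/* ThreadX calls this after the kernel starts; create application threads here. */
void tx_application_define(void *first_unused_memory)
{
    (void)first_unused_memory;
    tx_thread_create(&sensor_thread, "sensor thread",
                     sensor_thread_entry, 0,
                     sensor_stack, DEMO_STACK_SIZE,
                     /*priority=*/16, /*preempt_threshold=*/16,
                     TX_NO_TIME_SLICE, TX_AUTO_START);
}

int main(void)
{
    tx_kernel_enter();                     /* enters the ThreadX kernel; never returns */
    return 0;
}
```

A debugger integration such as the one described above would then let the developer list this thread, set a breakpoint scoped to it, and inspect its suspended call stack.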

“IAR Systems has been a great partner with Microsoft as we’ve been working on Azure RTOS for embedded applications. And now, we’re excited to bring the capabilities in the IAR Embedded Workbench to the millions of developers using Visual Studio Code. We look forward to seeing what developers build with this technology”, said Amanda Silver, Corporate Vice President at Microsoft.

“The Visual Studio extensions from IAR Systems open a bridge between the leading general code development environment and the leading embedded development environment to make the sum greater than the parts”, commented Anders Holmberg, CTO, IAR Systems. “This gives our mutual users the best of both worlds, enabling users to develop the next generation of smart embedded devices for a wide range of different use cases and workloads in the most efficient way possible.”

The Visual Studio Code extensions from IAR Systems are compatible with all the latest versions of IAR Embedded Workbench and IAR Build Tools. In addition, the extensions can be used alongside other build systems, such as CMake, as well as source control and versioning extensions such as those for GitHub.

“The new Visual Studio Code extensions on GitHub and Marketplace will make a difference on how IAR Systems interact with and potentially expand our audience, but also accelerate our expertise in code excellence, as we interact with new developers sharing knowledge and practice”, said Anders Holmberg, CTO, IAR Systems.

GM Invests in UVeye

UVeye, a provider of advanced vehicle diagnostic systems, announced that it has received an investment from the venture capital arm of General Motors, GM Ventures, to help fund the development and commercialization of the company’s vehicle inspection technology.

UVeye also has entered into a commercial agreement with General Motors to explore the expansion of UVeye’s automated high-speed systems to GM dealerships throughout various markets.

Serving as the venture capital arm of General Motors, GM Ventures strategically invests in startup companies that share GM’s enterprise vision of an all-electric, hands-free, and more seamlessly connected future, and are helping to position GM as a leading transportation technology enterprise.

As part of the strategic collaboration agreement, the two companies have agreed to work on a variety of vehicle-inspection technology projects involving used-car auctions, fleet operations and automotive dealership sales. In the future, UVeye plans to incorporate electric-vehicle and autonomous-driving platforms into its inspection databases as well.

UVeye systems use artificial intelligence, machine-learning and high-definition camera technologies to quickly and accurately check tires, underbody components and vehicle exteriors for defects, missing parts and other safety-related issues.

More than 4,000 GM dealerships will be eligible to purchase the vehicle-inspection equipment to use in their service lanes. The team will also explore applications for extending the technology to exterior scans and photography to generate online interest and potential sales for used vehicles.

“We are on a journey to create the best customer service experience possible and the implementation of UVeye into our dealership service lanes helps us do that,” said John Roth, GM global vice president, Customer Care and Aftersales. “Providing real-time, consistent and accurate feedback to our customers will help us ensure they are getting the best performance out of their vehicle.”

Amir Hever, UVeye’s CEO and co-founder, noted that automated inspection processes take seconds to complete and are significantly more accurate than time-consuming manual inspections commonly in use today.

UVeye currently has facilities in North America, Europe and the Asia Pacific region, including offices in Israel, Japan, Germany and the United States. The company has formed strategic partnerships with numerous dealership groups, used-car auctions and vehicle fleets since it was founded in 2016.

GM dealerships have access to three high-speed UVeye systems that utilize a unique combination of proprietary algorithms, cloud architecture, artificial intelligence, machine learning and sensor-fusion technologies. They include:

  • Helios – An underbody scanner that detects a wide variety of problems including frame damage, missing parts and fluid leaks, as well as brake and exhaust-system issues.
  • Artemis – A system that checks tire quality. Within seconds it identifies tire brand, technical specifications, air pressure, tread depth, sidewall damage, alignment issues and whether or not a vehicle’s tires are mismatched.
  • Atlas – A 360-degree detection system that checks sheet metal and other external body components such as bumpers, door locks, grilles and windows.

Hever believes that UVeye shares a common vision with General Motors for improved service quality that can benefit dealers, service technicians, and customers alike.

“High-speed inspection equipment can serve as tools of empowerment for new- and used-car dealers,” Hever said. “We very much look forward to working with GM in the months and years ahead. Both companies share the same vision and sense of innovation and when it comes to vehicle quality, the future is a bright one.”

Early implementations of UVeye’s technology at a limited number of GM dealers in North America are already yielding positive results. As the collaboration continues, the two companies will look to expand the applications of the technology across GM’s global dealer network, enhancing the robustness of real-time vehicle diagnostics and creating a more streamlined exchange of information between customers, their vehicles and service technicians.

Mercedes-Benz Partners with ZYNC for Streaming Entertainment

Mercedes-Benz Group AG has agreed to a partnership with ZYNC. The California‑based tech company will provide the world-first implementation of its premium in-car digital entertainment platform for Mercedes-Benz. By aggregating a wide range of owned and 3rd-party digital content onto a single turnkey platform, ZYNC will provide the platform as well as the interface between content partners and existing compatible Mercedes-Benz hardware. The aim of the partnership is to provide customers with a seamless digital entertainment experience tailored to the unique environment inside a Mercedes-Benz, while maximising the benefits of the company’s UI/UX technologies such as the MBUX Hyperscreen. Through ZYNC, Mercedes-Benz customers will be able to access a wide range of renowned third-party global and local streaming services. The first application of ZYNC is planned for the end of this year in the EQS and S-Class in Europe, with rollout in further models and markets planned for 2023.

Be it news, sports, shows or films, the majority of people consume their favourite streamed content most of the time via their mobile device or television. However, the environment inside a Mercedes-Benz presents a specific array of conditions that offer the chance to provide an immersive cinematic experience of audio‑visual content that goes far beyond merely facilitating playback. Those conditions range from the size, format and position of screens and the bespoke arrangement of speakers to the type of content and the way it is navigated. By focusing exclusively on the circumstances inside the car, the ZYNC platform integrates seamlessly with Mercedes-Benz hardware as well as current and future operating systems to deliver content in a way that maximises the audio-visual sensation, engagement and ease-of-use.

ZYNC provides video streaming, on-demand content, interactive experiences, local video programming, sports, news, gaming and much more through a unified user interface. Over 30 streaming services are available from high-end global, regional and local partners, with additional partners and channels continuously integrated over the air. Most of these channels are included and do not need an individual subscription. The prerequisite for ZYNC is an active Mercedes-Benz me account with the MBUX Entertainment Package, which is currently free of charge for one year from booking and can then be extended for a fee in the Mercedes me portal (country-specific deviations possible).

“We want to offer our customers a unique entertainment experience in their vehicles. Technologically, this is only achievable through the interaction of the best hardware and software. Whether our ENERGIZING Comfort programs, the innovative MBUX Hyperscreen or Dolby Atmos – a Mercedes-Benz is unmistakable and can be experienced with multiple senses. Access to diverse video streaming offerings through ZYNC will soon take our holistic luxury promise to the next level. This underlines our claim to ‘Lead in Car Software’,” says Markus Schäfer, Member of the Board of Management of Mercedes-Benz Group AG, Chief Technology Officer, responsible for Development and Purchasing.

“Through our partnership with ZYNC, Mercedes-Benz aims to unleash the full potential of in-car entertainment via benchmark-setting hardware like the MBUX Hyperscreen by transforming it into a portal that expands the MBUX digital experience beyond the car. As well as even smoother, more direct and personalised access to owned and third-party content such as ZYNC provides, augmentation with interactive services will help us further deepen customer engagement and dialogue,” adds Magnus Östberg, Chief Software Officer, Mercedes-Benz AG.

The partnership with ZYNC leverages and maximizes the inherent benefits of the advanced UI/UX of Mercedes-Benz infotainment systems, such as the MBUX Hyperscreen. The aggregation of content means 3rd‑party partners integrate once into the ZYNC platform. The highly scalable cloud-native platform is optimised for low latency on Mercedes-Benz vehicle hardware.

In real terms, this means customers will be able to access a constantly growing array of streaming services. The page layouts are optimised to make the best possible use of the visual quality and resolution provided by the specific infotainment system and its screens. In line with Mercedes-Benz’s software strategy, the ZYNC platform offers best-in-class in-vehicle video playback, including adaptive bitrates and minimal on-board caching requirements. Meanwhile, the ZYNC platform itself is compatible with multiple Digital Rights Management (DRM) formats.

At the core of all Mercedes-Benz developments in in-car entertainment and visual content is road safety. The partnership with ZYNC is no exception. When the vehicle is in motion, streaming services are restricted, e.g. to passenger screens or audio only in accordance with local market regulations. Looking to the future, Mercedes-Benz is prepared for the use of video streaming while the car is in motion where approval exists for Level 3 conditionally automated driving.

ROHM Intros Automotive Camera Modules for ADAS

ROHM has announced the availability of ISO 26262 and ASIL-B compliant PMICs, the BD868xxMUF-C (BD868C0MUF-C, BD868D0MUF-C), for the automotive camera modules that are increasingly being adopted in ADAS (Advanced Driver Assistance Systems).

The continuing evolution of ADAS in recent years has increased the number of onboard cameras. At the same time, introducing the concept of functional safety is taking on greater importance, as the failure of even one camera can lead to a serious accident, making it imperative for manufacturers of cars and vehicle components – including semiconductor suppliers – to comply with the international safety standard ISO 26262.

In 2018, ROHM achieved ISO 26262 development process certification from the German certification body TÜV Rheinland, and in 2021 it launched the ComfySIL™ brand to contribute to the safety, security, and comfort of users and systems through products that support functional safety. As part of the ComfySIL™ series, these ICs are classified as ‘FS (Functional Safety) process compliant’ products (the highest grade), indicating compliance with the ISO 26262 standard.

Meeting the strict requirements for functional safety allows these latest products to facilitate safety design in next-generation vehicles equipped with ADAS. Moreover, the four power supplies (three DC/DC converters and one LDO) required for automotive cameras are integrated into a 3.5mm × 3.5mm package, achieving the industry’s smallest size among comparable camera PMICs. The ICs are equipped with anomaly notification functions, such as abnormal voltage detection, with status feedback via I2C. This reduces the component count by three compared with former solutions, resulting in a 25% smaller mounting area than conventional solutions and contributing to smaller vehicle cameras. (The 25% figure is an example; further miniaturization is possible by optimizing for individual applications.) At the same time, a wide range of output voltage and sequence control settings can be configured to meet the varying requirements of CMOS image sensors from different manufacturers, simplifying development considerably.
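As a rough sketch of how such an I2C status-feedback scheme might be consumed on the host microcontroller, the snippet below polls a fault-status register. The I2C address, register offset, bit masks and the i2c_read_reg helper are hypothetical placeholders for illustration, not values from the BD868xxMUF-C datasheet.

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical register map and I2C helper -- placeholders for illustration,
 * not values from the BD868xxMUF-C datasheet.                                */
#define PMIC_I2C_ADDR        0x30u
#define PMIC_REG_FAULT_STAT  0x10u
#define FAULT_UV_MASK        0x01u   /* under-voltage on one of the rails */
#define FAULT_OV_MASK        0x02u   /* over-voltage                      */

/* Platform-provided I2C read (assumed to exist in the host MCU's HAL). */
extern int i2c_read_reg(uint8_t dev_addr, uint8_t reg, uint8_t *value);

/* Poll the PMIC fault-status register and report whether the camera
 * module's supplies are healthy.                                       */
bool camera_pmic_supplies_ok(void)
{
    uint8_t status = 0;

    if (i2c_read_reg(PMIC_I2C_ADDR, PMIC_REG_FAULT_STAT, &status) != 0)
        return false;                       /* bus error: treat as a fault */

    if (status & (FAULT_UV_MASK | FAULT_OV_MASK)) {
        /* Abnormal voltage detected: signal the safety mechanism so the
         * camera's data can be flagged or the module power-cycled.      */
        return false;
    }
    return true;
}
```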