In autonomous and self-driving vehicle news are Uber, Waymo, May Mobility, ZF, the Indy Autonomous Challenge, Honda, SiLC, Cyngn, Vention & STRADVISION.
Uber & Waymo Partner for Rides in Phoenix, Austin & Atlanta
Uber Technologies, Inc. (NYSE: UBER) and Waymo LLC announced an expansion of their existing partnership to make the Waymo Driver available to more people via Uber.
Beginning in early 2025, Waymo and Uber will bring autonomous ride-hailing to Austin and Atlanta, only on the Uber app. In these cities, Uber will manage and dispatch a fleet of Waymo’s fully autonomous, all-electric Jaguar I-PACE vehicles that will grow to hundreds over time. Riders who request an UberX, Uber Green, Uber Comfort, or Uber Comfort Electric may be matched with a Waymo for qualifying trips.
“We’re thrilled to build on our successful partnership with Waymo, which has already powered fully autonomous trips for tens of thousands of riders in Phoenix,” said Dara Khosrowshahi, CEO of Uber. “Soon, riders in Austin and Atlanta will be able to experience that same mobility magic, through a new fleet of dedicated autonomous Waymo vehicles, available only on Uber.”
“Waymo’s mission is to be the world’s most trusted driver, and we’re excited to launch this expanded network and operations partnership with Uber in Austin and Atlanta to bring the benefits of fully autonomous driving to more riders,” said Tekedra Mawakana, co-CEO, Waymo. “We’ve been delighted at the positive feedback from our Waymo One riders to date, and we can’t wait to bring the comfort, convenience, and safety of the Waymo Driver to these cities in partnership with Uber.”
Through this expanded partnership, Uber will provide fleet management services including vehicle cleaning, repair, and other general depot operations. Waymo will continue to be responsible for the testing and operation of the Waymo Driver, including roadside assistance and certain rider support functions.
May Mobility Launches AD Service PRESTO in Contra Costa County
Contra Costa Transportation Authority (CCTA) and May Mobility, an autonomous driving (AD) technology company, today launched PRESTO, a shared autonomous vehicle (AV) service in Martinez, California, serving both Contra Costa Regional Medical Center (County Hospital) patients and the general public. A goal of the service is to improve access to healthcare in the area by providing another reliable and convenient mode of transportation.
The free service will run Monday through Friday, first offering County Hospital patients rides from 2 p.m. to 6 p.m. and then opening to the general public from 6 p.m. to 10 p.m. Patients can book rides to or from the County Hospital and to a preferred pharmacy from a set list of locations by calling (925) 995-3797 or by arranging travel with a hospital representative.
From 6 p.m. to 10 p.m., the service will open to all residents in the area. Those interested can book rides to a list of designated stops within the service zone by using the May Mobility app, powered by transit tech leader Via, or by calling (925) 995-3797. In addition to stops at the County Hospital healthcare facilities and local pharmacies, the service zone includes stops that help connect Martinez residents to their community, including residential areas, shopping districts and downtown Martinez.
“CCTA is once again making history here in Northern California by offering a free, autonomous vehicle transportation service to City of Martinez residents,” said CCTA Chair Newell Arnerich. “We established the County Hospital element to aid patients who may need rides in the afternoon to resources in the community. As an added benefit to residents, during the evening hours, the service will carry passengers to an array of destinations in the City of Martinez.”
May Mobility equipped a fleet of seven Toyota Sienna Autono-MaaS vehicles with its patented Multi-Policy Decision Making (MPDM) AD technology. MPDM uses in-situ AI to learn in real-time by imagining thousands of “what-if” scenarios every second while it drives and then commits to the safest and most comfortable maneuvers within milliseconds. Each shared AV seats five passengers and all will have an attendant on board to answer questions and assist with passenger entry and exit if needed. Three of the AVs are also wheelchair accessible, with an ADA-compliant wheelchair ramp allowing entry and exit via the rear of the vehicle. The wheelchair-accessible vehicles seat up to three passengers, including one wheelchair user.
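At its core, the MPDM idea can be sketched as a simple loop: enumerate candidate high-level policies, forward-simulate each against possible "what-if" scenarios, and commit to the policy with the best safety and comfort outcome. The Python below is a hypothetical, heavily simplified illustration; the policy names, scoring model, and tie-breaking rule are all assumptions for this sketch, not details of May Mobility's proprietary system:

```python
# Toy multi-policy decision making loop (illustrative assumptions only).

POLICIES = ["keep_lane", "slow_down", "stop"]

def simulate(policy, scenario):
    """Toy forward simulation: return (safety, comfort) scores in [0, 1]."""
    # Assumed scoring: stopping is safest near a pedestrian but least comfortable.
    if scenario == "pedestrian_ahead":
        safety = {"keep_lane": 0.1, "slow_down": 0.7, "stop": 1.0}[policy]
    else:  # clear road
        safety = {"keep_lane": 1.0, "slow_down": 0.9, "stop": 0.8}[policy]
    comfort = {"keep_lane": 1.0, "slow_down": 0.6, "stop": 0.2}[policy]
    return safety, comfort

def choose_policy(scenarios):
    """Pick the policy with the best worst-case safety, breaking ties on comfort."""
    best, best_key = None, None
    for policy in POLICIES:
        results = [simulate(policy, s) for s in scenarios]
        worst_safety = min(r[0] for r in results)
        avg_comfort = sum(r[1] for r in results) / len(results)
        key = (worst_safety, avg_comfort)  # tuple comparison: safety first
        if best_key is None or key > best_key:
            best, best_key = policy, key
    return best
```

A production system would run this loop many times per second over sampled behaviors of every nearby road user; the sketch only conveys the evaluate-then-commit structure.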
“May Mobility is dedicated to filling gaps in public transportation. With the PRESTO shared AV pilot in Martinez, we’re redefining how communities connect and move,” said May Mobility Chief Commercial Officer Manik Dhar. “We’re excited to see how our patented MPDM technology will service local residents and contribute to the broader adoption of AVs.”
CCTA and May Mobility have also partnered with County Connection, which provides fixed-route and paratransit bus service for communities in Central Contra Costa County.
“County Connection has partnered with our paratransit contractor to provide union drivers to serve as safety stewards in each Martinez ADS vehicle,” said County Connection General Manager Bill Churchill, “underscoring our commitment to innovative transportation deployments ensuring everyone has the freedom to travel safely and independently.”
Passenger feedback will play a critical role in shaping the future of autonomous mobility, as data from the City of Martinez PRESTO pilot will be used by federal transportation officials to advance standards in automated mobility. The pilot is funded by a grant from the U.S. Department of Transportation. Martinez marks the third location where CCTA has introduced an autonomous vehicle pilot program.
To learn more about PRESTO, visit https://ridepresto.com/martinez/
Automated driving: ZF and Infineon use AI algorithms to optimize software and control units for driving dynamics
Trucks driving automatically behind one another on the highway ("platooning"), or cars changing lanes automatically: in these scenarios, vehicle movements have to be calculated and executed precisely and quickly without a human driver. Software and AI algorithms safely control the drive, brakes, front and rear wheel steering and damping systems. The more efficient the AI algorithms are, the better the available computing power can be used.
The ZF Group and Infineon Technologies AG (FSE: IFX / OTCQX: IFNNY) have jointly developed and implemented AI algorithms for the development and control of vehicle software as part of the EEmotion project. The project was co-funded by the German Federal Ministry for Economic Affairs and Climate Action. The AI algorithms developed in the project, proven in a test vehicle, control and optimize all actuators during automated driving according to the specified driving trajectory.
ZF has added AI algorithms to its two existing software solutions cubiX and Eco Control 4 ACC, which have been implemented on Infineon’s AURIX™ TC4x microcontroller (MCU) with integrated Parallel Processing Unit (PPU). The result: more efficient artificial intelligence algorithms and better utilization of computing power. This in turn leads to better driving performance and increased driving safety. Compared to conventional approaches without AI, the two companies have now proven that their solution can, for example, carry out automated lane changes much more accurately. The energy efficiency of driver assistance systems such as Adaptive Cruise Control has also been increased. The improved driving performance combined with lower computing power demands paves the way for cost-efficient Level 2+ assistance systems.
“The EEmotion funding project shows that our artificial intelligence-based algorithms provide our customers with new advantages: AI makes it possible for products to be equipped with new functions and to be developed faster and more efficiently,” said Torsten Gollewski, Head of Research and Development at ZF.
“With our world-leading semiconductor products, software and services, Infineon enables customers to develop their own AI applications,” said Peter Schiefer, President of Infineon’s Automotive Division. “Our AURIX TC4x is ideally suited for in-car AI applications because its Parallel Processing Unit enables the fast and parallel processing of data that is essential for artificial intelligence, heralding the next stages of automated and ultimately autonomous driving.”
“The EEmotion project successfully integrated artificial intelligence into the safety-critical functions of the vehicle control system; this was validated on the software side, making further progress towards highly automated driving possible,” said Ernst Stoeckl-Pukall, Head of the Digitization and Industry 4.0 department at the Federal Ministry for Economic Affairs and Climate Action. “The project has thus provided important impetus that has strengthened the innovative power and competitiveness of the German automotive industry.”
Software-based chassis control optimized with AI
ZF’s cubiX software makes it possible to control all chassis components in passenger cars and commercial vehicles. This includes longitudinal and lateral dynamics as well as the vehicle’s vertical dynamics. In addition, the Eco Control 4 ACC predictive cruise control system is being further developed using a computationally intensive optimization algorithm and model-predictive control to achieve as much as 8 percent more range under real driving conditions. The EEmotion project has also developed AI algorithms which are applied as early as during the development phase. This allows more efficient design of vehicle software which can then be made available to customers more quickly. The accelerated and AI-supported application of vehicle software offers vehicle manufacturers clear advantages when adapting to different vehicle models.
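As a rough illustration of what such predictive optimization does, the toy planner below picks a speed for each upcoming road segment to minimize a made-up energy cost while holding a minimum average speed. Everything here (the cost model, the discrete speed set, the brute-force search) is an assumption for illustration and far simpler than ZF's actual model-predictive controller:

```python
# Toy predictive cruise control: choose per-segment speeds that minimize an
# assumed energy cost, subject to a minimum average speed. Illustrative only.
from itertools import product

def plan_speeds(grades, speeds=(20.0, 25.0, 30.0), min_avg=25.0):
    """Brute-force the lowest-cost speed profile over the given road grades."""
    def energy(v, grade):
        # Assumed per-segment cost: aero term ~ v^2, inflated on uphill grades.
        return 0.01 * v**2 * (1.0 + 10.0 * max(grade, 0.0))

    best_profile, best_cost = None, float("inf")
    for profile in product(speeds, repeat=len(grades)):
        if sum(profile) / len(profile) < min_avg:
            continue  # violates the minimum average-speed constraint
        cost = sum(energy(v, g) for v, g in zip(profile, grades))
        if cost < best_cost:
            best_profile, best_cost = profile, cost
    return best_profile
```

With this cost model the planner slows down on the uphill segment and makes up time on the flats, which is the qualitative behavior predictive cruise control exploits to extend range.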
Microcontroller from Infineon enables use of AI algorithms
Even lean, AI-based algorithms require substantial computing power, which makes it advisable to run them on high-performance microcontrollers such as the AURIX TC4x. Infineon’s AURIX TC4x microcontrollers offer high real-time performance and implement the latest trends in AI modelling, virtualization, functional safety, cybersecurity and networking functions. They pave the way for new E/E architectures and the next generation of Software-Defined Vehicles (SDV). An important part of the AURIX TC4x is the Parallel Processing Unit, which supports powerful AI applications thanks to fast and parallel data processing.
About the EEmotion project
The EEmotion project’s objective was to develop an AI algorithm-based control system for automated driving that ensures more precise trajectory control in various driving situations. The implementation of the project included the definition of the requirements for the AI-based functions, the development of an overall concept and corresponding hardware as well as developing the integration of AI in control architectures for safety-critical applications. It also took aspects such as the development of secure AI-monitored communication and the investigation of the simulative development and validation of vehicle dynamics systems into account. Infineon Technologies AG acted as the consortium coordinator for the project, with a total volume of 10.4 million euros, 59 percent of which was funded by the German Federal Ministry for Economic Affairs and Climate Action. The project ran from September 2021 to August 2024 and included partnerships with ZF Friedrichshafen AG, b-plus technologies GmbH, samoconsult GmbH, RWTH Aachen University and the Universität zu Lübeck.
Indy Autonomous Challenge Winners
On Friday, September 6, 2024, the Indy Autonomous Challenge (IAC), a global leader in high-speed autonomy, held the seventh edition of its autonomous racing competition, returning to the famed oval at Indianapolis Motor Speedway and setting new world autonomous speed records for top speed on a racetrack (184 mph / 296 kph) and fastest passing overtake (180 mph / 290 kph). PoliMOVE-MSU (Politecnico di Milano, Michigan State University) won the passing competition with a record-setting overtake, and Cavalier Autonomous Racing (University of Virginia) emerged victorious with world record speed in the time trials competition. These two teams delivered remarkable performances powered by cutting-edge AI driver software. Their success in outpacing eight other competitive university teams highlighted the exciting future of automotive innovation.
Honda Invests in SiLC Tech
SiLC Technologies, Inc. (SiLC), announced it has received an investment from Honda to develop next-generation FMCW LiDAR solutions for all types of mobility. SiLC is the leading developer of integrated, single-chip FMCW (Coherent) LiDAR solutions and focuses on enabling advanced AI-based machine vision. Honda invests in innovative startups through its global open innovation program, Honda Xcelerator Ventures. This program is led by Honda Innovations, a subsidiary of Honda Motor Co., Ltd.
“SiLC is the industry leader in the research and development of FMCW LiDAR, which is capable of detecting vehicles and various obstacles from long distances – and Honda has high expectations for its potential,” said Manabu Ozawa, Managing Executive Officer, Honda Motor Co., Ltd. “Honda is striving for zero traffic collision fatalities involving our motorcycles and automobiles globally by 2050. We believe that SiLC’s advanced LiDAR technology will become an important technology for us. Honda continues to discover, collaborate with and invest in innovative startups like SiLC through our global open innovation program, Honda Xcelerator Ventures.”
Paving the Way to Safer Roads and Communities
Equipping vehicles with advanced AI vision capabilities that autonomously detect hazards, predict movements and prevent accidents can significantly enhance safety, efficiency, and reliability. This paves the way for a future where autonomous transportation becomes the norm, reducing traffic congestion and minimizing human error on the roads. Current technologies, however, fall short of these critical capabilities: Advanced Driver Assistance Systems (ADAS), for example, can fail to detect objects at longer distances in diverse environments. Rapid detection of moving objects and early determination of their intent enables reaction times fast enough to make the critical split-second difference between life and death.
To ensure safety and autonomy in all situations, the deployed vision systems must be powerful, efficient in computation, compact, scalable, and resilient to complex, unpredictable conditions, including interference from other systems. These requirements exclude existing time-of-flight-based solutions and highlight FMCW LiDAR as the ideal platform.
The FMCW LiDAR Advantage
SiLC is tackling several critical challenges in vehicle vision technology using high-performance FMCW LiDAR. Current radar and 2D/3D vision systems struggle with detecting dark objects, such as tires, at distances of a couple of hundred meters. They also have difficulty distinguishing between slow-moving and stationary vehicles for effective evasive actions. Most time-of-flight LiDARs suffer from interference caused by sunlight, retro reflectors, and other LiDAR systems. SiLC’s FMCW LiDAR overcomes these obstacles by detecting objects at distances of a kilometer or more with precise distance measurements and can measure the velocity of objects, allowing the system to predict their movements accurately.
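The velocity measurement comes from coherent detection: with a triangular frequency chirp, the up-sweep and down-sweep beat frequencies encode range and Doppler shift jointly, and simple arithmetic separates them. The sketch below shows the textbook recovery step; the sign convention and the parameter values in the usage are assumptions for illustration, not SiLC specifics:

```python
# Recover range and radial velocity from triangular-chirp FMCW beat frequencies.
C = 299_792_458.0  # speed of light, m/s

def fmcw_range_velocity(f_beat_up, f_beat_down, bandwidth, sweep_time, wavelength):
    """Return (range in m, closing velocity in m/s) from the two beat frequencies.

    Convention assumed here: an approaching target lowers the up-chirp beat
    and raises the down-chirp beat by the same Doppler shift.
    """
    f_range = (f_beat_up + f_beat_down) / 2.0    # range-induced component
    f_doppler = (f_beat_down - f_beat_up) / 2.0  # Doppler-induced component
    rng = C * sweep_time * f_range / (2.0 * bandwidth)
    vel = wavelength * f_doppler / 2.0           # positive = closing
    return rng, vel
```

Because range and velocity fall out of the same coherent measurement, an FMCW system can distinguish a slowly rolling vehicle from a stopped one in a single frame, which time-of-flight sensors must infer by differencing positions across frames.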
As the global machine vision and robotics market continues to expand, SiLC’s innovations are poised to play a pivotal role in shaping the future of autonomous technology. The company’s Eyeonic Sensor and Vision System represent significant advancements in machine vision, providing machines with the information necessary to navigate and interact with the physical world effectively. These systems leverage FMCW LiDAR technology to equip machines with super-human vision capabilities, addressing the critical need for accurate, real-time perception across sectors and enabling the dexterity and hand-eye coordination that demanding tasks require. SiLC is a performance leader in the market and offers the industry’s longest-range LiDAR and highest level of precision, setting a new benchmark in the field.
Dr. Mehdi Asghari, SiLC Technologies CEO, added, “This investment from Honda, the world’s largest mobility vehicle manufacturer, will accelerate our progress toward a society with fully autonomous solutions that enhance our safety and address our widely spread labor shortages across many critical markets. Our silicon photonics platform offers a powerful, low-cost, efficient and scalable FMCW LiDAR engine, which is essential for the high volumes required by the automotive industry. SiLC’s FMCW LiDAR solutions bring superior vision to machines to truly enable the next generation of AI-based automation and move us closer to a more intelligent, interconnected future.”
About SiLC Technologies
On a mission to enable machines to see like humans, SiLC Technologies is enabling advanced AI-based machine vision. Drawing on its deep expertise in silicon photonics, the company deploys FMCW LiDAR-based vision solutions. Notably, its breakthrough 4D+ Eyeonic Chip integrates all photonics functions needed to enable a coherent vision sensor, offering a tiny footprint while addressing the need for low cost and low power. SiLC’s innovations target robotics, mobility, perimeter security, industrial automation and other leading markets.
SiLC was founded in 2018 by silicon photonics industry veterans with decades of commercial product development and manufacturing experience. SiLC’s 4D LiDAR chip has been recognized by Frost & Sullivan as ideally positioned to disrupt the global LiDAR market and the company has been named a Cool Vendor in Silicon Photonics by Gartner. Investors in SiLC include Dell Technologies Capital, Sony Innovation Fund by IGV, FLUXUNIT – ams OSRAM Ventures, Alter Venture Partners and Epson.
Cyngn DriveMod Outdoors
Cyngn Inc. (the “Company” or “Cyngn”) (Nasdaq: CYN) today announced that its AI-powered autonomous driving solution, DriveMod, will be able to operate in outdoor environments. Organizations will be able to send the DriveMod Tugger on missions that go indoors and outdoors, giving facility managers even more opportunity to automate repetitive workflows and shift employees over to more interesting, higher-value tasks.
As modern warehouses and manufacturing complexes grow ever larger—many surpassing 200,000 square feet and spanning multiple buildings—the challenge of efficiently moving materials from point A to point B becomes increasingly complex. By extending DriveMod’s capabilities outdoors, Cyngn provides organizations with a solution that eliminates bottlenecks in material movement, from transporting goods between outdoor storage areas to facilitating smoother transitions across multi-building facilities.
“One of the biggest pain points businesses face is the wasted time and resources involved in transporting materials between buildings,” said Sean Stetson, Cyngn’s VP of Engineering. “This time-consuming task ties up equipment and pulls workers away from where they’re most needed, resulting in expensive lost productivity. By automating these tasks, companies can eliminate these inefficiencies, shifting workers to other responsibilities.”
Given the costly challenge of moving materials between buildings in a large site, several companies have engaged Cyngn to automate outdoor operations. By integrating customer feedback into our roadmap, Cyngn will address the complex challenge of efficient outdoor material movement, unlocking significant operational potential and cost savings for organizations looking to improve resource utilization and maximize productivity.
The ability to operate outdoors opens new doors for DriveMod users. It empowers facility managers to automate the movement of goods in previously manual outdoor workflows, creating a fully connected system between indoor and outdoor operations. This capability has the potential to unlock cost savings and increase operational efficiency across industries like logistics, manufacturing, and distribution.
“Businesses are asking for more than just indoor efficiency. As a result, this marks a major milestone in broadening our reach and catering to the diverse needs of customers,” said Cyngn Chief Executive Officer, Lior Tal. “The future of automation isn’t just about optimizing indoor spaces; it’s about creating smarter, more flexible solutions that cater to the full spectrum of operational environments.”
Vention MachineMotion AI
Vention, the company behind the cloud-based Manufacturing Automation Platform (MAP), is launching an AI-enabled motion controller, MachineMotion AI, in Chicago today. This third-generation controller, built on NVIDIA accelerated computing, is designed to significantly simplify the development and deployment of robotics applications for manufacturers of all sizes.
The announcement will take place during Vention’s annual product launch event, Demo Day 2024. The event will include a fireside chat featuring Etienne Lacroix, Vention’s CEO and Founder, and Deepu Talla, NVIDIA’s Vice President of Robotics and Edge Computing. The conversation will be moderated by Rob Pegoraro, a contributor to Fast Company, and will focus on the theme of “Artificial Intelligence, Real Manufacturing Use Cases.”
This announcement marks a major step into the post-PLC (programmable logic controller) era. Automated equipment—including robots, conveyors, and computer vision systems—can now be orchestrated by a single controller powering the entire machine, making it truly plug-and-play. By eliminating the traditional divide between robot and PLC programming, this architecture simplifies programming and speeds up the deployment cycle, leading to improved ROI for manufacturers.
MachineMotion AI is compatible with leading robot brands, including Universal Robots, FANUC, and ABB. It delivers up to 3,000W of power and drives up to 30 servo motors via EtherCAT.
Powered by the NVIDIA Jetson platform for edge AI and robotics, MachineMotion AI advances AI-enabled robots with NVIDIA GPU-accelerated path planning and the ability to run 2D/3D perception models trained in synthetic and physical environments.
As the centerpiece of Vention’s hardware and software ecosystem, key MachineMotion AI features include:
- Plug-and-play: Integrates easily with hundreds of motion devices and sensors—from robot arms, conveyors, and actuators to safety devices, computer vision systems, telepresence cameras, and more.
- Compact and robust design: Housed in an award-winning IP54-rated enclosure with passive cooling and a 360° beacon light for status communication.
- Intuitive programming: Supports both code-free and Python programming, either in the cloud with digital twin simulation, or on the edge with the physical twin machine.
- Powerful performance: Delivers up to 3,000W of power and drives up to 30 distributed and daisy-chainable servo motors via EtherCAT.
- Compatibility: Integrates with leading robot brands, including Universal Robots, FANUC, and ABB; supports leading industrial protocols, such as Ethernet/IP, IO-Link, and MQTT.
- Secured and connected: Supports pull-based, over-the-air upgrades with Wi-Fi and cellular connectivity, with an ISO27001- and NIST-800-certified cloud platform.
- Comprehensive software suite: Powered by the Vention Manufacturing Automation Platform, which includes MachineLogic, MachineCloud, MachineAnalytics, RemoteView, and Remote Support.
MachineMotion AI is available for pre-order and will be delivered in early 2025.
At Demo Day 2024, Vention also previewed an automated sanding application—powered by NVIDIA AI—that uses vision AI and is planned to be productized in 2025. In addition, Vention plans to integrate NVIDIA Isaac Sim, a reference application built on the NVIDIA Omniverse platform, and the NVIDIA Isaac Manipulator reference workflow to release tools that enable customers to easily deploy AI in their robotics applications.
STRADVISION
STRADVISION, a trailblazer in the automotive vision technology sector, is proud to announce a significant milestone in its production journey, surpassing 834,000 units in mass production during the first half of 2024. This outstanding achievement marks a 79% year-over-year growth, highlighting the company’s consistent expansion and the escalating demand for its groundbreaking vision perception technology, SVNet.
With over 2.65 million vehicles on the road equipped with STRADVISION’s state-of-the-art SVNet technology, the company continues to set the industry benchmark for ultra-lightweight and highly efficient deep learning-based solutions. This cumulative milestone underscores the global adoption of STRADVISION’s advanced driver-assistance systems.
“We are thrilled to announce that our production units have exceeded 834,000 in the first half of this year, a testament to our relentless pursuit of excellence and innovation,” said Philip Vidal, CBO of STRADVISION. “The 79% year-over-year growth reinforces our dedication to meeting our customers’ evolving needs and strengthens our leadership in the automotive vision technology market.”
Looking ahead, STRADVISION is on track to achieve over 1.84 million units in mass production by the end of 2024. This ambitious target is supported by the company’s strong portfolio of project wins, with 15 new projects secured so far this year. These successes emphasize STRADVISION’s strategic growth and its continued success in forging partnerships with leading automotive OEMs and Tier 1 suppliers worldwide.
As STRADVISION continues to drive innovation in the automotive industry, it remains focused on expanding its footprint in key markets. The company’s deep learning-based vision perception technology is not only enhancing vehicle safety and driving convenience but is also playing a pivotal role in shaping the future of autonomous driving.