Autonomous & Self-Driving Vehicle News: Waymo, Hyundai, Ford, Cruise, AMCI Testing, Helm.ai, Beep, Hesai & Sony

In autonomous and self-driving vehicle news are Waymo, Hyundai, Ford, Cruise, AMCI Testing, Helm.ai, Beep, Hesai and Sony.

Waymo to Deploy IONIQ 5

Hyundai Motor Company and Waymo announced they have entered into a multi-year, strategic partnership. In the first phase of this partnership, the companies will integrate Waymo’s sixth-generation fully autonomous technology – the Waymo Driver – into Hyundai’s all-electric IONIQ 5 SUV, which will be added to the Waymo One fleet over time.

The IONIQ 5 vehicles destined for the Waymo fleet will be assembled at the new Hyundai Motor Group Metaplant America (HMGMA) EV manufacturing facility in Georgia and then integrated with Waymo’s autonomous technology. The companies plan to produce a fleet of IONIQ 5s equipped with Waymo’s technology in significant volume over multiple years to support Waymo One’s growing scale. Initial on-road testing of Waymo-enabled IONIQ 5s will begin by late 2025, and the vehicles will become available to Waymo One riders in the years to follow.

“We are thrilled to partner with Hyundai as we further our mission to be the world’s most trusted driver,” said Tekedra Mawakana, co-CEO, Waymo. “Hyundai’s focus on sustainability and strong electric vehicle roadmap makes them a great partner for us as we bring our fully autonomous service to more riders in more places.”

“Hyundai and Waymo share a vision to improve the safety, efficiency and convenience of how people move,” said José Muñoz, president and global COO of Hyundai Motor Company, and president and CEO of Hyundai Motor North America. “Waymo’s transformational technology is improving road safety where they operate, and the IONIQ 5 is the ideal vehicle to scale this further. The team at our new manufacturing facility is ready to allocate a significant number of vehicles for the Waymo One fleet as it continues to expand. Importantly, this is the first step in the partnership between the two companies and we are actively exploring additional opportunities for collaboration.”

“We recently announced the launch of Hyundai Motor Company’s autonomous vehicle foundry business to provide global autonomous driving companies with vehicles capable of implementing SAE Level 4 or higher autonomous driving technology,” said Chang Song, President and Head of Hyundai Motor Group’s Advanced Vehicle Platform (AVP) Division. “There is no better partner for our first agreement in this initiative than industry-leader Waymo.”

The Hyundai IONIQ 5 will be delivered to Waymo with specific autonomous-ready modifications like redundant hardware and power doors. The award-winning, all-electric vehicle will enable long driving shifts on a single charge, and its 800-volt architecture will minimize time out of service with some of the industry’s fastest charging speeds available. The IONIQ 5’s well-appointed and spacious interior will offer plenty of legroom, headroom, and rear cargo space for a comfortable rider experience.

Waymo Rides in Austin

Waymo is starting to onboard riders in Austin from its interest list to try the fully autonomous ride-hail experience. Riders will travel across 37 square miles of the city as the company prepares for its commercial launch early next year – exclusively on the Uber app.

Ford Lowers BlueCruise Pricing

Ford is lowering the BlueCruise annual and monthly plan pricing for all U.S. customers with Ford BlueCruise-equipped vehicles and simplifying how new vehicle owners access hands-free highway driving, while continuing to offer flexible options.

BlueCruise will now be even simpler to activate on Ford vehicles with both a one-year plan and a one-time purchase option. The one-year plan will either be included standard or offered as an option based on the vehicle line and trim. Ford customers can choose to upgrade to a one-time purchase and won’t need to activate BlueCruise again on their vehicle.

In addition, Ford will continue to offer a 90-day complimentary trial to customers who do not select the one-year plan (if it is not included standard) or the one-time purchase at vehicle order. Customers will be able to experience hands-free highway driving for 90 days and then have the flexibility to activate annually or monthly at the end of the trial based on their needs. For example, a customer could activate the service for one month for a road trip and not activate it again for another year, or could choose to activate it only during the holiday travel season.

BlueCruise availability will expand across 2025 model year vehicles and trims to give more customers who want an electric, hybrid or gas-powered Ford vehicle access to hands-free highway driving. In the U.S., BlueCruise is now available on: Ford Explorer, Ford Expedition, Ford F-150, F-150 Lightning, and Mustang Mach-E.

The details:

  • For 2025 model year vehicles, the one-year plan will either be included standard or offered as an option at vehicle order for $495, based on the vehicle line and trim.
  • Starting on select 2025 model year vehicle lines, the one-time purchase will be available at vehicle order for $2,495 MSRP.
  • At the dealership, customers who purchase a new 2024 or 2025 BlueCruise-equipped vehicle will be eligible to upgrade to the one-time purchase if they choose.
  • If eligible, the one-year plan and one-time purchase can also be rolled into the financing as part of the new vehicle purchase.
  • The new pricing for all Ford owners with BlueCruise-equipped vehicles is $495 annually and $49.99 monthly, effective October 1, 2024. BlueCruise customers with annual and monthly plans will see the price drop reflected on their next payment.

This change follows the rollout of the BlueCruise flexible offering last year, as well as the inclusion of BlueCruise hardware as standard from the factory. Owners who love BlueCruise for their daily commutes and long road trips are choosing to buy it upfront, while others opt for the 90-day complimentary trial and then activate monthly or annually.
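Taken together, the stated prices imply a simple break-even point between the plans. A quick illustrative calculation (taxes, fees and promotions ignored; this is a rough sketch, not an official Ford comparison):

```python
# Illustrative break-even comparison of Ford's stated BlueCruise options.
# Prices are from the announcement above; taxes, fees and promotions
# are ignored, so this is a rough sketch, not an official comparison.

one_time = 2495     # one-time purchase, USD MSRP
annual = 495        # one-year plan, USD
monthly = 49.99     # monthly plan, USD

break_even_years = one_time / annual
print(f"One-time purchase breaks even after ~{break_even_years:.1f} annual plans")
print(f"Twelve monthly payments: ${monthly * 12:.2f} vs ${annual} for the annual plan")
```

At these prices the one-time purchase pays for itself after roughly five years of annual renewals, while paying monthly year-round costs about $105 more per year than the annual plan.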

BlueCruise can help customers arrive at their destination less fatigued and more energized, whether on their daily commute in stop-and-go traffic or taking a long road trip. Globally, there are more than 492,000 BlueCruise-equipped vehicles on the road. Customers in the U.S. and Canada have also spent over 3.5 million hours enjoying and using BlueCruise and have driven more than 244 million hands-free miles.

Cruise to Pay NHTSA $1.5 Million

The U.S. Department of Transportation’s National Highway Traffic Safety Administration (NHTSA) has issued an order requiring Cruise to pay $1.5 million for failing to file correct reports after a pedestrian was struck and dragged by a Cruise autonomous vehicle. To administratively resolve the issues, NHTSA and Cruise have mutually agreed to this Consent Order and settlement payment.

AMCI Testing Finds Tesla FSD Potentially Dangerous

AMCI Testing continues its extensive, 1,000-mile evaluation of the Tesla Full Self Driving (Supervised) system in advance of Tesla’s October 10th Robotaxi reveal event. Six newly filmed driving scenarios are available at the link below. As with previous videos, these continue to show that, although FSD can often drive the car competently for limited distances across a wide range of scenarios, the mistakes it does make continue to put occupants and the public at significant risk.

Beyond FSD’s actual performance in the instances shown, these potentially dangerous driving errors demonstrate the incontrovertible need for regulation and agreed upon metrics for comparable, system-to-system evaluation. One of the key metrics to define across all types of autonomous and semi-autonomous systems is what constitutes an “intervention” – which system actions warrant one, and how they should be scored by regulators whenever one occurs.

When AMCI Testing Intervenes

AMCI Testing’s protocol requires an intervention (the driver taking control and forcing disengagement of the system) whenever FSD’s actions put the occupants, the public or other motorists at risk. As stated in our previous release, FSD’s behavior required 75 interventions in 1000 miles of real-world testing, for an average forced system disengagement rate of 1 every 13 miles.

“The key consideration is whether any hands-free system can currently be ethically operated by consumers on public roads. To advance the safety of the industry, AMCI Testing has now articulated a standard. But the real question is: should we or the public be generating a standard, or is it the responsibility of a federal or state regulator to arrive at an ‘intervention’ standard?” asked David Stokols, CEO of parent company AMCI Global. “Further, if there are too many incidents, as we have seen in AMCI Testing’s results, then the public will lose confidence in all FSD and Robotaxi-type software solutions from any OEM.”

We view FSD’s evolving programming and unexpected changes between software versions as proof of the critical need for more specific regulation and oversight. An instructive example is “Autoland,” developed for airliners in the mid-1970s to allow zero-visibility operation with far fewer variables than occur on a public road. Certification required a failure probability of less than 1 in 150,000 per occurrence.

Arguably, the 1-in-150,000 goal is what we should be aiming for in road-based autonomous systems. “Extrapolate the failure rate AMCI Testing experienced in only 1,000 miles of driving with FSD (Supervised) and you can see the Tesla system is nowhere near that mark. Additionally, FSD does not appear to be on a progressive, problem-solving track. There are inexplicable performance regressions that sometimes occur as the software updates,” said Guy Mangiamele, Director of AMCI Testing.
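The arithmetic behind that extrapolation can be sketched quickly. Treating each driven mile as one "opportunity for failure" is only an illustrative assumption (the Autoland figure is per landing, not per mile), but it conveys the scale of the gap:

```python
# Rough comparison of AMCI Testing's observed FSD intervention rate
# against the 1-in-150,000 Autoland certification benchmark.
# Treating one driven mile as one "opportunity for failure" is an
# illustrative assumption, not an established equivalence.

interventions = 75       # forced disengagements observed by AMCI Testing
miles_driven = 1000      # total real-world test mileage

miles_per_intervention = miles_driven / interventions
observed_rate = interventions / miles_driven      # failures per mile
autoland_benchmark = 1 / 150_000                  # failure probability per occurrence

print(f"~1 intervention every {miles_per_intervention:.1f} miles")
print(f"Observed rate is {observed_rate / autoland_benchmark:,.0f}x the Autoland benchmark")
```

Under this (admittedly crude) per-mile framing, the observed rate sits four orders of magnitude above the certification threshold Autoland had to meet.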

“For example, during our testing on version 12.5.1, FSD would only command the car out of the left lane on the freeway and toward the exit ramp 0.2 miles before the ramp itself. This distance is already too short. Yet in the later 12.5.3 update, the distance was even further reduced to just 0.1 mile, regardless of the surrounding traffic. Not only does this unpredictable sort of behavior keep the user guessing about the system’s intentions, but it would make the system’s performance nearly impossible to assess within a Certification-program architecture.”

AMCI Testing has released the next three videos in its test series, intended to demonstrate the complex issues of trust and performance that FSD continues to pose to drivers and the public. All test videos released to date are available at https://amcitesting.com/tesla-fsd/. AMCI will continue to evaluate subsequent iterations of Tesla’s FSD as they become available; go to www.amcitesting.com to sign up for updates.

Helm.ai Launches VidGen-2

Helm.ai, a leading provider of advanced AI software for high-end ADAS, autonomous driving, and robotics automation, announced the launch of VidGen-2, its next-generation generative AI model for producing highly realistic driving video sequences. VidGen-2 offers 2X higher resolution than its predecessor, VidGen-1, improved realism at 30 frames per second, and multi-camera support with 2X increased resolution per camera, providing automakers with a scalable and cost-effective solution for autonomous driving development and validation.


Trained on thousands of hours of diverse driving footage using NVIDIA H100 Tensor Core GPUs, VidGen-2 leverages Helm.ai’s innovative generative deep neural network (DNN) architectures and Deep Teaching™, an efficient unsupervised training method. It generates highly realistic video sequences at 696 x 696 resolution, double that of VidGen-1, with frame rates ranging from 5 to 30 fps. The model also enhances 640 x 384 resolution video quality at 30 fps, delivering smoother and more detailed simulations. Videos can be generated by VidGen-2 without an input prompt or with a single image or input video as the prompt.

VidGen-2 also supports multi-camera views, generating footage from three cameras at 640 x 384 (VGA) resolution for each. The model ensures self-consistency across all camera perspectives, providing accurate simulation for various sensor configurations.
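For a sense of scale, the stated resolutions and frame rates imply the following raw pixel throughput for the two generation modes (an illustrative calculation based only on the figures quoted above):

```python
# Raw pixel-throughput estimate for VidGen-2's generation modes, using
# only the resolutions and frame rates stated in the announcement.

def pixels_per_second(width: int, height: int, fps: int, cameras: int = 1) -> int:
    """Total pixels generated per second across all camera streams."""
    return width * height * fps * cameras

single = pixels_per_second(696, 696, 30)             # single-camera mode
multi = pixels_per_second(640, 384, 30, cameras=3)   # three-camera mode

print(f"Single camera (696 x 696 @ 30 fps): {single / 1e6:.1f} MPix/s")
print(f"Three cameras (640 x 384 @ 30 fps each): {multi / 1e6:.1f} MPix/s")
```

The three-camera mode actually generates more total pixels per second than the higher-resolution single view, which is why cross-view self-consistency is the harder constraint.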

The model generates driving scene videos across multiple geographies, camera types, and vehicle perspectives. The model not only produces highly realistic appearances and temporally consistent object motion, but also learns and reproduces human-like driving behaviors, simulating the motions of the ego-vehicle and surrounding agents in accordance with traffic rules. It creates a wide range of scenarios, including highway and urban driving, multiple vehicle types, pedestrians, cyclists, intersections, turns, weather conditions, and lighting variations. In multi-camera mode, the scenes are generated consistently across all perspectives.

VidGen-2 gives automakers a significant scalability advantage over traditional non-AI simulators by enabling rapid asset generation and imbuing agents in simulations with sophisticated, real-life behaviors. Helm.ai’s approach not only reduces development time and cost but also closes the “sim-to-real” gap, offering a highly realistic and efficient solution that broadens the scope of simulation-based training and validation.

“The latest enhancements in VidGen-2 are designed to meet the complex needs of automakers developing autonomous driving technologies,” said Vladislav Voroninski, Helm.ai’s CEO and founder. “These advancements enable us to generate highly realistic driving scenarios while ensuring compatibility with a wide variety of automotive sensor stacks. The improvements made in VidGen-2 will also support advancements in our other foundation models, accelerating future developments across autonomous driving and robotics automation.”

Beep Inc. Launches C.A.B

Beep, Inc., a leading provider of autonomous shared mobility solutions, announced the public launch of C.A.B., or Campus Autonomous Bus, at Mississippi State University (MSU), marking the first-ever autonomous shuttle pilot program in Mississippi and in the Southeastern Conference (SEC).

Celebrated on Sept. 20 at a ribbon-cutting ceremony on MSU’s campus, the pilot follows weeks of testing and validation and is now open to all MSU students, faculty and guests. C.A.B. is currently scheduled to operate through the end of the year and will give MSU a chance to evaluate how autonomous transportation systems can be used on campus to diversify its existing fleet of transportation assets. MSU is also researching how electric and shared autonomous mobility can be used in rural-urban environments.

“MSU is a premier educational institution with a great transportation network, and as leaders in innovative transit, we are always on the lookout for new mobility technologies. This is why we are so excited to learn firsthand how Beep’s autonomous shuttles can provide augmented and extended transportation options to our students, faculty, and city,” said Jeremiah Dumas, MSU’s Executive Director of Transportation. “The data we gather from this pilot program will help us better understand riders’ perceptions of autonomous transportation, and how these solutions can provide convenient ways for students and faculty to get to their destination safely and efficiently.”

The C.A.B. pilot program consists of two electric autonomous Beep shuttles, with one operating at a time along a 2.4-mile route that includes five stops at key destinations: Old Main, Giles, College View, Cotton District, and Sanderson Center. The C.A.B. is scheduled to operate daily from 12:00 to 8:00 p.m., pending environmental conditions. Throughout the project, MSU will collect input from riders about the quality of the service, routes, ridership statistics and other data points.

“The launch of Mississippi’s first-ever autonomous shuttle project is a remarkable milestone that highlights the state’s leadership in advanced mobility technologies, and we are proud of how we were able to help make this a reality for MSU and the state,” said Beep’s Chief Revenue Officer Toby McGraw. “As a renowned organization with several leading technology programs, MSU is the perfect location to test and provide advanced autonomous mobility solutions for its students and faculty. We are confident this pilot program will show how autonomous mobility can augment existing transportation systems and the overall benefits of shared mobility solutions on college campuses.”

The two Beep shuttles can carry 10 seated and secured passengers, plus an onboard attendant who provides passengers with information about the pilot program and assists with passenger safety. All shuttles are ADA-compliant and feature a manually deployable ramp operated by the attendant. While in operation, the shuttles are monitored by the Beep Command Center at the company’s headquarters in the Lake Nona community of Orlando, Florida.

Hesai Tech Partners with SAIC Volkswagen

Hesai Technology (Nasdaq: HSAI), the global leader in lidar technology, announced the signing of a cooperation framework agreement with SAIC Volkswagen.

The cooperation will establish a strong foundation for Hesai and SAIC Volkswagen to build a new ecosystem for the automotive industry, with Hesai’s advanced lidar products and technology at the core.

SAIC Volkswagen Automobile Co., Ltd. is a Sino-German joint venture operated by SAIC Group and Volkswagen Group. Formed in October 1984, SAIC Volkswagen was one of the first automotive joint ventures in China and has maintained strong market competitiveness over the past 40 years. SAIC Volkswagen sold 1.2 million cars in 2023, and cumulative sales from January to August of this year were nearly 700,000.

Hesai Technology is the global leader in lidar products and technology. It integrates the manufacturing process of lidar into the R&D design process, ensuring high performance, high reliability and cost-effectiveness while promoting rapid product iteration. Hesai is also the industry leader in global patents and has accumulated more than 1,700 patents and patent applications worldwide to date.

According to the Lidar for Automotive 2024 report by Yole Intelligence, an international market research and strategy consulting firm, Hesai is ranked No. 1 in the robotic car lidar market globally, with a 74% market share. Among the top 10 autonomous driving companies in the world, nine use Hesai’s 360° high-performance lidar. As of the second quarter of 2024, Hesai has delivered over 470,000 units to customers.

Sony Semi Releases ISX038 CMOS

Sony Semiconductor Solutions Corporation (SSS) today announced the upcoming release of the ISX038 CMOS image sensor for automotive cameras, the industry’s first*1 product that can simultaneously process and output RAW*2 and YUV*3 images.

The new sensor incorporates a proprietary ISP*4 and can process and output RAW and YUV images simultaneously. RAW images are required for external environment detection and recognition in advanced driver-assistance systems (ADAS) and autonomous driving (AD) systems, while YUV images serve infotainment applications such as drive recorders and augmented reality (AR).

By expanding the applications a single camera can offer, the new product helps simplify automotive camera systems and saves space, cost, and power.

*1   Among CMOS sensors for automotive cameras. According to SSS research (as of announcement on October 4, 2024).
*2   Image for recognition on a computer.
*3   Image for driver visual such as recording or displaying on a monitor.
*4   Image signal processor – a circuit for image processing.

Model name: ISX038 1/1.7-type (9.30 mm diagonal) 8.39-effective-megapixel*5 CMOS image sensor

Sample shipment date (planned): October 2024

Sample price (including tax): ¥15,000*6

*5   Based on the image sensor effective pixel specification method.
*6   May vary depending on the volume shipped and other conditions.

The roles of automotive cameras continue to diversify in line with advances in ADAS and AD and increasing needs and requirements pertaining to the driver experience. On the other hand, there is limited space for installing such cameras, making it impossible to continue adding more indefinitely, which in turn has created a demand to do more with a single camera.

The ISX038 is the industry’s first*1 CMOS image sensor for automotive cameras that can simultaneously process and output RAW and YUV images. It uses a stacked structure consisting of a pixel chip and a logic chip with a signal-processing circuit, with SSS’ proprietary ISP on the logic chip. This design allows a single camera to provide both high-precision detection and recognition of the environment outside the vehicle and visual information for driver-facing infotainment applications. Compared with conventional approaches, such as a multi-camera system or a system that outputs RAW and YUV images via an external ISP, the new product helps simplify automotive camera systems, saving space, cost, and power.

ISX038 will offer compatibility with the EyeQ™6 System-on-a-Chip (SoC) currently offered by Mobileye, for use in ADAS and AD technology.

Main Features

  • Industry’s first*1 sensor capable of processing and outputting RAW and YUV images simultaneously
    The new sensor is equipped with dedicated ISPs for RAW and YUV images and is capable of outputting two types of images simultaneously with image quality optimized for each application on two independent interfaces. Expanding the applications a single camera can offer helps build systems that save space, costs, and power compared to multi-camera systems or systems with an external ISP.
  • Wide dynamic range even during simultaneous use of HDR and LED flicker mitigation
    In automobile driving, objects must be precisely detected and recognized even in road environments with significant differences in brightness, such as tunnel entrances and exits. Automotive cameras are also required to suppress LED flicker, even while in HDR mode, to deal with the increasing prevalence of LED signals and other traffic devices. The proprietary pixel structure and unique exposure method of this product improves saturation illuminance, yielding a wide dynamic range of 106 dB even when simultaneously employing HDR and LED flicker mitigation (when using dynamic range priority mode, the range is even wider, at 130 dB). This design also helps reduce motion artifacts*7 generated when capturing moving subjects.

*7   Noise generated when capturing moving subjects with HDR.

  • Compatibility with conventional products*8
    This product maintains compatibility with SSS’ conventional products,*8 which have already built a proven track record in ADAS and AD applications with multiple automobile manufacturers. The new product makes it possible to reuse data assets collected with previous products, such as driving data from automotive cameras, helping streamline ADAS and AD development for automobile manufacturers and partners.

*8 SSS’ IMX728 1/1.7 type 8.39 effective megapixel CMOS image sensor.

  • Compliant with standards required for automotive applications
    The product will be qualified under the AEC-Q100 Grade 2 automotive electronic component reliability tests by the time of mass production. In addition, SSS has introduced a development process compliant with the ISO 26262 road vehicle functional safety standard, at automotive safety integrity level ASIL-B(D). This contributes to improved automotive camera system reliability.
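To put the quoted dynamic-range figures in perspective, decibels can be converted to a linear brightness ratio using the 20·log10 convention common for image-sensor dynamic range (the convention EMVA 1288 uses):

```python
# Convert the ISX038's quoted dynamic-range figures from decibels to a
# linear max/min brightness ratio, using the 20*log10 convention common
# for image-sensor dynamic range (as in EMVA 1288).

def db_to_ratio(db: float) -> float:
    """Linear signal ratio corresponding to a dynamic range in dB."""
    return 10 ** (db / 20)

print(f"106 dB -> roughly {db_to_ratio(106):,.0f}:1")   # HDR + LED flicker mitigation
print(f"130 dB -> roughly {db_to_ratio(130):,.0f}:1")   # dynamic range priority mode
```

That is, 106 dB corresponds to a scene-contrast ratio of about 200,000:1, and 130 dB to roughly 3,000,000:1, which is what lets the sensor hold detail across extremes like tunnel entrances and exits.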

Key Specifications

Model name: ISX038

Effective pixels: 3,857 × 2,177 (H × V), approx. 8.39 megapixels

Image size: diagonal 9.30 mm (1/1.72-type)

Unit cell size: 2.1 μm × 2.1 μm (H × V)

Frame rate (all pixels): 30 fps (RAW & YUV dual output)

Sensitivity (standard value F5.6, 1/30 second cumulative): 880 mV (green pixel)

Dynamic range (EMVA 1288 standard): 106 dB (with LED flicker mitigation) / 130 dB (dynamic range priority)

Interface: MIPI CSI-2 serial output (single port with 4 lanes / dual port with 2 lanes per port)

Package: 192-pin BGA

Package size: 11.85 mm × 8.60 mm (H × V)