Autonomous & Self-Driving Vehicle News: NYC, Waymo, TIER IV & AEye

In autonomous and self-driving vehicle news: NYC, Waymo, TIER IV, and AEye.

Robotaxis with Safety Drivers Coming to New York City

New York City Mayor Eric Adams and New York City Department of Transportation (DOT) Commissioner Ydanis Rodriguez released robust safety requirements for responsible and safe autonomous vehicle (AV) testing in New York City and announced the opening of applications for a new permit program. DOT has established a rigorous permitting program to ensure approved applicants are ready to test their technology safely and proficiently in the country’s most challenging urban environment. Consistent with state law, a trained safety driver will still be required to sit behind the wheel, ready to take control of an AV-enabled vehicle at all times.

“New York City leads the nation in responsible innovation, and we’re continuing to do so with this new autonomous vehicle program,” said Mayor Adams. “Our streets are vibrant and energetic — and that’s a great thing, but it also means that we need to have strong guardrails and requirements in place on any sort of autonomous vehicles. That’s why we’re implementing these rigorous permit requirements and requiring close collaboration with DOT and our emergency responders, so we can ensure that autonomous vehicle technology works for New York City. This technology is coming whether we like it or not, so we’re going to make sure that we get it right. If we do, our streets can be safer, and our air could be cleaner.”  

“Autonomous vehicles aren’t just coming, they’re here — they’ve been successfully operating across the country for years. We are doing our due diligence to get ahead of the AV revolution, and ensure that if AVs are coming, they do so within a framework that benefits New Yorkers, and creates training and good, upwardly mobile jobs in the autonomous industry,” said Deputy Mayor for Operations Meera Joshi.  “It’s been the story for too long that government can’t keep up with private enterprise. No longer. With careful regulation, we believe that they have the potential to benefit a city as complex as New York.”    

“As autonomous vehicle technology expands across the country, DOT is deploying a robust new permitting process to ensure safe, responsible testing on our city’s streets,” said DOT Commissioner Rodriguez. “Driver error and distraction play an all-too-common role in traffic crashes, and autonomous vehicle technology offers the potential to improve traffic safety. We look forward to working closely with AV companies that are serious about safely operating on America’s most challenging street network.”

The city’s approach prioritizes safety and accountability in AV testing. As part of the approval process, applicants must submit information on their previous testing experience and technological capabilities, a detailed testing plan for New York City, and a safety plan describing key elements that will contribute to the safe operation of their technology on city streets.  

Companies must also follow industry best practices related to the recruitment and training of the safety drivers who must be present in the vehicle during testing. Additionally, applicants must obtain approval from the New York State Department of Motor Vehicles before starting an on-street testing program. The intent of this program is to create a roadmap for a collaborative future of AV testing and potential deployment in New York City, one in which AV companies work closely with the city to support a vision for a safe, sustainable, equitable, and efficient transportation system for all.

As part of the city’s safety protocols, applicants must provide details on how their test operators are selected and trained, and attest that they will follow recent best practices from the Society of Automotive Engineers. These practices include background checks for test operators, appropriate training on the vehicle systems they will be testing, and working conditions and frequent breaks that will keep operators focused on the job and free from distraction. Applicants must also certify that the vehicles will follow all traffic laws and curb regulations, and include safety assurance protocols for how the operator will compensate for any AV system limitation or failure and proactively intervene to avoid potential crashes.

Applicants will be required to coordinate closely with DOT through regular meetings and data reporting, as well as to engage with the New York City Police Department and the Fire Department of the City of New York on how their autonomous vehicles interact with emergency vehicles. Companies must also certify that they are adhering to industry best practices related to cybersecurity. 

Applicants will additionally be required to provide a detailed overview of the level of automation and safety performance of their AV technology, including previous testing and crash histories of their AV technology. Further, applicants will have to provide a list of all current or past permits to test their technology in any local, state, or foreign government agency.  

DOT will meet regularly with program participants in the lead-up to and during testing to assess testing plans, identify potential challenges, and monitor compliance with the terms of the permit. Participants in the testing program will be required to report testing data on a regular basis, including incidents in which test drivers take over control from the AV technology.

“Waymo has had a longstanding relationship with New York City, where we have previously manually driven and tested for winter weather,” said Michelle Peacock, global head of public policy, Waymo. “We commend this important step, as well as the city’s ongoing commitment to drive innovation and deliver transportation improvements for New Yorkers. As the world’s leading autonomous vehicle ride hailing company, Waymo looks forward to continuing our partnership with Mayor Adams and his administration as we continue to safely bring our technologies to more cities and communities across the country.” 

TIER IV Picked by JAXA

TIER IV, a pioneer in open-source autonomous driving (AD) technology, is proud to announce its selection in the Japan Aerospace Exploration Agency (JAXA) Space Exploration Innovation Hub’s 11th “Open Innovation Hub for Expanding Humanosphere and Domain of Human Activity through Solar System Frontier Development” Request for Proposal (RFP). This marks a pivotal step in utilizing high-quality sensor simulation technology with neural radiance fields (NeRF)*1 to enhance large-scale AD simulation environments.

Research overview

This research aims to create high-quality and large-scale digital twins to simulate AD environments, capitalizing on the groundbreaking NeRF world model. The project is set to develop a neural simulator capable of pre-constructing NeRF models from camera and LiDAR sensor data from autonomous vehicles, paving the way for realistic and cost-effective sensor simulations. The goal is to create an AD simulation environment adaptable both on Earth and in space.
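For a concrete sense of how a NeRF-based simulator produces realistic sensor data, the sketch below shows the standard NeRF volume-rendering step: densities and colors predicted by a trained network are composited along each camera ray. This is a minimal, generic illustration (NumPy, with invented function names), not TIER IV's implementation; note how the same weights that render a camera pixel also yield an expected depth, which is what lets one model serve both camera and LiDAR simulation.

```python
import numpy as np

def render_ray(sigmas, colors, deltas):
    """Composite per-sample densities and colors along one camera ray
    using the standard NeRF volume-rendering quadrature.

    sigmas: (N,) volume densities predicted at N samples along the ray
    colors: (N, 3) RGB values predicted at those samples
    deltas: (N,) spacing between adjacent samples
    """
    # Per-sample opacity: alpha_i = 1 - exp(-sigma_i * delta_i)
    alphas = 1.0 - np.exp(-sigmas * deltas)
    # Transmittance: probability the ray reaches sample i unoccluded
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))
    weights = alphas * trans
    rgb = (weights[:, None] * colors).sum(axis=0)  # simulated camera pixel
    depth = (weights * np.cumsum(deltas)).sum()    # expected range (lidar-like return)
    return rgb, depth
```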

Future outlook

Scheduled to kick off in April 2024, this research underscores TIER IV’s role as a tech company firmly committed to reimagining intelligent vehicles through innovation. The company is poised to amplify its efforts to advance AD technology and foster the realization of a sustainable society with each breakthrough.

Insights into this approach

Technological background

Since 2020, TIER IV has partnered with the Matsuo Institute to pioneer the development of AI technology for autonomous driving, conducting foundational development of world-model components. World models can be used to approximate complex systems based on observations. By applying these models to autonomous vehicles, the companies aim to predict future conditions from current sensor observations and to plan and learn driving behavior based on those predictions.
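As a rough, self-contained illustration of that predict-and-plan loop, the sketch below rolls a toy world model forward in "imagination" over candidate action sequences and keeps the best-scoring one. All names and dynamics here are invented stand-ins, not the companies' actual model.

```python
import numpy as np

def encode(obs):
    """Stand-in encoder: map a raw sensor observation to a latent state.
    (A learned network in a real world model.)"""
    return np.tanh(obs)

def predict(latent, action):
    """Stand-in dynamics: predict the next latent state given an action.
    (Learned from driving logs in a real world model.)"""
    return 0.9 * latent + 0.1 * action

def plan(obs, candidate_plans, reward):
    """Score an imagined rollout of each candidate action sequence and
    return the sequence whose predicted future looks best."""
    latent0 = encode(obs)
    best_plan, best_score = None, -np.inf
    for actions in candidate_plans:
        latent, score = latent0, 0.0
        for action in actions:  # roll the model forward in imagination
            latent = predict(latent, action)
            score += reward(latent)
        if score > best_score:
            best_plan, best_score = actions, score
    return best_plan

# Example: choose between "hold course" and "steer right" action sequences.
obs = np.array([0.5, -0.2])
plans = [np.zeros((5, 2)), np.tile([0.3, 0.0], (5, 1))]
best = plan(obs, plans, reward=lambda z: -np.abs(z).sum())
```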

World model overview

Through this collaboration, TIER IV and the Matsuo Institute aim to drive the development of a scalable world model, utilizing multimodal data from autonomous vehicles, such as sensor, mapping, and scenario data. The collaboration will also incorporate recent research involving large language models (LLMs) and NeRF, and the companies will build their own world model with their know-how and resources. The goal is to use the world model for prediction and driving planning in AD systems, including robotaxis, as well as for safety evaluations and AI model training.

“World models have the potential to solve various challenges in autonomous driving, including robotaxis,” said Shinpei Kato, founder, CEO and CTO of TIER IV. “By leveraging TIER IV’s strengths in the development of AD technology in cooperation with the research institute and the educational institution, we will work to develop the technology and implement it in society. As the company leading the development of Autoware*2, we are committed to openly sharing this cutting-edge technology to create a foundation for more partners to develop technology not only for Earth but also for space.”

“The Matsuo Institute has been promoting the societal implementation of advanced technologies created in academia through joint research with the private sector, aiming to create a spiral of innovation through industry-academia collaboration,” commented Tofuku Kawakami, the president and representative director of the Matsuo Institute. “The purpose of the joint project with TIER IV is to realize autonomous driving by researching and developing the latest recognition technology using deep learning and world models, which have attracted a lot of attention recently. We aim to implement advanced technologies at high speed and promote initiatives to maximize their impact on society.”

“World models are a technological approach to modeling human intelligence in relation to the external environment, and companies in many countries have begun paying attention to and utilizing the technology,” remarked Prof. Yutaka Matsuo of the University of Tokyo’s Matsuo Laboratory. “I believe various services will become smart, intelligent and autonomous as world models get implemented in society in the future. I look forward to TIER IV making everyday life more comfortable through autonomous driving.”

*1 A technology that uses multiple photographs taken from various angles to reconstruct 3D scenes and generate images from new perspectives.
*2 Autoware is a registered trademark of the Autoware Foundation.

AEye Intros Apollo

AEye, Inc. (NASDAQ: LIDR), a global leader in adaptive, high-performance lidar solutions, announced Apollo, the first product in AEye’s 4Sight™ Flex next-generation family of lidar sensors. Apollo delivers best-in-class range and resolution in a small, power-efficient, low-cost form factor, enabling both automotive and non-automotive applications. For L2+, L3, and L4 applications, the new sensor supports integration behind the windshield, on the roof, or in the grille, enabling OEMs to implement critical safety features with minimal impact on vehicle design. Apollo is believed to be the only 1550 nm high-performance lidar capable of behind-the-windshield integration.

AEye CEO Matt Fisch said, “We are pleased to announce the first product from the 4Sight Flex family a mere four months after unveiling the initial reference design. With Apollo, we are able to demonstrate the true advantage of our ultra-long-range lidar delivered in an incredibly compact form factor. This achievement underscores the scalability of our software-defined architecture.”

Apollo supports up to a 120° horizontal and 30° vertical field of view, with long-range detection of up to 325 meters at 10% reflectivity and up to 6.2 million points per second (PPS). Within a region of interest, Apollo also provides horizontal and vertical resolutions as fine as 0.025°.
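A quick back-of-envelope check (ours, not AEye's) shows how those figures trade off: at 0.025° across the full field of view, a single frame would require roughly 5.76 million points, so the 6.2 million PPS budget buys about one such frame per second. That is why the finest resolution is quoted within a region of interest, where an adaptive sensor can concentrate its point budget at useful frame rates.

```python
# Back-of-envelope check on the published Apollo figures (illustrative only).
h_fov, v_fov = 120.0, 30.0   # field of view, degrees
res = 0.025                  # finest resolution, degrees (quoted within an ROI)
pps = 6.2e6                  # points per second

points_per_frame = (h_fov / res) * (v_fov / res)  # 4800 x 1200 = 5.76M points
fps_at_full_fov = pps / points_per_frame          # ~1.08 frames per second

print(f"{points_per_frame:,.0f} points/frame -> {fps_at_full_fov:.2f} fps at full FOV")
```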

Apollo leverages proven components and supply chain partners, resulting in minimized technical risk, maximized supply chain readiness, and a very competitive price for performance. As part of the 4Sight Flex family, Apollo runs on AEye’s 4Sight Intelligent Sensing Platform. This platform delivers a highly programmable lidar solution that can be customized easily for each application and can be reconfigured through software, including over-the-air updates.
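To make "reconfigured through software" concrete, here is a purely hypothetical sensing profile of the kind a software-defined lidar could swap in per application or push over the air. The field names below are invented for illustration and are not AEye's published interface.

```python
# Hypothetical sensing profile for a software-defined lidar (invented field
# names for illustration; not AEye's actual configuration API).
highway_profile = {
    "frame_rate_hz": 10,
    "field_of_view_deg": {"horizontal": 120, "vertical": 30},
    "regions_of_interest": [
        # Concentrate the finest resolution on the road ahead.
        {"h_span_deg": 20, "v_span_deg": 6, "resolution_deg": 0.025},
    ],
}
# Swapping in a different profile (e.g., a wide, lower-resolution urban scan)
# would change sensor behavior without touching the hardware.
```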

Apollo samples are expected to be available for customer demonstrations in June 2024.