Abstract
Safety remains a critical challenge for transportation systems, particularly in environmental conditions where traditional sensors like cameras and LiDAR fail due to fog, rain, snow, or poor lighting. This proposal addresses DOT's safety grand challenge by developing a robust RF-based perception system that enables reliable vehicle sensing in all weather conditions. Our key insight is that mmWave radar and vision sensors have complementary strengths: while cameras offer superior spatial resolution for detailed scene understanding, radar provides unmatched robustness in adverse weather along with precise depth measurements. Through our next-generation RF imaging system design, our goal is to achieve the best of both worlds: the high spatial resolution traditionally available only with optical sensors and the all-weather robustness of RF sensing. Our RF imaging system achieves this by rotating a compact mmWave radar to create a synthetic aperture, precisely tracking the platform's motion to maintain coherent imaging, and leveraging machine learning to enhance resolution and recover fine scene details. This approach achieves LiDAR-comparable 3D sensing accuracy while being inherently robust to environmental conditions. Unlike existing automotive radar systems that provide only crude point clouds, our high-resolution RF imaging delivers detailed 3D scene reconstruction with high-fidelity object detection. Through this work, we will develop and evaluate an integrated perception system that enhances vehicle safety through: 1) high-resolution 3D RF imaging with quantified uncertainty, 2) robust object detection and semantic understanding, and 3) real-time operation for automotive deployment. Our preliminary results demonstrate successful sensing and object detection through fog where traditional sensors fail.
Field testing will be conducted at selected urban intersections and highway segments where adverse weather conditions frequently impact traffic safety, allowing us to validate the system's performance in real-world scenarios. By focusing on challenging weather conditions where traditional sensors struggle, we aim to demonstrate a robust perception solution that can significantly enhance vehicle safety year-round.
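To make the synthetic-aperture principle above concrete, the sketch below is our own simplified illustration (not the proposed system's actual pipeline): a radar antenna rotating on a small circular arm records complex returns from a point scatterer, and coherent backprojection over a 2-D grid recovers the scatterer's position. All parameters (77 GHz carrier, 10 cm aperture radius, single-frequency returns) are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: circular-aperture radar imaging via backprojection.
c = 3e8                    # speed of light (m/s)
fc = 77e9                  # typical automotive mmWave carrier (Hz), assumed
lam = c / fc               # wavelength (~3.9 mm)

# Antenna positions on a 10 cm radius circular synthetic aperture
n_pos = 256
ang = np.linspace(0, 2 * np.pi, n_pos, endpoint=False)
ant = np.stack([0.1 * np.cos(ang), 0.1 * np.sin(ang)], axis=1)

# Single point scatterer in the scene
target = np.array([1.0, 0.5])

# Ideal single-frequency returns: two-way phase = -4*pi*R/lambda
r_t = np.linalg.norm(ant - target, axis=1)
echo = np.exp(-1j * 4 * np.pi * r_t / lam)

# Coherent backprojection over a coarse grid around the target
xs = np.linspace(0.8, 1.2, 81)
ys = np.linspace(0.3, 0.7, 81)
img = np.zeros((len(ys), len(xs)))
for iy, y in enumerate(ys):
    for ix, x in enumerate(xs):
        r = np.linalg.norm(ant - np.array([x, y]), axis=1)
        # Undo the expected phase at each antenna pose and sum coherently
        img[iy, ix] = np.abs(np.sum(echo * np.exp(1j * 4 * np.pi * r / lam)))

peak = np.unravel_index(np.argmax(img), img.shape)
print(xs[peak[1]], ys[peak[0]])   # peak should land near (1.0, 0.5)
```

The same phase-matching logic is why precise motion tracking matters: any unmodeled antenna displacement on the order of a fraction of the millimeter-scale wavelength corrupts the coherent sum and blurs the image.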
Description
Timeline
Strategic Description / RD&T
Section left blank until USDOT’s new priorities and RD&T strategic goals are available in Spring 2026.
Deployment Plan
The proposed work will involve the following key deployment milestones:
Q1) The development of the high-resolution RF imaging system will be performed. We will implement the rotating radar platform with motion tracking capabilities and develop the core imaging algorithms. The design will be informed by simulation studies to understand anticipated system performance.
Q2) The integration of the real-time RF imaging system will be completed and initial prototype testing will be performed in controlled settings. The system's performance will be characterized in both clear and adverse weather conditions.
Q3) A detailed evaluation of the RF imaging system will be performed to validate performance metrics including spatial resolution, depth accuracy, and detection reliability. A comprehensive plan will be developed for vehicle deployment and safety validation.
Q4) The RF imaging system will be deployed on test vehicles for real-world validation in challenging weather conditions. System performance metrics will be thoroughly documented and analyzed to inform deployment guidelines for automotive safety applications. We will work with SEPTA to mount the system on trolleys and buses to provide blind spot detection.
Expected Outcomes/Impacts
The proposal directly addresses automotive safety by developing robust perception capabilities for adverse weather conditions. Our high-resolution RF imaging technology will establish a new paradigm for reliable sensing in fog, rain, and snow where traditional sensors fail. The proposed work will create a foundation for all-weather vehicular safety systems using low-cost, commercially available radar hardware.
The research will advance the state-of-the-art in RF imaging by demonstrating LiDAR-comparable resolution through mechanical rotation, precise motion tracking, and learning-based enhancement. This represents a significant leap beyond current automotive radar capabilities, bringing RF sensing resolution close to that of LiDAR while maintaining operation in all weather conditions.
The proposed work will shape future automotive safety requirements and influence sensor specifications for advanced driver assistance systems. It will demonstrate reliable perception in challenging conditions using commodity hardware and establish practical paths for widespread adoption of all-weather radar technology.
Expected Outputs
Research Publications: The proposed work will advance fundamental knowledge in high-resolution RF imaging and automotive safety systems. Research findings will be published in top transportation and sensing venues.
Open-Source Software: Core algorithms including motion estimation, uncertainty quantification, and 3D RF imaging will be released to accelerate adoption of robust perception technology. We will integrate the sensors with Autoware to demonstrate the system on an autonomous vehicle driving on roads.
Systematic Evaluation: Evaluation data will document system performance across a range of weather conditions, providing quantitative evidence of reliability compared to existing sensors.
Student Training: Graduate students will be trained in transportation safety systems and robust perception technology, contributing to the future workforce in autonomous vehicle development.
TRID
In this project we propose a fundamentally new approach to achieve high-resolution perception using automotive radar. Traditional automotive radars were designed primarily for collision warning or basic object detection, not high-resolution imaging. While the TRID has related projects using automotive radar for perception, they all work with conventional radar hardware that fundamentally lacks the resolution for detailed scene understanding. Our key innovation lies in transforming commodity automotive radar into a high-resolution imaging sensor through mechanical rotation and precise motion tracking, achieving imaging accuracy comparable to LiDAR while maintaining reliable operation in fog, rain, and snow. This represents the first demonstration of LiDAR-comparable resolution using radar hardware that costs orders of magnitude less than LiDAR. While some projects explore sensor fusion between radar and other sensors to compensate for radar's low resolution, none address the fundamental resolution limitations of radar itself. Our ability to achieve both high resolution and weather resilience by reimagining how automotive radar can be used represents a unique contribution to transportation safety technology.
Individuals Involved
| Email | Name | Affiliation | Role | Position |
| --- | --- | --- | --- | --- |
| rahulm@seas.upenn.edu | Mangharam, Rahul | University of Pennsylvania | PI | Faculty - Tenured |
| mingminz@seas.upenn.edu | Zhao, Mingmin | University of Pennsylvania | Co-PI | Faculty - Untenured, Tenure Track |
Budget
Amount of UTC Funds Awarded
$110,000.00
Total Project Budget (from all funding sources)
$210,000.00
Documents
| Type | Name | Uploaded |
| --- | --- | --- |
| Data Management Plan | UPenn_DMP.pdf | Nov. 22, 2024, 5:14 a.m. |
Match Sources
No match sources!
Partners
| Name | Type |
| --- | --- |
| The Autoware Foundation | Deployment Partner |
| SEPTA | Deployment Partner |