Robust Sensing, Learning, and Autonomy for Real-World Robots

The Laboratory for Intelligent Sensing and Computing (LISC) studies how autonomous systems can sense, understand, and act safely in the real world, especially under adverse weather, unstructured terrain, and incomplete information. We work at the interface of signal processing, machine learning, and robotics, with a particular emphasis on high-resolution mmWave radar as a key enabling modality for all-weather, all-terrain operation. Using full-scale platforms including a Lexus RX450h autonomous vehicle and an AgileX Bunker Pro UGV, we develop end-to-end systems that span sensing, perception, SLAM, and embodied autonomy.


Research Thrusts

1. High-Resolution Sensing and 4D Perception

Robust autonomy begins with reliable sensing. We design algorithms and systems for high-resolution 4D perception that combine mmWave MIMO radar, LiDAR, and cameras to recover rich range–Doppler–angle–elevation information in challenging environments. Our work includes sparse and non-uniform array design, beamspace processing, super-resolution imaging, and joint estimation of spatial and motion parameters. While our methods apply broadly across sensing modalities, mmWave radar offers a unique advantage in rain, fog, dust, and cluttered scenes.
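
As a concrete illustration, the sketch below forms a range–Doppler–azimuth–elevation power cube from raw FMCW MIMO radar samples with a cascade of FFTs. It assumes a conventional processing chain; the array sizes, window, and function names are illustrative placeholders, not the configuration of our radars.

    # Minimal sketch of a conventional FMCW MIMO radar processing chain
    # (illustrative dimensions; windowing and calibration are simplified).
    import numpy as np

    def radar_cube_to_4d(adc, n_range=256, n_doppler=64, n_az=32, n_el=8):
        """adc: complex samples shaped (samples, chirps, az_channels, el_channels)."""
        # Range FFT over fast-time samples
        rng = np.fft.fft(adc * np.hanning(adc.shape[0])[:, None, None, None],
                         n=n_range, axis=0)
        # Doppler FFT over slow-time chirps
        dop = np.fft.fftshift(np.fft.fft(rng, n=n_doppler, axis=1), axes=1)
        # Azimuth / elevation FFTs over the virtual-array channels
        az = np.fft.fftshift(np.fft.fft(dop, n=n_az, axis=2), axes=2)
        cube = np.fft.fftshift(np.fft.fft(az, n=n_el, axis=3), axes=3)
        return np.abs(cube)  # range x Doppler x azimuth x elevation power cube

    # Toy example with random data standing in for real ADC samples
    cube = radar_cube_to_4d(np.random.randn(256, 64, 12, 4)
                            + 1j * np.random.randn(256, 64, 12, 4))
    print(cube.shape)  # (256, 64, 32, 8)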

2. Physics-Guided Machine Learning for Sensing and Perception

Many key problems in sensing and perception are governed by well-understood physics but must operate under uncertainty and data limitations. We develop physics-guided machine learning methods that integrate electromagnetic and geometric models with modern deep networks. This includes algorithm-unrolled networks for radar imaging, hybrid model-based and data-driven architectures for interference mitigation and ghost-target suppression, diffusion models for denoising and artifact removal, and neural field representations of 4D scenes. Our goal is to build interpretable, robust, and generalizable perception pipelines that leverage physical structure rather than treat sensing as a black box.
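
To make the unrolling idea concrete, the sketch below writes classical ISTA iterations for sparse scene recovery as a fixed stack of "layers." In an algorithm-unrolled network the per-layer step size and threshold, fixed to illustrative values here, would be learned from data, and the measurement matrix would come from the radar's physical model; this is a minimal sketch, not our published architecture.

    # Minimal sketch of algorithm unrolling: ISTA iterations for sparse radar
    # imaging written as fixed "layers" whose step sizes and thresholds would
    # be learned in a full implementation (values here are illustrative).
    import numpy as np

    def soft_threshold(x, tau):
        """Complex soft-thresholding (proximal operator of the l1 norm)."""
        mag = np.abs(x)
        return np.where(mag > tau, (1 - tau / np.maximum(mag, 1e-12)) * x, 0)

    def unrolled_ista(y, A, n_layers=10):
        """Recover a sparse scene x from measurements y = A @ x + noise."""
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # per-layer step size (learnable)
        tau = 0.1                                 # per-layer threshold (learnable)
        x = np.zeros(A.shape[1], dtype=complex)
        for _ in range(n_layers):                 # each iteration = one layer
            x = soft_threshold(x + step * A.conj().T @ (y - A @ x), tau)
        return x

    # Toy example: 64 measurements of a 256-bin scene with 5 scatterers
    A = (np.random.randn(64, 256) + 1j * np.random.randn(64, 256)) / np.sqrt(64)
    x_true = np.zeros(256, dtype=complex)
    x_true[np.random.choice(256, 5)] = 1.0
    x_hat = unrolled_ista(A @ x_true + 0.01 * np.random.randn(64), A)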

3. Robust Localization, SLAM, and Spatial Intelligence

Autonomy in complex environments requires more than instantaneous perception—it demands spatial intelligence over time. We study radar-centric and multimodal SLAM, odometry, and localization to enable operation in GPS-denied, adverse-weather, and off-road settings. Our work includes IMU-free radar odometry, elevation-aware SLAM on uneven terrain, building-geometry-guided height estimation, and long-term mapping under multipath and environmental change. These methods aim to make localization reliable not only on highways and flat roads, but also in rural, agricultural, and disaster-response scenarios.
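
As an illustration of the IMU-free idea, the sketch below estimates instantaneous ego-velocity from a single radar scan: each static detection's Doppler (radial) velocity satisfies v_r = -d · v_ego for unit direction d, so v_ego follows from a least-squares fit. The example is a toy simulation under that standard model; a full odometry pipeline would add outlier rejection (e.g., RANSAC for moving targets) and scan-to-scan registration.

    # Minimal sketch of Doppler-based ego-velocity estimation, the building
    # block of IMU-free radar odometry (toy simulation, illustrative only).
    import numpy as np

    def estimate_ego_velocity(directions, radial_velocities):
        """directions: (N, 3) unit vectors to detections; radial_velocities: (N,)."""
        # Solve -D v = v_r in the least-squares sense
        v_ego, *_ = np.linalg.lstsq(-directions, radial_velocities, rcond=None)
        return v_ego

    # Simulate Doppler readings for a known ego velocity
    rng = np.random.default_rng(0)
    d = rng.normal(size=(200, 3))
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    v_true = np.array([5.0, 0.5, 0.0])            # m/s
    v_r = -d @ v_true + 0.05 * rng.normal(size=200)
    print(estimate_ego_velocity(d, v_r))          # approximately [5.0, 0.5, 0.0]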

4. Multimodal Fusion, Large-Model Reasoning, and Embodied Autonomy

Modern autonomous systems must reason over heterogeneous data and make safe decisions in uncertain environments. We explore multimodal sensor fusion across radar, LiDAR, cameras, and GNSS/IMU, as well as the use of large models for language-driven reasoning and high-level decision support. Current projects include radar-conditioned bird’s-eye-view (BEV) perception, language-guided interpretation of multimodal scenes, and “conscious” radar–AI agents that connect physical measurements to semantic understanding and action. Our ambition is to develop embodied autonomous systems that are robust, explainable, and deployable across transportation, agriculture, and environmental resilience missions.
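
As a simple illustration of the radar-conditioned BEV idea, the sketch below rasterizes a radar point cloud into a bird's-eye-view grid that a downstream detection network could consume. The grid extent, resolution, and per-cell feature (maximum RCS) are illustrative assumptions rather than the configuration of our pipeline.

    # Minimal sketch of rasterizing radar detections into a BEV feature grid
    # (grid extent, resolution, and feature choice are illustrative).
    import numpy as np

    def radar_points_to_bev(points, rcs, x_range=(0, 80), y_range=(-40, 40), res=0.5):
        """points: (N, 2) x/y positions in meters; rcs: (N,) reflectivity values."""
        nx = int((x_range[1] - x_range[0]) / res)
        ny = int((y_range[1] - y_range[0]) / res)
        bev = np.zeros((nx, ny), dtype=np.float32)
        ix = ((points[:, 0] - x_range[0]) / res).astype(int)
        iy = ((points[:, 1] - y_range[0]) / res).astype(int)
        valid = (ix >= 0) & (ix < nx) & (iy >= 0) & (iy < ny)
        np.maximum.at(bev, (ix[valid], iy[valid]), rcs[valid])  # max RCS per cell
        return bev

    # Toy example with random detections
    pts = np.random.uniform([0, -40], [80, 40], size=(500, 2))
    bev = radar_points_to_bev(pts, np.random.rand(500))
    print(bev.shape)  # (160, 160)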


Future Directions

Looking ahead, our long-term vision is to leverage high-resolution sensing and physics-guided learning to create radar-powered intelligent agents that can perceive, reason, and act reliably in complex real-world environments. Three themes will guide our future work:

  • Conscious radar–AI agents: integrating 4D sensing with large-model-driven reasoning to enable autonomous systems that can interpret their surroundings, explain their decisions, and adapt to new conditions.
  • Radar-enabled world models: building long-horizon, multimodal world models that support persistent mapping, memory, and prediction in GPS-denied, all-weather, and off-road settings.
  • Scalable autonomy across platforms and domains: extending radar-centric autonomy from single vehicles to multi-robot teams and deploying these capabilities in transportation, agriculture, and environmental resilience applications.

Experimental Platforms

Lexus RX450h Autonomous Vehicle

LISC operates a 2021 Lexus RX450h hybrid SUV, equipped with high-resolution millimeter-wave automotive imaging radars, stereo cameras, LiDAR, RTK GNSS, and an IMU, as its vehicle platform for autonomous driving research.
  • 4D imaging radars (TI & Continental)
  • Velodyne VLP-32C LiDAR, FLIR stereo camera
  • NovAtel GNSS/IMU + NVIDIA GPU computing

AgileX Bunker Pro UGV

LISC operates an AgileX Bunker Pro unmanned ground vehicle (UGV) for off-road mobility research.
  • 4D radar + Ouster OS1-128 LiDAR + Hesai JT-128 LiDAR
  • Stereo camera + PwrPak7D GNSS/IMU
  • Nvidia Jetson AGX Orin

Recent Highlights

  • Bunker Pro field test
  • 4D radar odometry demonstration (IMU-free)

4D radar odometry of the Engineering Quad at The University of Alabama (left), shown alongside an aerial reference image (right). The red line is the estimated radar trajectory. The experiment was conducted using the Altos V2 4D radar, and the odometry reconstruction was generated without incorporating any inertial measurement unit (IMU) data.

  • Model-Based Neural Networks for High-Resolution Radar Imaging

R. Zheng, S. Sun, H. Liu, H. Chen and J. Li, ‘‘Model-based knowledge-driven learning approach for enhanced high-resolution automotive radar imaging,’’ IEEE Transactions on Radar Systems, vol. 3, pp. 709-723, 2025.

  • Physics-Guided Machine Learning for Perception

R. Zheng, S. Sun, H. Liu and T. Wu, ‘‘Deep neural networks-enabled vehicle detection using high-resolution automotive radar imaging,’’ IEEE Transactions on Aerospace and Electronic Systems, 2023.

Funding Support

Our research is supported by NSF CAREER and CRII awards, NOAA, and industry research contracts.

  • PI: University of Alabama (UA) and Ben-Gurion University (BGU) Joint Seed Funding Program, ‘‘Deep Radar SLAM Using High-Resolution Radar Imaging via Domain-Aware Machine Learning Models’’, 1/1/2026-12/31/2026, ($15K, total $30K, BGU PI: Dr. Igal Bilik)
  • PI: National Science Foundation, ‘‘CAREER: Towards Fundamentals of Adaptive, Collaborative and Intelligent Radar Sensing and Perception’’, 8/15/2024-7/31/2029, ($506K)
  • PI: National Science Foundation, ‘‘CRII: CIF: A Sparse Framework Based Automotive Radar Sensing for Autonomous Vehicles’’, 5/1/2022-4/30/2025, ($175K)
  • PI: NXP Semiconductors, ‘‘Advanced Automotive Radar Signal Processing Research’’, 9/1/2020-12/31/2025, ($760K)
  • PI: Spartan Radar, ‘‘High Resolution Imaging Radar Research’’, 1/1/2022-12/31/2024, ($178K)
  • PI: MathWorks, ‘‘Sparse Array Design for 4D Automotive Radar’’, 1/1/2022-12/31/2023, ($73K)
  • PI: The Alabama Transportation Institute (ATI), ‘‘Laboratory for Intelligent Sensing and Computing’’, 10/1/2021-9/30/2022, ($50K)
  • PI: The Alabama Transportation Institute (ATI), Graduate Student Support for 2021-2022 Academic Year, 8/16/2021-5/15/2022, ($35K)
  • Co-PI: Camgian Microsystems, Inc., ‘‘Machine Learning for Object Classification and Localization with Counter Measures’’, 4/1/2021-8/31/2022, ($138K, total $600K, PI: Dr. Kenneth Ricks)
  • Co-PI: NOAA/UCAR, U.S. Dept. of Commerce, ‘‘Center for Remote Sensing of Snow and Soil Moisture’’, 9/1/2019-8/30/2020, ($250K, total $5M, PI: Dr. Prasad Gogineni)

Sponsors