Multi-Modal Fusion for Estimation in Perceptually Degraded Conditions
Abstract
For autonomous robots to transition from controlled laboratory settings to unpredictable real-world environments, an essential component is resiliency against perceptual degradation [1]. These degradations manifest in numerous forms: cameras suffer from lighting variations and obscurants; LiDAR-based methods, while robust to lighting, struggle with geometric self-similarity and fog; radars can be hampered by multipath effects. Every exteroceptive sensing modality has its own set of degradations that may either cause it to fail or render methods relying on it ineffective. Despite significant advances in sensor technology and algorithmic robustness, no single exteroceptive sensor appears feasible that can handle all environmental challenges. We therefore turn to the incorporation of multiple complementary modalities to achieve resiliency against perceptual degradation, allowing autonomous systems to maintain operational capacity even when individual sensing channels deteriorate.
This thesis presents five original contributions towards developing resiliency in perceptually degraded conditions, ordered chronologically. Chapter 2 presents the robotic system-of-systems approach of Team CERBERUS, which won the Final Event of the DARPA Subterranean Challenge in 2021. The chapter explains in depth the development and deployment of the aerial, legged, and wheeled robots that successfully navigated the unknown underground environments of the Final Event. It discusses in detail the perceptual challenges faced in such environments and sets the stage for sensor combinations that can address them.
Chapter 3 presents a loosely coupled, factor-graph-based approach to multi-modal fusion for resilient estimation. By leveraging the flexibility and extensibility of the factor graph data structure, we incorporate constraints from multiple odometry sources alongside point clouds to provide an odometry that remains resilient as long as at least one modality stays functional. The presented method is evaluated on cases of geometric degeneracy and sensor dropout and functions reliably in both.
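To make the loosely coupled fusion concrete, the following is a minimal sketch of how two odometry sources can constrain the same pose pair in a factor graph, here using the GTSAM Python bindings. The measurement values, noise magnitudes, and modality names are illustrative assumptions, not the configuration used in the thesis.

```python
import numpy as np
import gtsam

# Minimal sketch: fuse two odometry sources as between-factors in one graph.
graph = gtsam.NonlinearFactorGraph()
values = gtsam.Values()

# A prior on the first pose anchors the graph.
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 1e-3))
graph.add(gtsam.PriorFactorPose3(0, gtsam.Pose3(), prior_noise))
values.insert(0, gtsam.Pose3())

# Hypothetical relative-pose measurements from two odometry sources.
lidar_delta = gtsam.Pose3(gtsam.Rot3(), np.array([1.00, 0.00, 0.0]))
visual_delta = gtsam.Pose3(gtsam.Rot3(), np.array([0.98, 0.02, 0.0]))
lidar_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 0.05))
visual_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 0.10))

# Both modalities constrain the same pose pair; if one source drops out,
# the remaining factor still keeps the new state constrained.
graph.add(gtsam.BetweenFactorPose3(0, 1, lidar_delta, lidar_noise))
graph.add(gtsam.BetweenFactorPose3(0, 1, visual_delta, visual_noise))
values.insert(1, lidar_delta)

result = gtsam.LevenbergMarquardtOptimizer(graph, values).optimize()
print(result.atPose3(1))
```

Because each modality enters as its own factor, adding or removing a sensor amounts to adding or omitting factors rather than restructuring the estimator, which is what gives the factor graph its extensibility.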
Chapter 4 develops the method further, moving towards a tight fusion of LiDAR, radar, and IMU. The complementarity of the LiDAR-radar combination is demonstrated by the approach functioning well through a prolonged geometric degeneracy as well as in extremely dense fog. The tight fusion also removes the requirement for explicit degeneracy detection, as the velocity information from the radar naturally integrates into position information along the geometrically degenerate directions.
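As a worked sketch of why the radar makes explicit degeneracy detection unnecessary, consider the standard Doppler-velocity model under a static-scene assumption; the notation below is illustrative rather than the exact formulation of Chapter 4. Each radar return yields a radial speed along its line of sight, from which a body-frame ego velocity follows in least squares, and integrating the rotated estimate constrains translation even along directions where scan registration is unobservable:

```latex
% Return i: unit line-of-sight direction d_i, measured radial speed v_{r,i}.
v_{r,i} = -\mathbf{d}_i^{\top}\mathbf{v},
\qquad
\hat{\mathbf{v}} = \arg\min_{\mathbf{v}} \sum_{i=1}^{N}
  \bigl(v_{r,i} + \mathbf{d}_i^{\top}\mathbf{v}\bigr)^{2}
% Integrating the rotated velocity estimate constrains translation along
% directions in which LiDAR registration provides no information:
\mathbf{p}_{k+1} = \mathbf{p}_{k} + \mathbf{R}_{k}\,\hat{\mathbf{v}}\,\Delta t
```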
To enable the system of Chapter 4, Chapter 5 proposes a real-time, low-cost synchronization module that provides highly accurate triggering and timestamping of sensor data. In addition, it synchronizes the onboard computer with the sensor data via the IEEE 1588 standard, ensuring that all operations on the robot share a common time axis.
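As background on the IEEE 1588 mechanism, the sketch below shows the standard two-way exchange arithmetic by which a slave clock estimates its offset from the master; the function name and the nanosecond timestamps are illustrative, and a symmetric network path is assumed.

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """IEEE 1588 two-way exchange:
    t1: master sends Sync, t2: slave receives it,
    t3: slave sends Delay_Req, t4: master receives it.
    Assumes the path delay is the same in both directions."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0  # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2.0   # one-way path delay
    return offset, delay

# Example with hypothetical nanosecond timestamps:
off, dly = ptp_offset_and_delay(1_000, 1_550, 2_000, 2_450)
print(off, dly)  # offset = 50.0 ns, delay = 500.0 ns
```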
Finally, Chapter 6 presents a method that demonstrates resourcefulness by better utilizing the information already available from the LiDAR. In particular, the chapter presents an approach that leverages the intensity channel of modern LiDARs to generate constraints that aid estimation in geometrically degenerate environments.
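A minimal sketch of the idea of an intensity-based constraint is given below; the helper names (project, intensity_img) are hypothetical, nearest-neighbour pixel lookup stands in for proper interpolation, and this is not the formulation of the submitted paper.

```python
import numpy as np

def intensity_residuals(intensity_img, project, T, points, intensities):
    """Hypothetical photometric residual: transform each LiDAR point by the
    candidate pose T (4x4), project it into an intensity image built from a
    reference scan, and compare the sampled intensity with the measured one.
    Minimizing such residuals constrains pose components that purely
    geometric registration may leave unobservable."""
    residuals = []
    for p, i_meas in zip(points, intensities):
        p_ref = T[:3, :3] @ p + T[:3, 3]   # point in the reference frame
        u, v = project(p_ref)              # assumed 3D-to-pixel projection
        residuals.append(float(intensity_img[int(v), int(u)]) - i_meas)
    return np.asarray(residuals)
```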
Has parts
Paper 1: M. Tranzatto, M. Dharmadhikari, L. Bernreiter, M. Camurri, S. Khattak, F. Mascarich, P. Pfreundschuh, D. Wisth, S. Zimmermann, M. Kulkarni, V. Reijgwart, B. Casseau, T. Homberger, P. De Petris, L. Ott, W. Tubby, G. Waibel, H. Nguyen, C. Cadena, R. Buchanan, L. Wellhausen, N. Khedekar, O. Andersson, L. Zhang, T. Miki, T. Dang, M. Mattamala, M. Montenegro, K. Meyer, X. Wu, A. Briod, M.W. Mueller, M.F. Fallon, R.Y. Siegwart, M. Hutter, K. Alexis, "Team CERBERUS Wins the DARPA Subterranean Challenge: Technical Overview and Lessons Learned," Field Robotics, vol. 4, 2024. This is an open-access article distributed under the terms of the Creative Commons Attribution License. Available at: https://doi.org/10.55417/fr.2024009. This paper is presented as Chapter 2 in the thesis.
Paper 2: N. Khedekar, M. Kulkarni, K. Alexis, "MIMOSA: A Multi-Modal SLAM Framework for Resilient Autonomy against Sensor Degradation," 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2022, pp. 7153-7159. ISBN 978-1-6654-7927-1. Copyright ©2022 IEEE. Available at: http://dx.doi.org/10.1109/IROS47612.2022.9981108. This paper is presented as Chapter 3 in the thesis.
Paper 3: M. Nissov, N. Khedekar, K. Alexis, "Degradation Resilient LiDAR-Radar-Inertial Odometry," 2024 IEEE International Conference on Robotics and Automation (ICRA), 2024, pp. 8587-8594. Copyright ©2024 IEEE. Available at: http://dx.doi.org/10.1109/ICRA57147.2024.10611444. This paper is presented as Chapter 4 in the thesis.
Paper 4: M. Nissov, N. Khedekar, K. Alexis, "Simultaneous Triggering and Synchronization of Sensors and Onboard Computers," ICRA@40, 2024-09-23 to 2024-09-26. Published by IEEE. This paper is presented as Chapter 5 in the thesis.
Paper 5: N. Khedekar, K. Alexis, "PG-LIO: Photometric-Geometric Fusion for Robust LiDAR-Inertial Odometry," submitted to IEEE Robotics and Automation Letters, 2025. This paper is presented as Chapter 6 in the thesis. It is submitted for publication and is therefore not included.