UniCalib: Targetless LiDAR-camera Calibration via Probabilistic Flow on Unified Depth Representations
Shu Han · Xubo Zhu · Ji Wu · Ximeng Cai · Wen Yang · Huai Yu · Gui-Song Xia
Abstract
Online targetless extrinsic LiDAR-camera calibration is essential for robust perception in computer vision applications such as autonomous driving. However, existing methods struggle with the significant modality gap between heterogeneous sensors and fail to handle unreliable correspondences caused by real-world challenges such as occlusions and dynamic objects. To address these issues, we introduce UniCalib, a novel method that performs calibration by estimating a probabilistic flow on unified depth representations. UniCalib first bridges the modality gap by converting both the camera image and the sparse LiDAR points into unified, dense depth maps, enabling a single encoder to learn consistent features across both modalities. It then learns a probabilistic flow field that captures correspondence uncertainty to improve robustness. This probabilistic formulation is reinforced by a reliability map and a perceptually weighted sparse flow loss, which guide the model to suppress the influence of unreliable regions. Experiments on three datasets validate the accuracy and generalization of UniCalib. In particular, it achieves a mean translation error of $0.550\,\mathrm{cm}$ and a mean rotation error of $0.044^\circ$ on the KITTI dataset.
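To make the probabilistic-flow idea concrete, below is a minimal, hypothetical PyTorch sketch of a flow head that predicts a per-pixel flow mean, a log-variance, and a reliability map, trained with a Gaussian negative log-likelihood on sparse ground-truth correspondences. This is an illustrative sketch, not the paper's implementation: the names `ProbabilisticFlowHead` and `weighted_sparse_flow_loss` are invented, the reliability-weighted NLL is one standard way to realize uncertainty-aware weighting, and the paper's perceptual weighting term is not modeled here.

```python
import torch
import torch.nn as nn


class ProbabilisticFlowHead(nn.Module):
    """Predicts per-pixel 2D flow, its log-variance, and a reliability map.

    Hypothetical sketch; channel layout and architecture are assumptions.
    """

    def __init__(self, in_channels: int = 64):
        super().__init__()
        # 2 channels for flow (u, v), 2 for log-variance, 1 for reliability logit.
        self.head = nn.Conv2d(in_channels, 5, kernel_size=3, padding=1)

    def forward(self, features: torch.Tensor):
        out = self.head(features)
        flow = out[:, 0:2]                         # (B, 2, H, W) flow mean
        log_var = out[:, 2:4]                      # (B, 2, H, W) per-pixel uncertainty
        reliability = torch.sigmoid(out[:, 4:5])   # (B, 1, H, W), in [0, 1]
        return flow, log_var, reliability


def weighted_sparse_flow_loss(flow, log_var, reliability, gt_flow, valid_mask):
    """Gaussian NLL on sparse valid pixels, down-weighted by predicted reliability.

    valid_mask marks pixels with a sparse ground-truth correspondence
    (e.g. projected LiDAR points); low-reliability regions contribute less.
    """
    # Per-channel Gaussian NLL: 0.5 * (residual^2 / sigma^2 + log sigma^2).
    nll = 0.5 * (torch.exp(-log_var) * (flow - gt_flow) ** 2 + log_var)
    nll = nll.sum(dim=1, keepdim=True)             # (B, 1, H, W)
    # The -log(reliability) term penalizes trivially predicting zero reliability.
    loss = reliability * nll - torch.log(reliability + 1e-6)
    return (loss * valid_mask).sum() / valid_mask.sum().clamp(min=1)
```

Given encoder features of shape `(B, 64, H, W)` and a sparse validity mask over projected LiDAR points, the head and loss compose directly; the reliability map learns to down-weight occluded or dynamic regions because assigning them high reliability incurs a large NLL, while assigning zero reliability everywhere is blocked by the log penalty.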