Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups

publication date

  • October 2022

start page

  • 17677

end page

  • 17689

issue

  • 10

volume

  • 23

International Standard Serial Number (ISSN)

  • 1558-0016

abstract

  • Most sensor setups for onboard autonomous perception are composed of LiDARs and vision systems, as they provide complementary information that improves the reliability of the different algorithms necessary to obtain a robust scene understanding. However, the effective use of information from different sources requires an accurate calibration between the sensors involved, which usually implies a tedious and burdensome process. We present a method to calibrate the extrinsic parameters of any pair of sensors involving LiDARs, monocular or stereo cameras, of the same or different modalities. The procedure is composed of two stages: first, reference points belonging to a custom calibration target are extracted from the data provided by the sensors to be calibrated, and second, the optimal rigid transformation is found through the registration of both point sets. The proposed approach can handle devices with very different resolutions and poses, as usually found in vehicle setups. In order to assess the performance of the proposed method, a novel evaluation suite built on top of a popular simulation framework is introduced. Experiments on the synthetic environment show that our calibration algorithm significantly outperforms existing methods, whereas real data tests corroborate the results obtained in the evaluation suite. Open-source code is available at https://github.com/beltransen/velo2cam_calibration.
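    The second stage described in the abstract — finding the optimal rigid transformation that registers the two sets of reference points — is commonly solved in closed form with an SVD-based (Kabsch) alignment. The sketch below is an illustrative minimal version of that generic technique, not the authors' implementation; the point values and the `rigid_registration` helper are hypothetical.

    ```python
    import numpy as np

    def rigid_registration(src, dst):
        """Closed-form rigid alignment (Kabsch): R, t minimizing ||R @ src_i + t - dst_i||."""
        # Centroids of each point set
        mu_src = src.mean(axis=0)
        mu_dst = dst.mean(axis=0)
        # Cross-covariance of the centered point sets
        H = (src - mu_src).T @ (dst - mu_dst)
        U, _, Vt = np.linalg.svd(H)
        # Reflection correction keeps R a proper rotation (det = +1)
        d = np.sign(np.linalg.det(Vt.T @ U.T))
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = mu_dst - R @ mu_src
        return R, t

    # Hypothetical example: four reference points in the camera frame
    # (e.g., the centers of the holes in a calibration target)
    pts_cam = np.array([[0.0, 0.0, 1.0],
                        [0.5, 0.0, 1.0],
                        [0.5, 0.5, 1.0],
                        [0.0, 0.5, 1.0]])
    # Simulate the LiDAR view by applying a known rigid transform
    angle = np.deg2rad(10.0)
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                       [np.sin(angle),  np.cos(angle), 0.0],
                       [0.0,            0.0,           1.0]])
    t_true = np.array([0.2, -0.1, 0.05])
    pts_lidar = pts_cam @ R_true.T + t_true

    R, t = rigid_registration(pts_cam, pts_lidar)
    print(np.allclose(R, R_true), np.allclose(t, t_true))  # → True True
    ```

    With exact correspondences the closed-form solution recovers the ground-truth extrinsics to machine precision; in practice the recovered transform is only as good as the reference points extracted in the first stage, which is why the paper focuses on robust target-point extraction across sensors with very different resolutions and poses.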

subjects

  • Robotics and Industrial Informatics

keywords

  • automatic calibration; extrinsic parameters; lidar; monocular cameras; stereo cameras