2019 IEEE International Conference on Real-time Computing and Robotics (RCAR) | 2019

Extrinsic Calibration of 3D Range Finder and Camera without Auxiliary Object or Human Intervention


Abstract


Fusing heterogeneous exteroceptive sensors is an efficient and effective way to represent the environment precisely, as each modality compensates for the defects of the others. The rigid transformation (i.e., the extrinsic parameters) between the sensors must be known before multisensor information can be fused accurately. Researchers have proposed several approaches to estimate these extrinsic parameters; however, they require either auxiliary objects, such as chessboards, or human assistance to select correspondences. In this paper, we propose a novel approach for the extrinsic calibration of range and image sensors that requires neither auxiliary objects nor human intervention. We first estimate the initial extrinsic parameters from the individual motions of the 3D range finder and the camera. We then extract lines in the image and point-cloud pairs and match them using the initial extrinsic parameters. Finally, we refine the extrinsic parameters using the line-feature associations.
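The first step, estimating initial extrinsics from per-sensor motion, is a hand-eye-style problem (AX = XB on the rotation part): the rotation axis of each camera relative motion should equal the extrinsic rotation applied to the axis of the corresponding range-finder motion. The abstract does not give the authors' solver, so the following is only an illustrative sketch of one standard way to recover the rotation, aligning the two sets of motion axes with a Kabsch/SVD fit; the function names are ours, not the paper's.

```python
import numpy as np

def rotation_axis(R):
    # Unnormalized rotation axis from the skew-symmetric part of R;
    # valid for rotation angles strictly between 0 and pi.
    return np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]])

def estimate_extrinsic_rotation(cam_rots, lidar_rots):
    """Estimate the extrinsic rotation R with a_i ~ R b_i, where a_i, b_i
    are the rotation axes of corresponding camera / range-finder relative
    motions (rotation part of hand-eye AX = XB), via Kabsch alignment."""
    A = np.stack([rotation_axis(Rc) for Rc in cam_rots])    # camera axes
    B = np.stack([rotation_axis(Rl) for Rl in lidar_rots])  # lidar axes
    A /= np.linalg.norm(A, axis=1, keepdims=True)
    B /= np.linalg.norm(B, axis=1, keepdims=True)
    # Maximize sum_i a_i . (R b_i)  <=>  maximize tr(R H), H = B^T A.
    H = B.T @ A
    U, _, Vt = np.linalg.svd(H)
    # Determinant correction keeps the result a proper rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T
```

At least two motion pairs with non-parallel rotation axes are needed for the rotation to be observable; the translation part (and any remaining drift) would then be handled by the line-feature refinement stage described in the abstract.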

Pages 42-47
DOI 10.1109/RCAR47638.2019.9044146
Language English
Journal 2019 IEEE International Conference on Real-time Computing and Robotics (RCAR)
