2021 Twelfth International Conference on Ubiquitous and Future Networks (ICUFN)

A Sensor Fusion System with Thermal Infrared Camera and LiDAR for Autonomous Vehicles: Its Calibration and Application


Abstract


Vision, radar, and LiDAR sensors are widely used for autonomous vehicle perception. In particular, object detection and classification depend primarily on vision sensors. However, under poor lighting, dazzling sunlight, or bad weather, objects can be difficult to identify with conventional vision sensors. In this paper, we propose a sensor fusion system with a thermal infrared camera and a LiDAR sensor that can reliably detect and identify objects even in environments where visibility is poor, such as severe glare, fog, or smoke. The proposed method first obtains the intrinsic parameters by calibrating the thermal infrared camera and the LiDAR sensor. An extrinsic calibration algorithm between the two sensors is then applied to obtain the extrinsic parameters (rotation and translation matrices) using 3D calibration targets. Experiments show that the proposed system and algorithm can reliably detect and identify objects under harsh visibility conditions, such as severe glare from direct sunlight or headlights, and in low-visibility environments, such as dense fog or smoke.
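The abstract describes fusing the two sensors through intrinsic parameters of the thermal camera and extrinsic parameters (rotation R, translation t) between the LiDAR and the camera. The paper does not include code; the following is only a minimal sketch, assuming a standard pinhole projection model, of how calibrated parameters could be used to project LiDAR points onto the thermal image. All numeric values and the function name are illustrative, not the authors' implementation.

import numpy as np

def project_lidar_to_thermal(points_lidar, K, R, t):
    """Project Nx3 LiDAR points (LiDAR frame) onto the thermal image plane.

    K : 3x3 intrinsic matrix of the thermal camera (from intrinsic calibration)
    R : 3x3 rotation, t : length-3 translation (from extrinsic calibration)
    Returns Nx2 pixel coordinates and the per-point depth in the camera frame.
    """
    # Transform LiDAR points into the thermal camera frame: X_cam = R @ X_lidar + t
    pts_cam = (R @ points_lidar.T + t.reshape(3, 1)).T

    # Keep only points in front of the camera (positive depth)
    pts_cam = pts_cam[pts_cam[:, 2] > 0]

    # Pinhole perspective projection: [u, v, 1]^T ~ K @ (X/Z, Y/Z, 1)^T
    uv_h = (K @ (pts_cam / pts_cam[:, 2:3]).T).T
    return uv_h[:, :2], pts_cam[:, 2]

# Illustrative (hypothetical) calibration values, not from the paper
K = np.array([[400.0,   0.0, 320.0],
              [  0.0, 400.0, 256.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                      # placeholder rotation
t = np.array([0.1, -0.05, 0.0])    # placeholder translation (meters)
points = np.random.rand(100, 3) * np.array([10, 10, 1]) + np.array([-5, -5, 1])
pixels, depths = project_lidar_to_thermal(points, K, R, t)

In a fused pipeline of this kind, the projected pixel coordinates let each LiDAR return be associated with thermal image regions, so detections remain available when glare, fog, or smoke degrade ordinary vision sensors.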

Pages 361-365
DOI 10.1109/ICUFN49451.2021.9528609
Language English
Journal 2021 Twelfth International Conference on Ubiquitous and Future Networks (ICUFN)
