Motion Parallax Estimation for Ultra Low Altitude Obstacle Avoidance


Abstract


By making full use of the curvature of the earth and ground clutter, the ultra-low altitude flight technique can significantly increase the difficulty of detecting and intercepting unmanned aerial vehicles (UAVs) and improve their penetration probability. However, unpredictable static and dynamic obstacles seriously threaten the flight safety of ultra-low altitude UAVs. Fine-grained autonomous spatial perception is therefore essential for ultra-low altitude UAVs, whose payload and computing resources are limited, to avoid obstacles. To address the difficult problem of spatial perception for ultra-low altitude obstacle avoidance, this paper proposes a motion parallax estimation algorithm that combines deep learning with epipolar geometry. A deep neural network constructs an end-to-end mapping from image sequences to per-pixel depth values for spatial perception. At the same time, to overcome the interference caused by the image consistency assumption, the loss function is constructed from the 3D geometric information of the scene rather than the re-projection error between images. Finally, the proposed algorithm is evaluated on the KITTI dataset and an infrared image dataset to verify its effectiveness and generalization.
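
The geometry-based loss described above can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch example, not the authors' implementation: it back-projects two predicted depth maps into point clouds using assumed camera intrinsics K and a relative pose (R, t), transforms frame 1's points into frame 2's camera, and penalizes the 3D distance to the points reconstructed from frame 2's depth, instead of comparing image intensities through re-projection. All function names, tensor shapes, and hyperparameters here are assumptions for illustration only.

# Sketch of a loss built on 3D scene geometry rather than photometric
# re-projection error. Assumes K, R are (B, 3, 3) and t is (B, 3).
import torch
import torch.nn.functional as F

def backproject(depth, K_inv):
    # Lift a depth map (B, 1, H, W) to camera-frame 3D points (B, 3, H, W).
    b, _, h, w = depth.shape
    ys, xs = torch.meshgrid(
        torch.arange(h, dtype=depth.dtype, device=depth.device),
        torch.arange(w, dtype=depth.dtype, device=depth.device),
        indexing="ij",
    )
    pix = torch.stack([xs, ys, torch.ones_like(xs)], dim=0).reshape(1, 3, -1)
    rays = K_inv @ pix                          # viewing rays, (B, 3, H*W)
    pts = rays * depth.reshape(b, 1, -1)        # scale each ray by its depth
    return pts.reshape(b, 3, h, w)

def geometric_consistency_loss(depth1, depth2, K, R, t):
    # Penalize the 3D distance between frame 1 points transformed into frame 2
    # and the points reconstructed from frame 2's predicted depth.
    b, _, h, w = depth1.shape
    K_inv = torch.inverse(K)

    pts1 = backproject(depth1, K_inv)
    pts1_in_2 = R @ pts1.reshape(b, 3, -1) + t.reshape(b, 3, 1)   # rigid transform

    # Project transformed points into frame 2 to locate corresponding pixels.
    proj = K @ pts1_in_2
    z = proj[:, 2:3].clamp(min=1e-6)
    uv = proj[:, :2] / z
    grid = torch.stack(
        [2.0 * uv[:, 0] / (w - 1) - 1.0, 2.0 * uv[:, 1] / (h - 1) - 1.0], dim=-1
    ).reshape(b, h, w, 2)                       # normalized coords for grid_sample

    pts2 = backproject(depth2, K_inv)
    pts2_sampled = F.grid_sample(pts2, grid, align_corners=True)

    valid = (grid.abs().amax(dim=-1) <= 1.0).unsqueeze(1).to(depth1.dtype)
    diff = (pts1_in_2.reshape(b, 3, h, w) - pts2_sampled).norm(dim=1, keepdim=True)
    return (diff * valid).sum() / valid.sum().clamp(min=1.0)

In this sketch the supervision signal lives entirely in 3D space, so it does not rely on the image consistency assumption that brightness is preserved between frames, which is the property the abstract highlights for the proposed loss.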

Pages 463-468
DOI 10.1109/ICUS48101.2019.8995948
Language English
Journal 2019 IEEE International Conference on Unmanned Systems (ICUS)
