Proceedings of SPIE | 2021

2D/3D deep registration for real-time prostate biopsy navigation


Abstract


The accuracy of biopsy sampling and the related tumor localization are major issues for prostate cancer diagnosis and therapy. However, the ability to navigate accurately to biopsy targets faces several difficulties arising from both the properties of transrectal ultrasound (TRUS) image guidance and prostate motion or deformation. To reduce inaccuracy and exam duration, the main objective of this study is to develop real-time navigation assistance that provides the current probe position and orientation with respect to the deformable organ and the next biopsy targets. We propose a real-time deep learning 2D/3D registration method based on Convolutional Neural Networks (CNN) to localize the current 2D US image relative to the available 3D TRUS reference volume. We experiment with several scenarios combining different input data, including the pair of successive 2D US images, the optical flow between them, and the current probe tracking information. The main novelty of our study is to consider prior navigation trajectory information by introducing the previous registration result. The model is evaluated on clinical data through simulated biopsy trajectories. The results highlight a significant improvement when exploiting trajectory information, especially through prior registration results and probe tracking parameters. With such trajectory information, we achieve an average registration error of 2.21 ± 2.89 mm. The network demonstrates efficient generalization to new patients and new trajectories, which is promising for successful continuous tracking during the biopsy procedure.
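
To make the input combination described above concrete, the sketch below shows one possible way such a CNN could fuse the successive 2D US image pair, their optical flow, the probe tracking parameters, and the previous registration result to regress the slice pose. The abstract does not specify the architecture, layer sizes, or pose parameterization; the class name Deep2D3DRegNet, the 6-parameter rigid pose, and all dimensions here are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class Deep2D3DRegNet(nn.Module):
    """Hypothetical sketch: regress the pose (3 translations, 3 rotations)
    of the current 2D US slice within the 3D TRUS reference volume from
    the current/previous image pair, their optical flow, the probe
    tracking parameters, and the previous registration result."""
    def __init__(self, pose_dim=6, tracking_dim=6):
        super().__init__()
        # Image branch: 4 input channels = current image, previous image,
        # and the 2 optical-flow components between them.
        self.encoder = nn.Sequential(
            nn.Conv2d(4, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Trajectory branch: probe tracking parameters + previous pose.
        self.traj = nn.Sequential(
            nn.Linear(tracking_dim + pose_dim, 64), nn.ReLU(),
        )
        # Fusion head regressing the pose parameters.
        self.head = nn.Sequential(
            nn.Linear(128 + 64, 128), nn.ReLU(),
            nn.Linear(128, pose_dim),
        )

    def forward(self, img_pair, flow, tracking, prev_pose):
        x = torch.cat([img_pair, flow], dim=1)        # (B, 4, H, W)
        feat = self.encoder(x).flatten(1)             # (B, 128)
        traj = self.traj(torch.cat([tracking, prev_pose], dim=1))
        return self.head(torch.cat([feat, traj], dim=1))  # predicted pose


# Example forward pass on dummy tensors (batch of 2, 128x128 slices).
model = Deep2D3DRegNet()
pose = model(torch.randn(2, 2, 128, 128), torch.randn(2, 2, 128, 128),
             torch.randn(2, 6), torch.randn(2, 6))
print(pose.shape)  # torch.Size([2, 6])
```

In a continuous-tracking loop, the predicted pose for one frame would be fed back as prev_pose for the next, which is the trajectory-prior mechanism the abstract credits for the accuracy gain.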

Volume 11598
DOI 10.1117/12.2579874
Language English
Journal Proceedings of SPIE
