International journal of radiation oncology, biology, physics | 2021

Deep Learning Based Lymph Node Gross Tumor Volume Detection via Distance-Guided Gating Using CT and 18F-FDG PET in Esophageal Cancer Radiotherapy.


Abstract


PURPOSE/OBJECTIVE(S)
Identifying suspicious cancer-metastasized lymph nodes (GTV_LN) is an essential task in esophageal cancer radiotherapy. Manual delineation requires sophisticated clinical reasoning over multi-modality imaging; it is challenging and suffers from high subjectivity and inter-observer variability. We propose an automated esophageal GTV_LN detection method that integrates a distance-based decision stratification into a multi-task deep network, which can effectively learn GTV_LN oncology features. The design is inspired by the fact that lymph node involvement often follows certain distance patterns relative to the primary tumor.

MATERIALS/METHODS
We collected and curated 2 esophageal cancer datasets with PET and RTCT images. The 1st dataset contained 141 patients and 651 annotated GTV_LNs for training and evaluating our deep network (60%, 10%, and 30% for training, validation, and testing); the 2nd dataset contained 10 patients for multi-user agreement analysis. A distance-based decision stratification divides GTV_LNs into tumor-proximal and tumor-distal categories, and a multi-branch deep network is proposed to handle each of them. The multi-branch network has a shared encoder and separate decoders to detect and segment the two GTV_LN subcategories, respectively. The distance-based decision stratification enables the automated multi-branch method to specialize in the GTV_LN oncology features of each subcategory. We use both PET and RTCT as network inputs to exploit the complementary information in each imaging modality.

RESULTS
On the testing set of the 1st esophageal cancer dataset (34 patients with 138 GTV_LNs), our method achieves sensitivities of 73.8% and 79.1% at 6 and 12 false positives (FPs) per patient, compared with 72.0% and 75.9% by MULAN, the state-of-the-art deep learning based universal lesion detection algorithm. On the 2nd dataset containing 10 patients, 3 radiation oncologists annotated 53 GTV_LNs in total, of which 42 were agreed upon by all 3 physicians, yielding a 79.2% multi-user agreement. Our method achieves a sensitivity of 77.4% at 6 FPs per patient on the 2nd dataset, showing that the sensitivity of our automated deep learning approach can be close to the multi-user agreement.

CONCLUSION
We proposed a multi-task deep learning method with distance-based decision stratification to effectively detect and segment GTV_LNs in esophageal cancer radiotherapy. It significantly improves on the previous state of the art, and the achieved performance has potential clinical value given the large multi-user variability in this challenging task.
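
To illustrate the distance-based decision stratification described in MATERIALS/METHODS, the sketch below assigns each candidate lymph node to a tumor-proximal or tumor-distal branch by its physical distance to the primary tumor mask. This is a minimal sketch, not the authors' code; the 50 mm cutoff, function names, and input conventions are assumptions made purely for illustration.

```python
# Illustrative sketch (not the authors' implementation): stratify candidate
# lymph nodes into tumor-proximal vs. tumor-distal groups by distance to the
# primary tumor. The cutoff_mm value is an assumed placeholder.
import numpy as np
from scipy.ndimage import distance_transform_edt

def stratify_lymph_nodes(tumor_mask, ln_centroids_vox, voxel_spacing_mm, cutoff_mm=50.0):
    """Assign each lymph-node centroid to the 'proximal' or 'distal' branch.

    tumor_mask       : binary 3D array of the primary tumor (1 = tumor voxel)
    ln_centroids_vox : (N, 3) array of lymph-node centroids in voxel indices
    voxel_spacing_mm : (3,) physical voxel spacing of the CT grid in millimetres
    cutoff_mm        : distance threshold separating the two subcategories
    """
    # Distance (in mm) from every voxel to the nearest tumor voxel.
    dist_to_tumor = distance_transform_edt(tumor_mask == 0, sampling=voxel_spacing_mm)

    labels = []
    for z, y, x in np.round(ln_centroids_vox).astype(int):
        labels.append("proximal" if dist_to_tumor[z, y, x] <= cutoff_mm else "distal")
    return labels
```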
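
Likewise, the following is a minimal sketch of the multi-branch arrangement the abstract describes: a shared encoder with two separate decoders, one per GTV_LN subcategory, taking RTCT and registered PET as a two-channel input. It assumes PyTorch, and the network depth, channel widths, and layer choices are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch (not the authors' implementation): shared encoder with
# two decoders specializing in tumor-proximal and tumor-distal lymph nodes.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv3d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.InstanceNorm3d(out_ch),
        nn.ReLU(inplace=True),
    )

class SharedEncoderTwoDecoders(nn.Module):
    def __init__(self, in_channels=2, base=16):
        super().__init__()
        # Shared encoder trained on both LN subcategories.
        self.enc1 = conv_block(in_channels, base)
        self.enc2 = conv_block(base, base * 2)
        self.pool = nn.MaxPool3d(2)
        self.up = nn.Upsample(scale_factor=2, mode="trilinear", align_corners=False)
        # Separate decoders, one per subcategory.
        self.dec_proximal = conv_block(base * 2 + base, base)
        self.dec_distal = conv_block(base * 2 + base, base)
        self.head_proximal = nn.Conv3d(base, 1, kernel_size=1)
        self.head_distal = nn.Conv3d(base, 1, kernel_size=1)

    def forward(self, x):                  # x: (B, 2, D, H, W) = RTCT + PET channels
        f1 = self.enc1(x)                  # full-resolution features
        f2 = self.enc2(self.pool(f1))      # downsampled features
        skip = torch.cat([self.up(f2), f1], dim=1)   # simple skip connection
        prox = self.head_proximal(self.dec_proximal(skip))
        dist = self.head_distal(self.dec_distal(skip))
        return prox, dist                  # per-branch segmentation logits
```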

Volume 111 3S
Pages e87-e88
DOI 10.1016/j.ijrobp.2021.07.464
Language English
Journal International journal of radiation oncology, biology, physics
