Publications


Featured research published by Christoph Bloch.


Radiotherapy and Oncology | 2012

Monitoring tumor motion by real time 2D/3D registration during radiotherapy

Christelle Gendrin; Hugo Furtado; Christoph Weber; Christoph Bloch; Michael Figl; Supriyanto Ardjo Pawiro; Helmar Bergmann; M. Stock; Gabor Fichtinger; Dietmar Georg; Wolfgang Birkfellner

Background and purpose: In this paper, we investigate the possibility of using X-ray based real-time 2D/3D registration for non-invasive tumor motion monitoring during radiotherapy. Materials and methods: The 2D/3D registration scheme is implemented using general purpose computation on graphics hardware (GPGPU) programming techniques and several algorithmic refinements in the registration process. Validation is conducted off-line using a phantom and five clinical patient data sets. The registration is performed on a region of interest (ROI) centered around the planned target volume (PTV). Results: The phantom motion is measured with an rms error of 2.56 mm. For the patient data sets, a sinusoidal movement that clearly correlates with the breathing cycle is shown. Videos show a good match between X-ray and digitally reconstructed radiograph (DRR) displacement. Mean registration time is 0.5 s. Conclusions: We have demonstrated that real-time organ motion monitoring using image-based markerless registration is feasible.
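
As a rough illustration of the registration scheme described above, the following is a minimal NumPy/SciPy sketch of an intensity-based 2D/3D registration loop. It is not the authors' GPGPU implementation: the renderer uses a simple parallel projection in place of perspective DRR rendering, normalized cross-correlation stands in for the merit function, and the names render_drr, ncc and register are illustrative.

```python
# Minimal sketch of an intensity-based 2D/3D registration loop (illustrative only).
import numpy as np
from scipy.ndimage import affine_transform
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

def render_drr(volume, pose):
    """Very simplified DRR: rigidly transform the volume, then sum along one axis
    (a parallel projection, in place of the perspective rendering used in the paper)."""
    rx, ry, rz, tx, ty, tz = pose
    rot = Rotation.from_euler("xyz", [rx, ry, rz], degrees=True).as_matrix()
    center = (np.array(volume.shape) - 1) / 2.0
    offset = center - rot @ center - np.array([tx, ty, tz])
    moved = affine_transform(volume, rot, offset=offset, order=1)
    return moved.sum(axis=0)  # integrate intensities along the projection axis

def ncc(a, b):
    """Normalized cross-correlation used as the similarity (merit) function."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())

def register(volume, xray, initial_pose=None):
    """Search the six rigid pose parameters (rotations in degrees, translations in voxels)
    that make the rendered DRR most similar to the reference X-ray."""
    x0 = np.zeros(6) if initial_pose is None else np.asarray(initial_pose, float)
    cost = lambda pose: -ncc(render_drr(volume, pose), xray)
    return minimize(cost, x0, method="Powell").x
```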


Medical Physics | 2011

Validation for 2D/3D registration II: The comparison of intensity- and gradient-based merit functions using a new gold standard data set

Christelle Gendrin; Primož Markelj; Supriyanto Ardjo Pawiro; Jakob Spoerk; Christoph Bloch; Christoph Weber; Michael Figl; Helmar Bergmann; Wolfgang Birkfellner; Boštjan Likar; Franjo Pernuš

PURPOSE: A new gold standard data set for validation of 2D/3D registration based on a porcine cadaver head with attached fiducial markers was presented in the first part of this article. The advantage of this new phantom is the large amount of soft tissue, which simulates realistic conditions for registration. This article tests the performance of intensity- and gradient-based algorithms for 2D/3D registration using the new phantom data set. METHODS: Intensity-based methods with four merit functions, namely cross correlation, rank correlation, correlation ratio, and mutual information (MI), and two gradient-based algorithms, the backprojection gradient-based (BGB) registration method and the reconstruction gradient-based (RGB) registration method, were compared. Four volumes, consisting of CBCT with two fields of view, 64-slice multidetector CT, and magnetic resonance T1-weighted images, were registered to a pair of kV x-ray images and a pair of MV images. A standardized evaluation methodology was employed. Targets were evenly spread over the volumes, and 250 starting positions of the 3D volumes with initial displacements of up to 25 mm from the gold standard position were calculated. After the registration, the displacement from the gold standard was retrieved and the root mean square (RMS), mean, and standard deviation of the mean target registration errors (mTREs) over 250 registrations were derived. Additionally, the following merit properties were computed for better comparison of the robustness of each merit: accuracy, capture range, number of minima, risk of nonconvergence, and distinctiveness of optimum. RESULTS: Among the merit functions used for the intensity-based method, MI reached the best accuracy with an RMS mTRE down to 1.30 mm. Furthermore, it was the only merit function that could accurately register the CT to the kV x-rays in the presence of tissue deformation. As for the gradient-based methods, the BGB and RGB methods achieved subvoxel accuracy (RMS mTRE down to 0.56 and 0.70 mm, respectively). Overall, gradient-based similarity measures were found to be substantially more accurate than intensity-based methods, could cope with soft tissue deformation, and also enabled accurate registrations of the MR-T1 volume to the kV x-ray image. CONCLUSIONS: In this article, the authors demonstrate the usefulness of a new phantom image data set, featuring soft tissue deformation, for the evaluation of 2D/3D registration methods. The authors' evaluation shows that gradient-based methods are more accurate than intensity-based methods, especially when soft tissue deformation is present. However, the current nonoptimized implementations make them prohibitively slow for practical applications. On the other hand, the speed of the intensity-based methods renders them more suitable for clinical use, while their accuracy is still competitive.
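
For readers unfamiliar with the four intensity-based merit functions compared here, the following are simple NumPy/SciPy formulations of cross correlation, rank correlation, correlation ratio and mutual information between a DRR and a reference image. These are illustrative textbook versions, not the implementations evaluated in the paper; the histogram bin counts are arbitrary choices.

```python
# Illustrative versions of the four intensity-based merit functions.
import numpy as np
from scipy.stats import spearmanr

def cross_correlation(drr, ref):
    """Pearson correlation coefficient between the two images."""
    return float(np.corrcoef(drr.ravel(), ref.ravel())[0, 1])

def rank_correlation(drr, ref):
    """Spearman's rank correlation coefficient."""
    rho, _ = spearmanr(drr.ravel(), ref.ravel())
    return float(rho)

def correlation_ratio(drr, ref, bins=64):
    """Fraction of the variance of `ref` explained by the intensity class of `drr`."""
    edges = np.histogram_bin_edges(drr, bins)
    labels = np.digitize(drr.ravel(), edges)
    r = ref.ravel()
    cond_var = sum(r[labels == k].size * r[labels == k].var() for k in np.unique(labels))
    return 1.0 - cond_var / (r.size * r.var())

def mutual_information(drr, ref, bins=64):
    """Mutual information estimated from a joint intensity histogram."""
    joint, _, _ = np.histogram2d(drr.ravel(), ref.ravel(), bins=bins)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz])))
```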


Zeitschrift für Medizinische Physik | 2012

High-performance GPU-based rendering for real-time, rigid 2D/3D-image registration and motion prediction in radiation oncology

Jakob Spoerk; Christelle Gendrin; Christoph Weber; Michael Figl; Supriyanto Ardjo Pawiro; Hugo Furtado; Daniella Fabri; Christoph Bloch; Helmar Bergmann; Eduard Gröller; Wolfgang Birkfellner

A common problem in image-guided radiation therapy (IGRT) of lung cancer as well as other malignant diseases is the compensation of periodic and aperiodic motion during dose delivery. Modern systems for image-guided radiation oncology allow for the acquisition of cone-beam computed tomography data in the treatment room as well as the acquisition of planar radiographs during the treatment. A mid-term research goal is the compensation of tumor target volume motion by 2D/3D registration. In 2D/3D registration, spatial information on organ location is derived by an iterative comparison of perspective volume renderings, so-called digitally rendered radiographs (DRR), from computed tomography volume data and planar reference x-rays. Currently, this rendering process is very time consuming, and real-time registration, which should at least provide data on organ position in less than a second, has not yet been achieved. We present two GPU-based rendering algorithms which generate a DRR of 512 × 512 pixels from a CT dataset of 53 MB at a pace of almost 100 Hz. This rendering rate is made possible by applying a number of algorithmic simplifications, ranging from alternative volume-driven rendering approaches, namely so-called wobbled splatting, to sub-sampling of the DRR image by means of specialized ray-casting techniques. Furthermore, general purpose graphics processing unit (GPGPU) programming paradigms were used consistently. Rendering quality and performance, as well as the influence on the quality and performance of the overall registration process, were measured and analyzed in detail. The results show that both methods are competitive and pave the way for fast motion compensation by rigid and possibly even non-rigid 2D/3D registration and, beyond that, adaptive filtering of motion models in IGRT.
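
To make the rendering step concrete, here is a compact CPU sketch of perspective ray casting for DRR generation. It is neither wobbled splatting nor the GPU implementation described in the paper; the detector geometry parameters (src, det_origin, det_u, det_v) and the fixed sampling scheme are assumptions made for illustration.

```python
# Compact CPU sketch of perspective ray casting for DRR generation (illustrative only).
import numpy as np
from scipy.ndimage import map_coordinates

def drr_ray_cast(volume, src, det_origin, det_u, det_v, det_shape=(128, 128), n_samples=256):
    """Integrate the CT volume along rays from the X-ray source `src` to each detector
    pixel. `det_origin` is one detector corner and `det_u`/`det_v` are the pixel step
    vectors, all given in voxel coordinates."""
    iu, iv = np.meshgrid(np.arange(det_shape[0]), np.arange(det_shape[1]), indexing="ij")
    pixels = det_origin + iu[..., None] * det_u + iv[..., None] * det_v      # (H, W, 3)
    t = np.linspace(0.0, 1.0, n_samples)                                     # ray parameter
    rays = src + t[None, None, :, None] * (pixels[:, :, None, :] - src)      # (H, W, S, 3)
    samples = map_coordinates(volume, rays.reshape(-1, 3).T, order=1, mode="constant")
    ray_len = np.linalg.norm(pixels - src, axis=-1)                          # per-pixel ray length
    return samples.reshape(*det_shape, n_samples).sum(axis=-1) * ray_len / n_samples
```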


Zeitschrift für Medizinische Physik | 2013

A quantitative comparison of the performance of three deformable registration algorithms in radiotherapy

Daniella Fabri; Valentina Zambrano; Amon Bhatia; Hugo Furtado; Helmar Bergmann; M. Stock; Christoph Bloch; C. Lütgendorf-Caucig; Supriyanto Ardjo Pawiro; Dietmar Georg; Wolfgang Birkfellner; Michael Figl

We present an evaluation of various non-rigid registration algorithms for the purpose of compensating for interfractional motion of the target volume and organs at risk when CBCT image data are acquired prior to irradiation. Three different deformable registration (DR) methods were used: the Demons algorithm implemented in the iPlan software (BrainLAB AG, Feldkirchen, Germany) and two custom-developed piecewise methods using either a normalized correlation or a mutual information metric (featureletNC and featureletMI). These methods were tested on data acquired with a novel purpose-built phantom for deformable registration and on clinical CT/CBCT data of prostate and lung cancer patients. For the structures in question, the Dice similarity coefficient (DSC) between manually drawn contours and the contours generated from the derived deformation field was compared to the result obtained with rigid registration (RR). For the phantom, the piecewise methods were slightly superior: featureletNC for the intramodality and featureletMI for the intermodality registrations. For the prostate cases, the DSC improved over RR in less than 50% of the images studied. Deformable registration methods improved the outcome over rigid registration for the lung cases and in the phantom study, but not significantly for the prostate study. A significantly superior deformation method could not be identified.
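
The Dice similarity coefficient used as the evaluation measure in this study can be stated in a few lines; the following is a straightforward NumPy sketch for binary structure masks.

```python
# Dice similarity coefficient between two binary structure masks.
import numpy as np

def dice(mask_a, mask_b):
    """DSC = 2|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap, 0.0 means none."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0
```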


Physics in Medicine and Biology | 2010

Efficient implementation of the rank correlation merit function for 2D/3D registration

Michael Figl; Christoph Bloch; Christelle Gendrin; Christoph Weber; Supriyanto Ardjo Pawiro; Johann Hummel; Primož Markelj; Franjo Pernuš; Helmar Bergmann; Wolfgang Birkfellner

A growing number of clinical applications using 2D/3D registration have been presented recently. Usually, a digitally reconstructed radiograph (DRR) is compared iteratively to an x-ray image of known projection geometry until a match is achieved, thus providing the six degrees of freedom of rigid motion, which can be used for patient setup in image-guided radiation therapy or computer-assisted interventions. Recently, stochastic rank correlation, a merit function based on Spearman's rank correlation coefficient, was presented as a merit function especially suitable for 2D/3D registration. The advantage of this measure is its robustness against variations in image histogram content and its wide convergence range. There, the considerable computational expense of computing an ordered rank list is avoided by comparing randomly chosen subsets of the DRR and the reference x-ray. In this work, we show that it is possible to omit the sorting step and to compute the rank correlation coefficient of the full image content as fast as conventional merit functions. Our evaluation on a well-calibrated cadaver phantom also confirms that rank correlation-type merit functions give the most accurate results if large differences in histogram content between the DRR and the x-ray image are present.
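
For comparison, the following is a baseline rank correlation merit that still performs the explicit sort this paper sets out to avoid: ranks are obtained by sorting both images and the Pearson correlation of the ranks is taken. It illustrates where the O(n log n) cost arises and is not the paper's efficient sorting-free implementation.

```python
# Baseline rank correlation merit with an explicit sort (illustrative only).
import numpy as np

def ranks(x):
    """Rank transform via argsort (the O(n log n) step); ties keep input order here."""
    order = np.argsort(x, kind="stable")
    r = np.empty_like(order)
    r[order] = np.arange(x.size)
    return r

def rank_correlation_merit(drr, ref):
    """Pearson correlation of the rank-transformed images (Spearman's rho without tie handling)."""
    rd = ranks(drr.ravel()).astype(float)
    rr = ranks(ref.ravel()).astype(float)
    rd -= rd.mean()
    rr -= rr.mean()
    return float((rd * rr).sum() / np.sqrt((rd * rd).sum() * (rr * rr).sum()))
```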


Proceedings of SPIE | 2010

A new gold-standard dataset for 2D/3D image registration evaluation

Supriyanto Ardjo Pawiro; Primož Markelj; Christelle Gendrin; Michael Figl; M. Stock; Christoph Bloch; Christoph Weber; Ewald Unger; Iris Nöbauer; Franz Kainberger; Helga Bergmeister; Dietmar Georg; Helmar Bergmann; Wolfgang Birkfellner

In this paper, we propose a new gold standard data set for the validation of 2D/3D image registration algorithms for image-guided radiotherapy. The gold standard data set was calculated using a pig head with attached fiducial markers. We used several imaging modalities common in diagnostic imaging or radiotherapy, including 64-slice computed tomography (CT), magnetic resonance imaging (MRI) using T1, T2 and proton density (PD) sequences, and cone beam CT (CBCT) imaging data. Radiographic data were acquired using kilovoltage (kV) and megavoltage (MV) imaging techniques. The image information reflects both anatomy and reliable fiducial marker information, and improves on existing data sets in the level of anatomical detail and image data quality. The markers in the three-dimensional (3D) and two-dimensional (2D) images were segmented using Analyze 9.0 (AnalyzeDirect, Inc.) and in-house software. The projection distance errors (PDE) and the expected target registration errors (TRE) over all the image data sets were found to be less than 1.7 mm and 1.3 mm, respectively. The gold standard data set, obtained with state-of-the-art imaging technology, has the potential to improve the validation of 2D/3D registration algorithms for image-guided therapy.
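
Gold standard data sets like this one are typically used to score candidate registrations by a target registration error; the following is a minimal sketch of a mean TRE computation between an estimated and a gold-standard rigid transform. It is a generic illustration, not the specific PDE/TRE evaluation procedure reported in the paper.

```python
# Mean target registration error between an estimated and a gold-standard rigid transform.
import numpy as np

def rigid_apply(points, rotation, translation):
    """Apply a rigid transform (3x3 rotation, 3-vector translation) to Nx3 points."""
    return points @ rotation.T + translation

def mtre(targets, est_R, est_t, gold_R, gold_t):
    """Mean Euclidean distance between target points mapped by the two transforms."""
    mapped_est = rigid_apply(targets, est_R, est_t)
    mapped_gold = rigid_apply(targets, gold_R, gold_t)
    return float(np.linalg.norm(mapped_est - mapped_gold, axis=1).mean())
```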


Proceedings of SPIE | 2012

Real-time 2D/3D registration for tumor motion tracking during radiotherapy

Hugo Furtado; Christelle Gendrin; Christoph Bloch; Jakob Spoerk; Supriyanto Ardjo Pawiro; Christoph Weber; Michael Figl; M. Stock; Dietmar Georg; Helmar Bergmann; Wolfgang Birkfellner

Organ motion during radiotherapy is one of the causes of uncertainty in dose delivery. To cope with this, the planned target volume (PTV) has to be enlarged to guarantee full tumor irradiation. Existing methods deal with the problem by performing tumor tracking using implanted fiducial markers or magnetic sensors. In this work, we investigate the feasibility of using x-ray based real-time 2D/3D registration for non-invasive tumor motion tracking during radiotherapy. Our method uses purely intensity-based techniques, thus avoiding markers or fiducials. X-rays are acquired during treatment at a rate of 5.4 Hz. We iteratively compare each x-ray with a set of digitally reconstructed radiographs (DRR) generated from the planning volume dataset, finding the optimal match between the x-ray and one of the DRRs. The DRRs are generated using a ray-casting algorithm, implemented with general purpose computation on graphics hardware (GPGPU) programming techniques using CUDA for greater performance. Validation is conducted off-line using a phantom and five clinical patient data sets. The registration is performed on a region of interest (ROI) centered around the PTV. The phantom motion is measured with an rms error of 2.1 mm and the mean registration time is 220 ms. For the patient data sets, a sinusoidal movement that clearly correlates with the breathing cycle is seen. Mean registration time is always under 105 ms, which is well suited for our purposes. These results demonstrate that real-time organ motion monitoring using image-based markerless registration is feasible.
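
A sketch of how such per-frame registrations might be strung together into a tracking loop is shown below; register_frame is a hypothetical callable (x-ray, initial pose) -> pose standing in for a 2D/3D registration such as the one sketched earlier in this section, and the warm-start strategy is an illustrative assumption rather than a description of the authors' implementation.

```python
# Illustrative frame-by-frame tracking loop with warm-started registrations.
import numpy as np

def track_motion(xray_stream, register_frame, initial_pose=None):
    """Register each incoming X-ray starting from the previous frame's pose estimate."""
    pose = np.zeros(6) if initial_pose is None else np.asarray(initial_pose, float)
    trajectory = []
    for xray in xray_stream:                 # frames arrive at ~5.4 Hz during treatment
        pose = register_frame(xray, pose)    # warm-start keeps the search near the optimum
        trajectory.append(np.asarray(pose, float).copy())
    return np.stack(trajectory)              # one 6-parameter rigid pose per frame
```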


Proceedings of SPIE | 2010

Towards real-time 2D/3D registration for organ motion monitoring in image-guided radiation therapy

Christelle Gendrin; Jakob Spoerk; Christoph Bloch; Supriyanto Ardjo Pawiro; Christoph Weber; Michael Figl; Primož Markelj; Franjo Pernuš; Dietmar Georg; Helmar Bergmann; Wolfgang Birkfellner

Nowadays, radiation therapy systems incorporate kV imaging units which allow for the real-time acquisition of intra-fractional X-ray images of the patient with high detail and contrast. An application of this technology is tumor motion monitoring during irradiation. For tumor tracking, implanted markers or position sensors are used, which requires an intervention. 2D/3D intensity-based registration is an alternative, non-invasive method, but the procedure must be accelerated to the update rate of the imaging device, which lies in the range of 5 Hz. In this paper we investigate fast 2D/3D registration of a CT volume to a single kV X-ray image using a new porcine reference phantom with seven implanted fiducial markers. Several parameters influencing the speed and accuracy of the registrations are investigated. First, four intensity-based merit functions, namely cross-correlation, rank correlation, mutual information and correlation ratio, are compared. Secondly, wobbled splatting and ray casting rendering techniques are implemented on the GPU, and the influence of each algorithm on the performance of 2D/3D registration is evaluated. Rendering times of 20 ms for a single DRR were achieved. Different thresholds of the CT volume were also examined for rendering to find the setting that achieves the best possible correspondence with the X-ray images. Fast registrations below 4 s became possible, with an in-plane accuracy down to 0.8 mm.
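
The CT thresholding mentioned above can be illustrated with a small mask that suppresses voxels below a chosen HU value before DRR rendering, so that mainly high-density structures contribute; the threshold used here is an arbitrary example, not the setting reported in the paper.

```python
# Suppress low-density voxels before DRR rendering (threshold value is illustrative).
import numpy as np

def threshold_ct(volume_hu, lower_hu=150):
    """Replace voxels below lower_hu with the volume minimum so they contribute little to the DRR."""
    return np.where(volume_hu >= lower_hu, volume_hu, volume_hu.min())
```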


Proceedings of SPIE | 2011

Comparison of two navigation system designs for flexible endoscopes using abdominal 3D ultrasound

Marcus Kaar; Rainer Hoffmann; Helmar Bergmann; Michael Figl; Christoph Bloch; Alfred Kratochwil; Wolfgang Birkfellner; Johann Hummel

This paper describes a navigation system for flexible endoscopes equipped with ultrasound scan heads. For navigation and needle biopsy procedures, it provides additional oblique slices from preoperative computed tomography (CT) volumes which are displayed alongside the corresponding endoscopic ultrasound (US) image. In contrast to similar systems, an additional abdominal 3D ultrasound image is used to achieve the required registration. Two different approaches are compared: the first method is based on direct inter-modal registration between the abdominal 3D ultrasound and the CT volume. The second method uses another 3D US scan taken preoperatively, before the CT scan. Here, the CT is calibrated by means of an optical tracking system, and the transformation between the CT and the calibrated 3D US can be calculated without image registration. Before the intervention, a pre-interventional 3D US is registered intra-modally to the preoperative US. This second method proved to be the more robust and accurate procedure. For experimental studies, a phantom was developed which consists of a plastic tube inside a water tank. For error evaluation, small plastic spheres were fixed around the tube at different distances. First results give an overall error of 3.9 mm for the first method, while the overall error for the intramodal method amounted to 3.1 mm.
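
The second, intra-modal approach amounts to chaining a tracking-based CT-to-US calibration with an intra-modal US-to-US registration; the following sketch shows that composition with 4x4 homogeneous transforms. The matrix names are illustrative placeholders for transforms the system would measure or estimate.

```python
# Chaining rigid transforms as 4x4 homogeneous matrices (illustrative placeholders).
import numpy as np

def compose(*transforms):
    """Compose 4x4 homogeneous transforms; like matrix products, the rightmost is applied first."""
    out = np.eye(4)
    for T in transforms:
        out = out @ T
    return out

def map_points(T, points):
    """Apply a 4x4 homogeneous transform to Nx3 points."""
    homog = np.hstack([points, np.ones((len(points), 1))])
    return (homog @ T.T)[:, :3]

# T_preUS_from_CT: from optical-tracking-based calibration (no image registration needed)
# T_intraUS_from_preUS: from intra-modal registration of the two 3D US volumes
# T_intraUS_from_CT = compose(T_intraUS_from_preUS, T_preUS_from_CT)
```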


Bildverarbeitung für die Medizin | 2012

Real-Time Intensity Based 2D/3D Registration for Tumor Motion Tracking During Radiotherapy

Hugo Furtado; Christelle Gendrin; Christoph Bloch; Jakob Spoerk; Suprianto A. Pawiro; Christoph Weber; Michael Figl; Helmar Bergmann; M. Stock; Dietmar Georg; Wolfgang Birkfellner

Organ motion during radiotherapy is one of the causes of uncertainty in dose delivery, creating the need to enlarge the planned target volume (PTV) to guarantee full tumor irradiation. In this work, we investigate the feasibility of using real-time 2D/3D registration for tumor motion tracking during radiotherapy based on purely intensity-based image processing, thus avoiding markers or fiducials. X-rays are acquired during treatment at a rate of 5.4 Hz. We iteratively compare each x-ray with a set of digitally reconstructed radiographs (DRR) generated from the planning volume dataset, finding the optimal match between the x-ray and one of the DRRs. The DRRs are generated using a ray-casting algorithm, implemented using general purpose computation on graphics hardware (GPGPU) for best performance. Validation is conducted offline using a phantom and five clinical patient data sets. The phantom motion is measured with an RMS error of 2.1 mm and the mean registration time is 220 ms. For the patient data sets, a sinusoidal movement that clearly correlates with the breathing cycle is seen. Mean registration time is always under 105 ms, which is well suited for our purposes. These results demonstrate that real-time organ motion monitoring using image-based markerless registration is feasible.

Collaboration


Dive into Christoph Bloch's collaborations.

Top Co-Authors

Michael Figl, Medical University of Vienna
Wolfgang Birkfellner, Medical University of Vienna
Helmar Bergmann, Medical University of Vienna
Christelle Gendrin, Medical University of Vienna
Christoph Weber, Medical University of Vienna
Dietmar Georg, Medical University of Vienna
M. Stock, Medical University of Vienna
Hugo Furtado, Medical University of Vienna
Jakob Spoerk, Medical University of Vienna