
Publications


Featured research published by Hugo Furtado.


Radiotherapy and Oncology | 2012

Monitoring tumor motion by real time 2D/3D registration during radiotherapy

Christelle Gendrin; Hugo Furtado; Christoph Weber; Christoph Bloch; Michael Figl; Supriyanto Ardjo Pawiro; Helmar Bergmann; M. Stock; Gabor Fichtinger; Dietmar Georg; Wolfgang Birkfellner

Background and purpose: In this paper, we investigate the possibility of using X-ray-based real-time 2D/3D registration for non-invasive tumor motion monitoring during radiotherapy. Materials and methods: The 2D/3D registration scheme is implemented using general-purpose computation on graphics hardware (GPGPU) programming techniques and several algorithmic refinements in the registration process. Validation is conducted off-line using a phantom and five clinical patient data sets. The registration is performed on a region of interest (ROI) centered around the planning target volume (PTV). Results: The phantom motion is measured with an RMS error of 2.56 mm. For the patient data sets, a sinusoidal movement that clearly correlates with the breathing cycle is shown. Videos show a good match between X-ray and digitally reconstructed radiograph (DRR) displacement. Mean registration time is 0.5 s. Conclusions: We have demonstrated that real-time organ motion monitoring using image-based markerless registration is feasible.
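The approach described above can be sketched as an optimizer that repeatedly renders a DRR at a candidate pose and scores it against the acquired X-ray over the ROI. The following Python sketch is illustrative only; render_drr is a hypothetical placeholder for the GPU renderer and the pose parameterization is an assumption, not the paper's implementation.

```python
# Minimal sketch of intensity-based 2D/3D registration (illustrative only).
# render_drr() is a hypothetical stand-in for the GPGPU DRR renderer: it takes
# the CT volume and a candidate rigid pose and returns a simulated projection
# cropped to the same ROI as the X-ray.
import numpy as np
from scipy.optimize import minimize

def normalized_cross_correlation(a, b):
    """Intensity similarity between the X-ray ROI and a DRR ROI."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())

def register_2d3d(xray_roi, ct_volume, render_drr, pose0):
    """Return the rigid pose whose DRR best matches the measured X-ray ROI."""
    def cost(pose):
        drr_roi = render_drr(ct_volume, pose)       # simulate the projection at this pose
        return -normalized_cross_correlation(xray_roi, drr_roi)
    # Gradient-free optimizer, since the image metric is not smoothly differentiable.
    result = minimize(cost, pose0, method="Powell")
    return result.x
```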


Archive | 2010

Augmented Reality for Minimally Invasive Surgery: Overview and Some Recent Advances

Pablo Lamata; Wajid Ali; Alicia M. Cano; Jordi Cornella; Jerome Declerck; Ole Jakob Elle; Adinda Freudenthal; Hugo Furtado; Denis Kalkofen; Edvard Naerum; Eigil Samset; Patricia Sánchez-González; Francisco M. Sánchez-Margallo; Dieter Schmalstieg; Mauro Sette; Thomas Stüdeli; Jos Vander Sloten; Enrique J. Gómez

Authors and affiliations: Pablo Lamata (1, 2); Wajid Ali (3); Alicia Cano (1); Jordi Cornella (3); Jerome Declerck (2); Ole J. Elle (3); Adinda Freudenthal (4); Hugo Furtado (5); Denis Kalkofen (6); Edvard Naerum (3); Eigil Samset (3); Patricia Sanchez-Gonzalez (1); Francisco M. Sanchez-Margallo (7); Dieter Schmalstieg (6); Mauro Sette (8); Thomas Studeli (4); Jos Vander Sloten (8); Enrique J. Gomez (1). 1: Universidad Politecnica de Madrid, Spain; 2: Siemens, United Kingdom; 3: University of Oslo, Norway; 4: Delft University of Technology, Netherlands; 5: Medical Centre Ljubljana, Slovenia; 6: Graz University of Technology, Austria; 7: Minimally Invasive Surgery Centre Jesus Uson, Spain; 8: University of Leuven, Belgium.


Zeitschrift für Medizinische Physik | 2012

High-performance GPU-based rendering for real-time, rigid 2D/3D-image registration and motion prediction in radiation oncology

Jakob Spoerk; Christelle Gendrin; Christoph Weber; Michael Figl; Supriyanto Ardjo Pawiro; Hugo Furtado; Daniella Fabri; Christoph Bloch; Helmar Bergmann; Eduard Gröller; Wolfgang Birkfellner

A common problem in image-guided radiation therapy (IGRT) of lung cancer, as well as other malignant diseases, is the compensation of periodic and aperiodic motion during dose delivery. Modern systems for image-guided radiation oncology allow for the acquisition of cone-beam computed tomography data in the treatment room as well as the acquisition of planar radiographs during the treatment. A mid-term research goal is the compensation of tumor target volume motion by 2D/3D registration. In 2D/3D registration, spatial information on organ location is derived by an iterative comparison of perspective volume renderings, so-called digitally reconstructed radiographs (DRRs) from computed tomography volume data, and planar reference X-rays. Currently, this rendering process is very time consuming, and real-time registration, which should at a minimum provide data on organ position in less than a second, has not yet been achieved. We present two GPU-based rendering algorithms which generate a 512 × 512 pixel DRR from a 53 MB CT dataset at a rate of almost 100 Hz. This rendering rate is achieved by applying a number of algorithmic simplifications, ranging from alternative volume-driven rendering approaches - namely so-called wobbled splatting - to sub-sampling of the DRR image by means of specialized raycasting techniques. Furthermore, general-purpose graphics processing unit (GPGPU) programming paradigms were consistently utilized. Rendering quality and performance, as well as the influence on the quality and performance of the overall registration process, were measured and analyzed in detail. The results show that both methods are competitive and pave the way for fast motion compensation by rigid and possibly even non-rigid 2D/3D registration and, beyond that, adaptive filtering of motion models in IGRT.
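At its core a DRR is a set of line integrals of the CT volume along the imaging rays, and the sub-sampling mentioned above trades detector resolution for speed by casting fewer rays. The sketch below is a deliberately simplified parallel-beam NumPy version, not the perspective, GPU-based wobbled-splatting or raycasting renderers evaluated in the paper.

```python
# Simplified DRR sketch: parallel-beam line integrals along one volume axis.
# Sub-sampling the detector grid means casting fewer rays, which is the basic
# quality-for-speed trade-off behind the specialized raycasting described above.
import numpy as np

def drr_parallel(ct, step=1):
    """Cast one parallel ray per retained detector pixel (every step-th row/column)."""
    return ct[::step, ::step, :].sum(axis=2)

ct = np.random.rand(128, 128, 128).astype(np.float32)  # stand-in CT volume
drr_full = drr_parallel(ct, step=1)   # 128 x 128 projection
drr_fast = drr_parallel(ct, step=2)   # 64 x 64 projection, roughly 4x fewer rays
```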


Acta Oncologica | 2013

Real-time 2D/3D registration using kV-MV image pairs for tumor motion tracking in image guided radiotherapy

Hugo Furtado; Elisabeth Steiner; M. Stock; Dietmar Georg; Wolfgang Birkfellner

Intra-fractional respiratory motion during radiotherapy leads to a larger planning target volume (PTV). Real-time tumor motion tracking by two-dimensional (2D)/3D registration using on-board kilo-voltage (kV) imaging can allow for a reduction of the PTV, though motion along the imaging beam axis cannot be resolved using only one projection image. We present a retrospective patient study investigating the impact of paired portal mega-voltage (MV) and kV images on registration accuracy. Material and methods. We used data from 10 patients with non-small cell lung cancer (NSCLC) undergoing stereotactic body radiation therapy (SBRT). For each patient we acquired a planning computed tomography (CT) scan and sequences of kV and MV images during treatment. We compared the accuracy of motion tracking in six degrees of freedom (DOF) using either the anterior-posterior (AP) kV sequence alone or the sequence of kV-MV image pairs. Results. Motion along the cranial-caudal direction could be accurately extracted when using only the kV sequence, but in the AP direction we obtained large errors. When using kV-MV pairs, the average error was reduced from 2.9 mm to 1.5 mm and the motion along AP was successfully extracted. Mean registration time was 188 ms. Conclusion. Our evaluation shows that using kV-MV image pairs leads to improved motion extraction in six DOF and is suitable for real-time tumor motion tracking with a conventional LINAC.
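The geometric reason a single projection cannot recover motion along its own beam axis, and why a second, roughly orthogonal MV view restores it, can be illustrated with a toy orthographic model. The matrices below are a hypothetical simplification, not the actual treatment-machine geometry used in the study.

```python
# Toy illustration: motion along a beam axis is invisible in that beam's image,
# but is captured by a second, roughly orthogonal view (here kV along y, MV along x).
import numpy as np

P_kv = np.array([[1, 0, 0],   # kV projection keeps x and z, drops y (the AP beam axis)
                 [0, 0, 1]])
P_mv = np.array([[0, 1, 0],   # MV projection keeps y and z, drops x
                 [0, 0, 1]])

shift = np.array([0.0, 5.0, 0.0])  # 5 mm tumor motion along the kV (AP) beam axis

print(P_kv @ shift)  # [0. 0.] -> the kV image alone cannot see this motion
print(P_mv @ shift)  # [5. 0.] -> the paired MV view recovers it
```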


Journal of Radiation Research | 2013

Performance validation of deformable image registration in the pelvic region

V. Zambrano; Hugo Furtado; Daniella Fabri; C. Lütgendorf-Caucig; Joanna Góra; M. Stock; Ramona Mayer; Wolfgang Birkfellner; Dietmar Georg

Patients undergoing radiotherapy will inevitably show anatomical changes during the course of treatment. These include weight loss, tumour shrinkage, and organ motion or filling changes. For advanced and adaptive radiotherapy (ART), information about anatomical changes must be extracted from repeated images in order to evaluate and manage these changes. Deformable image registration (DIR) is a tool that can be used to efficiently gather information about anatomical changes. The aim of the present study was to evaluate the performance of two DIR methods for automatic organ-at-risk (OAR) contour propagation. Datasets from ten gynaecological patients with repeated computed tomography (CT) and cone beam computed tomography (CBCT) scans were collected. Contours were delineated on the planning CT and on every repeated scan by an expert clinician. DIR using our in-house developed featurelet-based method and the iPlan® BrainLab treatment planning system software was performed with the planning CT as reference and a selection of repeated scans as the target dataset. The planning CT contours were deformed using the resulting deformation fields and compared to the manually defined contours. Dice similarity coefficients (DSCs) were calculated for each structure on each fractional patient scan, comparing the volume overlap using DIR with that using rigid registration only. No significant improvement in volume overlap was found after DIR as compared with rigid registration, independent of which image modality or DIR method was used. DIR needs to be further improved in order to facilitate contour propagation in the pelvic region in ART approaches.
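The evaluation metric used above, the Dice similarity coefficient, can be computed directly from binary contour masks. A minimal sketch, assuming the propagated and manual contours are available as boolean voxel masks:

```python
# Dice similarity coefficient between a propagated contour and the manual
# reference, both given as binary voxel masks (sketch of the evaluation metric,
# not of the registration itself).
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """DSC = 2|A ∩ B| / (|A| + |B|); 1.0 is perfect overlap, 0.0 is no overlap."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both structures empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom
```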


Technology in Cancer Research & Treatment | 2015

Clinical Assessment of 2D/3D Registration Accuracy in 4 Major Anatomic Sites Using On-Board 2D Kilovoltage Images for 6D Patient Setup

Guang Li; T. Jonathan Yang; Hugo Furtado; Wolfgang Birkfellner; Simon N. Powell; James Mechalakos

To provide a comprehensive assessment of patient setup accuracy in 6 degrees of freedom (DOF) using 2-dimensional/3-dimensional (2D/3D) image registration with on-board 2-dimensional kilovoltage (OB-2DkV) radiographic images, we evaluated cranial, head and neck (HN), thoracic, and abdominal sites under clinical conditions. A fast 2D/3D image registration method using a graphics processing unit (GPU) was modified for registration between OB-2DkV and 3D simulation computed tomography (simCT) images, with 3D/3D registration as the gold standard for 6DOF alignment. In 2D/3D registration, body roll rotation was obtained solely by matching orthogonal OB-2DkV images with a series of digitally reconstructed radiographs (DRRs) from simCT generated at small rotational increments along the gantry rotation axis. Window/level adjustments for optimal visualization of bone in the OB-2DkV images and DRRs were performed prior to registration. Ideal patient alignment at the isocenter was calculated and used as the initial registration position. In 3D/3D registration, cone-beam CT (CBCT) was aligned to simCT on bony structures using a bone density filter in 6DOF. Included in this retrospective study were 37 patients treated in 55 fractions with frameless stereotactic radiosurgery or stereotactic body radiotherapy for cranial and paraspinal cancer. A cranial phantom served as a control. In all cases, CBCT images were acquired for patient setup with subsequent OB-2DkV verification. The accuracy of the 2D/3D registration was 0.0 ± 0.5 mm and 0.1° ± 0.4° in the phantom. In patients, it is site-dependent due to deformation of the anatomy: 0.2 ± 1.6 mm and −0.4° ± 1.2° on average for each dimension for the cranial site, 0.7 ± 1.6 mm and 0.3° ± 1.3° for HN, 0.7 ± 2.0 mm and −0.7° ± 1.1° for the thorax, and 1.1 ± 2.6 mm and −0.5° ± 1.9° for the abdomen. Anatomical deformation and the presence of soft tissue in 2D/3D registration affect the consistency with 3D/3D registration in 6DOF: the discrepancy increases in the superior-to-inferior direction.
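The roll-rotation step described above amounts to a one-dimensional search: render DRRs at small angular increments about the gantry axis and keep the angle whose DRR best matches the on-board kV image. A minimal sketch, where render_drr_at_roll is a hypothetical placeholder for the DRR renderer and the search range and increment are illustrative values:

```python
# One-dimensional roll-angle search: score DRRs rendered at small rotational
# increments against the on-board kV image and return the best-matching angle.
# render_drr_at_roll() is a hypothetical stand-in for the DRR renderer.
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two images."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())

def estimate_roll(kv_image, sim_ct, render_drr_at_roll,
                  search_range_deg=3.0, increment_deg=0.2):
    angles = np.arange(-search_range_deg, search_range_deg + 1e-9, increment_deg)
    scores = [ncc(kv_image, render_drr_at_roll(sim_ct, angle)) for angle in angles]
    return float(angles[int(np.argmax(scores))])
```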


Zeitschrift für Medizinische Physik | 2013

A quantitative comparison of the performance of three deformable registration algorithms in radiotherapy

Daniella Fabri; Valentina Zambrano; Amon Bhatia; Hugo Furtado; Helmar Bergmann; M. Stock; Christoph Bloch; C. Lütgendorf-Caucig; Supriyanto Ardjo Pawiro; Dietmar Georg; Wolfgang Birkfellner; Michael Figl

We present an evaluation of various non-rigid registration algorithms for the purpose of compensating interfractional motion of the target volume and organs at risk when acquiring CBCT image data prior to irradiation. Three different deformable registration (DR) methods were used: the Demons algorithm implemented in the iPlan software (BrainLAB AG, Feldkirchen, Germany) and two custom-developed piecewise methods using either a normalized correlation or a mutual information metric (featureletNC and featureletMI). These methods were tested on data acquired using a novel purpose-built phantom for deformable registration and on clinical CT/CBCT data of prostate and lung cancer patients. The Dice similarity coefficient (DSC) between manually drawn contours and the contours of the structures in question generated by the derived deformation field was compared to the result obtained with rigid registration (RR). For the phantom, the piecewise methods were slightly superior: featureletNC for the intramodality and featureletMI for the intermodality registrations. For the prostate cases, the DSC improved over RR in less than 50% of the images studied. Deformable registration methods improved the outcome over rigid registration for the lung cases and in the phantom study, but not significantly for the prostate study. A significantly superior deformation method could not be identified.
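The two block-matching metrics named above can be sketched compactly: normalized correlation for intramodality matching and mutual information for intermodality matching of small image blocks ("featurelets"). The histogram binning below is an illustrative assumption, not the study's setting:

```python
# Similarity metrics for piecewise ("featurelet") block matching: normalized
# correlation for intramodality blocks and mutual information for intermodality
# blocks. Binning and block handling are illustrative, not the study's settings.
import numpy as np

def normalized_correlation(a, b):
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def mutual_information(a, b, bins=32):
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()                      # joint intensity distribution
    px = pxy.sum(axis=1, keepdims=True)            # marginal of image a
    py = pxy.sum(axis=0, keepdims=True)            # marginal of image b
    nonzero = pxy > 0
    return float((pxy[nonzero] * np.log(pxy[nonzero] / (px @ py)[nonzero])).sum())
```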


Proceedings of SPIE | 2012

Real-time 2D/3D registration for tumor motion tracking during radiotherapy

Hugo Furtado; Christelle Gendrin; Christoph Bloch; Jakob Spoerk; Supriyanto Ardjo Pawiro; Christoph Weber; Michael Figl; M. Stock; Dietmar Georg; Helmar Bergmann; Wolfgang Birkfellner

Organ motion during radiotherapy is one of the causes of uncertainty in dose delivery. To cope with this, the planning target volume (PTV) has to be enlarged to guarantee full tumor irradiation. Existing methods deal with the problem by performing tumor tracking using implanted fiducial markers or magnetic sensors. In this work, we investigate the feasibility of using X-ray-based real-time 2D/3D registration for non-invasive tumor motion tracking during radiotherapy. Our method uses purely intensity-based techniques, thus avoiding markers or fiducials. X-rays are acquired during treatment at a rate of 5.4 Hz. We iteratively compare each X-ray with a set of digitally reconstructed radiographs (DRRs) generated from the planning volume dataset, finding the optimal match between the X-ray and one of the DRRs. The DRRs are generated using a ray-casting algorithm, implemented with general-purpose computation on graphics hardware (GPGPU) programming techniques using CUDA for greater performance. Validation is conducted off-line using a phantom and five clinical patient data sets. The registration is performed on a region of interest (ROI) centered around the PTV. The phantom motion is measured with an RMS error of 2.1 mm and mean registration time is 220 ms. For the patient data sets, a sinusoidal movement that clearly correlates with the breathing cycle is seen. Mean registration time is always under 105 ms, which is well suited to our purposes. These results demonstrate that real-time organ motion monitoring using image-based markerless registration is feasible.


Proceedings of SPIE | 2016

FIRE: an open-software suite for real-time 2D/3D image registration for image guided radiotherapy research

Hugo Furtado; Christelle Gendrin; Jakob Spoerk; Elisabeth Steiner; Tracy S. A. Underwood; Thomas Kuenzler; Dietmar Georg; Wolfgang Birkfellner

Radiotherapy treatments have changed at a tremendously rapid pace. The dose delivered to the tumor has escalated while organs at risk (OARs) are better spared. The impact of tumor motion during dose delivery has increased due to very steep dose gradients. Intra-fractional tumor motion has to be managed adequately to reduce errors in dose delivery. For tumors with large motion, such as tumors in the lung, tracking is an approach that can reduce position uncertainty. Tumor tracking approaches range from purely image-intensity-based techniques to motion estimation based on surrogate tracking. Research efforts are often based on custom-designed software platforms which take too much time and effort to develop. To address this challenge we have developed an open software platform focused on tumor motion management. FLIRT is a freely available open-source software platform. The core method for tumor tracking is purely intensity-based 2D/3D registration. The platform is written in C++ using the Qt framework for the user interface. The performance-critical methods are implemented on the graphics processor using the CUDA extension. One registration can be as fast as 90 ms (11 Hz). This is suitable to track tumors moving due to respiration (~0.3 Hz) or heartbeat (~1 Hz). Apart from focusing on high performance, the platform is designed to be flexible and easy to use. Current use cases include tracking feasibility studies, patient positioning, and method validation. Such a framework has the potential of enabling the research community to rapidly perform patient studies or try new methods.
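The real-time claim above is easy to sanity-check: at roughly 90 ms per registration (about 11 Hz), a ~0.3 Hz breathing cycle is sampled more than 30 times per period. The loop below is a hypothetical sketch of how such a tracker can watch its own latency; register_frame and the assumed frame period are placeholders, not the platform's actual API:

```python
# Hypothetical real-time tracking loop that checks each registration against the
# imaging frame period. register_frame() stands in for one 2D/3D registration
# call; it is not the platform's actual interface.
import time

def track(frames, register_frame, frame_period_s=0.09):  # ~11 Hz, an assumed rate
    """Register each incoming frame; warn whenever a registration overruns the frame period."""
    poses = []
    for frame in frames:
        start = time.perf_counter()
        poses.append(register_frame(frame))
        latency = time.perf_counter() - start
        if latency > frame_period_s:
            print(f"registration too slow: {latency * 1000:.0f} ms exceeds the frame period")
    return poses
```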


Physica Medica | 2018

Clinical implementations of 4D pencil beam scanned particle therapy: Report on the 4D treatment planning workshop 2016 and 2017

Petra Trnková; B. Knäusl; Oxana Actis; Christoph Bert; A. Biegun; Till T. Boehlen; Hugo Furtado; Jamie R. McClelland; Shinichiro Mori; Ilaria Rinaldi; Antoni Rucinski; Antje C. Knopf

In 2016 and 2017, the 8th and 9th 4D treatment planning workshops took place in Groningen (the Netherlands) and Vienna (Austria), respectively. This annual workshop brings together international experts to discuss research, advances in clinical implementation, as well as problems and challenges in 4D treatment planning, mainly in spot-scanned proton therapy. In the last two years several aspects such as treatment planning, beam delivery, Monte Carlo simulations, motion modeling and monitoring, QA phantoms, and 4D imaging were thoroughly discussed. This report provides an overview of the topics discussed, recent findings, and a literature review from the last two years. Its main focus is to highlight the translation of 4D research into clinical practice and to discuss remaining challenges and pitfalls that still need to be addressed and overcome.

Collaboration


Hugo Furtado's top co-authors and their affiliations.

Top Co-Authors

Dietmar Georg, Medical University of Vienna
Wolfgang Birkfellner, Medical University of Vienna
M. Stock, Medical University of Vienna
Elisabeth Steiner, Medical University of Vienna
Michael Figl, Medical University of Vienna
Christelle Gendrin, Medical University of Vienna
Christoph Bloch, Medical University of Vienna
Helmar Bergmann, Medical University of Vienna
Christoph Weber, Medical University of Vienna
Jakob Spoerk, Medical University of Vienna