
Publication


Featured research published by Cristian A. Linte.


Computer Aided Surgery | 2008

Virtual reality-enhanced ultrasound guidance: a novel technique for intracardiac interventions.

Cristian A. Linte; John Moore; Andrew D. Wiles; Chris Wedlake; Terry M. Peters

Cardiopulmonary bypass surgery, although a highly invasive interventional approach leading to numerous complications, is still the most common therapy option for treating many forms of cardiac disease. We are currently engaged in a project designed to replace many bypass surgeries with less traumatic, minimally invasive intracardiac therapies. This project combines real-time intra-operative echocardiography with a virtual reality environment providing the surgeon with a broad range of valuable information. Pre-operative images, electrophysiological data, positions of magnetically tracked surgical instruments, and dynamic surgical target representations are among the data that can be presented to the surgeon to augment intra-operative ultrasound images. This augmented reality system is applicable to procedures such as mitral valve replacement and atrial septal defect repair, as well as ablation therapies for treatment of atrial fibrillation. Our goal is to develop a robust augmented reality system that will improve the efficacy of intracardiac treatments and broaden the range of cardiac surgeries that can be performed in a minimally invasive manner. This paper provides an overview of our interventional system and specific experiments that assess its pre-clinical performance.


Medical Imaging 2007: Visualization and Image-Guided Procedures | 2007

An augmented reality environment for image-guidance of off-pump mitral valve implantation

Cristian A. Linte; Andrew D. Wiles; Nicholas A. Hill; John Moore; Chris Wedlake; Gerard M. Guiraudon; Douglas L. Jones; Daniel Bainbridge; Terry M. Peters

Clinical research has been rapidly evolving towards the development of less invasive surgical procedures. We recently embarked on a project to improve intracardiac beating heart interventions. Our novel approach employs new surgical technologies and support from image guidance via pre-operative and intra-operative imaging (i.e. two-dimensional echocardiography) to substitute for direct vision. Our goal was to develop a versatile system that allowed for safe cardiac port access and provided sufficient image guidance, with the aid of a virtual reality environment, to compensate for the absence of direct vision, while delivering quality therapy to the target. Specific targets included the repair and replacement of heart valves and the repair of septal defects. The ultimate objective was to duplicate the success rate of conventional open-heart surgery, but to do so via a small incision, and to evaluate the efficacy of the procedure as it is performed. This paper describes the software and hardware components, along with the methodology for performing mitral valve replacement as one example of this approach, using ultrasound and virtual tool models to position and fasten the valve in place.


Computerized Medical Imaging and Graphics | 2013

On mixed reality environments for minimally invasive therapy guidance: systems architecture, successes and challenges in their implementation from laboratory to clinic.

Cristian A. Linte; Katherine P. Davenport; Kevin Cleary; Craig A. Peters; Kirby G. Vosburgh; Nassir Navab; Philip “Eddie” Edwards; Pierre Jannin; Terry M. Peters; David R. Holmes; Richard A. Robb

Mixed reality environments for medical applications have been explored and developed over the past three decades in an effort to enhance the clinician's view of anatomy and facilitate the performance of minimally invasive procedures. These environments must faithfully represent the real surgical field and require seamless integration of pre- and intra-operative imaging, surgical instrument tracking, and display technology into a common framework centered around and registered to the patient. However, in spite of their reported benefits, few mixed reality environments have been successfully translated into clinical use. Several challenges that contribute to the difficulty in integrating such environments into clinical practice are presented here and discussed in terms of both technical and clinical limitations. This article should raise awareness among both developers and end-users toward facilitating a greater application of such environments in the surgical practice of the future.


Medical Image Computing and Computer-Assisted Intervention | 2007

Towards subject-specific models of the dynamic heart for image-guided mitral valve surgery

Cristian A. Linte; Marcin Wierzbicki; John Moore; Stephen H. Little; Gerard M. Guiraudon; Terry M. Peters

Surgeons need a robust interventional system capable of providing reliable, real-time information regarding the position and orientation of the surgical targets and tools to compensate for the lack of direct vision and to enhance manipulation of intracardiac targets during minimally-invasive, off-pump cardiac interventions. In this paper, we describe a novel method for creating dynamic, pre-operative, subject-specific cardiac models containing the surgical targets and surrounding anatomy, and how they are used to augment the intra-operative virtual environment for guidance of valvular interventions. The accuracy of these pre-operative models was established by computing the target registration error between the mitral valve annulus characterized in the pre-operative images and its equivalent structure manually extracted from 3D US data. On average, the mitral valve annulus was extracted with a 3.1 mm error across all cardiac phases. In addition, we also propose a method for registering the pre-operative models into the intra-operative virtual environment.
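A target registration error of the kind reported above reduces to the residual distance between corresponding points after registration. A minimal sketch, assuming paired annulus landmarks already expressed in a common coordinate frame (the point values below are illustrative, not the paper's data):

```python
import numpy as np

def target_registration_error(pts_preop, pts_us):
    """Mean Euclidean distance between corresponding landmark pairs,
    e.g. mitral valve annulus points from pre-operative images vs. the
    same points extracted from 3D ultrasound (both in mm)."""
    pts_preop = np.asarray(pts_preop, dtype=float)
    pts_us = np.asarray(pts_us, dtype=float)
    return float(np.mean(np.linalg.norm(pts_preop - pts_us, axis=1)))

# Hypothetical annulus points (mm), already in a common frame.
preop = np.array([[10.0, 0.0, 0.0], [0.0, 10.0, 0.0], [-10.0, 0.0, 0.0]])
us = preop + np.array([[3.0, 0.0, 0.0]] * 3)  # uniform 3 mm offset
print(target_registration_error(preop, us))  # 3.0
```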


Computer Assisted Radiology and Surgery | 2012

Accuracy considerations in image-guided cardiac interventions: experience and lessons learned

Cristian A. Linte; Pencilla Lang; Maryam E. Rettmann; Daniel S. Cho; David R. Holmes; Richard A. Robb; Terry M. Peters

Motivation: Medical imaging and its application in interventional guidance has revolutionized the development of minimally invasive surgical procedures, leading to reduced patient trauma, fewer risks, and shorter recovery times. However, a frequently posed question with regard to an image guidance system is “how accurate is it?” On one hand, the accuracy challenge can be posed in terms of the tolerable clinical error associated with the procedure; on the other hand, accuracy is bound by the limitations of the system’s components, including modeling, patient registration, and surgical instrument tracking, all of which ultimately impact the overall targeting capabilities of the system.

Methods: While these processes are not unique to any interventional specialty, this paper discusses them in the context of two different cardiac image guidance platforms: a model-enhanced ultrasound platform for intracardiac interventions and a prototype system for advanced visualization in image-guided cardiac ablation therapy.

Results: Pre-operative modeling techniques involving manual, semi-automatic and registration-based segmentation are discussed. The performance and limitations of clinically feasible approaches for patient registration, evaluated both in the laboratory and in the operating room, are presented. Our experience with two different magnetic tracking systems for instrument and ultrasound transducer localization is reported. Ultimately, the overall accuracy of the systems is discussed based on both in vitro and preliminary in vivo experience.

Conclusion: While clinical accuracy is specific to a particular patient and procedure and vastly dependent on the surgeon’s experience, the system’s engineering limitations are critical to determine whether the clinical requirements can be met.


IEEE Reviews in Biomedical Engineering | 2010

Virtual and Augmented Medical Imaging Environments: Enabling Technology for Minimally Invasive Cardiac Interventional Guidance

Cristian A. Linte; James A. White; Roy Eagleson; Gerard M. Guiraudon; Terry M. Peters

Virtual and augmented reality environments have been adopted in medicine as a means to enhance the clinician's view of the anatomy and facilitate the performance of minimally invasive procedures. Their value is truly appreciated during interventions where the surgeon cannot directly visualize the targets to be treated, such as during cardiac procedures performed on the beating heart. These environments must accurately represent the real surgical field and require seamless integration of pre- and intra-operative imaging, surgical tracking, and visualization technology in a common framework centered around the patient. This review begins with an overview of minimally invasive cardiac interventions, describes the architecture of a typical surgical guidance platform including imaging, tracking, registration and visualization, highlights both clinical and engineering accuracy limitations in cardiac image guidance, and discusses the translation of the work from the laboratory into the operating room together with typically encountered challenges.


International Conference on Medical Imaging and Augmented Reality | 2008

Towards a Medical Virtual Reality Environment for Minimally Invasive Cardiac Surgery

Terry M. Peters; Cristian A. Linte; John Moore; Daniel Bainbridge; Douglas L. Jones; Gerard M. Guiraudon

We have developed a visualization environment to assist surgeons with therapy delivery inside the beating heart, in absence of direct vision. This system employs virtual reality techniques to integrate pre-operative anatomical models, real-time intra-operative imaging, and models of magnetically-tracked surgical tools. Visualization is enhanced via 3D dynamic cardiac models constructed from high-resolution pre-operative MR or CT data and registered within the intra-operative imaging environment. In this paper, we report our experience with a feature-based registration technique to fuse the pre- and intra-operative data during an in vivo intracardiac procedure on a porcine subject. Good alignment of the pre- and intra-operative anatomy within the virtual reality environment is ensured through the registration of easily identifiable landmarks. We present our initial experience in translating this work into the operating room and employing this system to guide typical intracardiac interventions. Given its extensive capabilities in providing surgical guidance in the absence of direct vision, our virtual environment is an ideal candidate for performing off-pump intracardiac interventions.
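Feature-based registration from corresponding landmarks, as described above, is commonly solved as a rigid least-squares fit; one standard solution is the SVD-based (Kabsch) method. A sketch under that assumption, with made-up landmark coordinates rather than any data from the paper:

```python
import numpy as np

def rigid_landmark_registration(src, dst):
    """Least-squares rigid transform (R, t) mapping src landmarks onto
    dst, via the SVD-based (Kabsch) solution: dst ~ R @ src + t."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)       # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical landmarks: dst is src rotated 90 degrees about z, shifted.
src = np.array([[1., 0., 0.], [0., 1., 0.], [0., 0., 1.], [1., 1., 1.]])
Rz = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
dst = src @ Rz.T + np.array([5., 0., 0.])
R, t = rigid_landmark_registration(src, dst)
print(np.allclose(R, Rz), np.allclose(t, [5., 0., 0.]))  # True True
```

With noisy landmarks the same closed-form fit returns the least-squares optimal rotation and translation, which is why landmark count and spread drive registration accuracy.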


International Conference of the IEEE Engineering in Medicine and Biology Society | 2007

On Enhancing Planning and Navigation of Beating-Heart Mitral Valve Surgery Using Pre-operative Cardiac Models

Cristian A. Linte; Marcin Wierzbicki; John Moore; Gerard M. Guiraudon; Douglas L. Jones; Terry Peters

In an effort to reduce morbidity during minimally invasive cardiac procedures, we have recently developed an interventional technique targeted towards off-pump cardiac interventions. To compensate for the absence of direct visualization, our system employs a virtual reality environment for image guidance that integrates pre-operative information with real-time intra-operative imaging and surgical tool tracking. This work focuses on enhancing intracardiac visualization and navigation by overlaying pre-operative cardiac models onto the intra-operative virtual space, to display surgical targets within their specific anatomical context. Our method for integrating pre-operative data into the intra-operative environment is accurate within ~5.0 mm. Thus, we feel that our virtually-augmented surgical space is accurate enough to improve spatial orientation and intracardiac navigation.


IEEE Transactions on Biomedical Engineering | 2010

Evaluation of Model-Enhanced Ultrasound-Assisted Interventional Guidance in a Cardiac Phantom

Cristian A. Linte; John Moore; Chris Wedlake; Terry M. Peters

Minimizing invasiveness associated with cardiac procedures has led to limited visual access to the target tissues. To address these limitations, we have developed a visualization environment that integrates interventional ultrasound (US) imaging with preoperative anatomical models and virtual representations of the surgical instruments tracked in real time. In this paper, we present a comprehensive evaluation of our model-enhanced US-guidance environment by simulating clinically relevant interventions in vitro. We have demonstrated that model-enhanced US guidance provides a clinically desired targeting accuracy better than 3 mm RMS and maintains this level of accuracy even in the case of image-to-patient misalignments that are often encountered in the clinic. These studies emphasize the benefits of integrating real-time imaging with preoperative data to enhance surgical navigation in the absence of direct vision during minimally invasive cardiac interventions.


Medical Imaging 2008: Visualization, Image-Guided Procedures, and Modeling | 2008

Object identification accuracy under ultrasound enhanced virtual reality for minimally invasive cardiac surgery

Andrew D. Wiles; John Moore; Cristian A. Linte; Christopher Wedlake; Anis Ahmad; Terry M. Peters

A 2D ultrasound enhanced virtual reality surgical guidance system has been under development for some time in our lab. The new surgical guidance platform has been shown to be effective in both laboratory and clinical settings; however, the accuracy of the tracked 2D ultrasound has not been investigated in detail in terms of the applications for which we intend to use it (i.e., mitral valve replacement and atrial septal defect closure). This work focuses on the development of an accuracy assessment protocol specific to the calibration methods used to determine the rigid transformation between the ultrasound image and the tracked sensor. Specifically, we test a Z-bar phantom calibration method and a phantomless calibration method, and compare the accuracy of tracking ultrasound images from neuro, transesophageal, intracardiac and laparoscopic ultrasound transducers. This work provides a fundamental quantitative description of the image-guided accuracy that can be obtained with this new surgical guidance system.
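The calibration being assessed estimates the rigid transform from the ultrasound image to the tracked sensor; mapping an image pixel into the tracker's world frame then amounts to chaining two homogeneous transforms with a pixel-to-millimetre scaling. A minimal sketch with illustrative transforms and pixel spacing (none of these values come from the paper):

```python
import numpy as np

def to_homogeneous(R, t):
    """Build a 4x4 homogeneous transform from 3x3 rotation + translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# T_sensor_image: image-to-sensor transform estimated by calibration
# (e.g. a Z-bar phantom or phantomless method). Values are illustrative.
T_sensor_image = to_homogeneous(np.eye(3), [10.0, 0.0, 0.0])
# T_world_sensor: sensor pose reported by the magnetic tracker.
T_world_sensor = to_homogeneous(np.eye(3), [0.0, 20.0, 0.0])

# An ultrasound pixel, scaled from pixels to mm, on the image plane (z = 0).
scale_mm_per_px = 0.25                   # assumed pixel spacing
px = np.array([40.0, 80.0])
p_image = np.array([*(px * scale_mm_per_px), 0.0, 1.0])

# Chain: world <- sensor <- image.
p_world = T_world_sensor @ T_sensor_image @ p_image
print(p_world[:3])   # x, y, z of the pixel in tracker coordinates (mm)
```

Errors in the estimated T_sensor_image propagate directly into every mapped pixel, which is why the accuracy of the calibration step is assessed separately from tracker accuracy.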

Collaboration

Top Co-Authors

Terry M. Peters, University of Western Ontario
John Moore, Robarts Research Institute
Chris Wedlake, Robarts Research Institute
Shusil Dangi, Rochester Institute of Technology
Andrew D. Wiles, University of Western Ontario
Daniel Bainbridge, University of Western Ontario