
Publication


Featured research published by Chris Wedlake.


Computer Aided Surgery | 2008

Virtual reality-enhanced ultrasound guidance: a novel technique for intracardiac interventions.

Cristian A. Linte; John Moore; Andrew D. Wiles; Chris Wedlake; Terry M. Peters

Cardiopulmonary bypass surgery, although a highly invasive interventional approach leading to numerous complications, is still the most common therapy option for treating many forms of cardiac disease. We are currently engaged in a project designed to replace many bypass surgeries with less traumatic, minimally invasive intracardiac therapies. This project combines real-time intra-operative echocardiography with a virtual reality environment providing the surgeon with a broad range of valuable information. Pre-operative images, electrophysiological data, positions of magnetically tracked surgical instruments, and dynamic surgical target representations are among the data that can be presented to the surgeon to augment intra-operative ultrasound images. This augmented reality system is applicable to procedures such as mitral valve replacement and atrial septal defect repair, as well as ablation therapies for treatment of atrial fibrillation. Our goal is to develop a robust augmented reality system that will improve the efficacy of intracardiac treatments and broaden the range of cardiac surgeries that can be performed in a minimally invasive manner. This paper provides an overview of our interventional system and specific experiments that assess its pre-clinical performance.
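
As an illustration of the kind of data fusion described above, the sketch below composes a tracker-to-ultrasound calibration transform with a tracked tool pose to express the tool tip in ultrasound image coordinates. The transforms and offsets are hypothetical placeholders, not values or code from the paper.

```python
# Minimal sketch (assumed, not the authors' implementation): overlay a
# magnetically tracked tool tip on an intra-operative ultrasound image by
# chaining 4x4 homogeneous transforms. All numeric values are placeholders.
import numpy as np

def to_homogeneous(p):
    """Append 1 to a 3-vector so it can be multiplied by a 4x4 transform."""
    return np.append(np.asarray(p, dtype=float), 1.0)

# Pose of the tracked tool in the magnetic tracker frame (identity rotation,
# translation in mm), as reported by the tracking system.
T_tracker_tool = np.eye(4)
T_tracker_tool[:3, 3] = [10.0, -5.0, 120.0]

# Calibration/registration transform mapping tracker coordinates into the
# ultrasound frame, typically obtained from a phantom-based calibration.
T_us_tracker = np.eye(4)
T_us_tracker[:3, 3] = [-2.0, 1.5, 0.0]

tip_tracker = to_homogeneous([0.0, 0.0, 0.0])          # tip at the tool origin
tip_us = T_us_tracker @ T_tracker_tool @ tip_tracker   # tip in US coordinates
print("Tool tip in ultrasound frame (mm):", tip_us[:3])
```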


Medical Image Computing and Computer-Assisted Intervention | 2010

Fused video and ultrasound images for minimally invasive partial nephrectomy: a phantom study

Carling L. Cheung; Chris Wedlake; John Moore; Stephen E. Pautler; Terry M. Peters

The shift to minimally invasive abdominal surgery has increased reliance on image guidance during surgical procedures. However, these images are most often presented independently, increasing the cognitive workload for the surgeon and potentially increasing procedure time. When warm ischemia of an organ is involved, time is an important factor to consider. To address these limitations, we present a more intuitive visualization that combines images in a common augmented reality environment. In this paper, we assess surgeon performance under the guidance of the conventional visualization system and our fusion system using a phantom study that mimics the tumour resection step of partial nephrectomy. The RMS error between the fused images was 2.43 mm, which is sufficient for our purposes. A faster planning time for the resection was achieved using our fusion visualization system. This result is a positive step towards decreasing risks associated with long procedure times in minimally invasive abdominal interventions.
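
The RMS figure quoted above can be computed from corresponding fiducial points identified in the two fused modalities. The sketch below shows one way to do so; the coordinates are invented for illustration and are not the study's data.

```python
# Minimal sketch, assuming corresponding fiducial points have been identified
# in the endoscopic video and ultrasound images (coordinates are made up).
import numpy as np

video_pts = np.array([[10.0, 12.0, 3.0],
                      [25.0,  8.0, 4.0],
                      [40.0, 20.0, 2.0]])
us_pts    = np.array([[11.2, 13.1, 2.4],
                      [26.8,  6.9, 5.1],
                      [38.5, 21.7, 1.3]])

# Root-mean-square of the point-to-point distances between the two modalities.
rms = np.sqrt(np.mean(np.sum((video_pts - us_pts) ** 2, axis=1)))
print(f"RMS fusion error: {rms:.2f} mm")
```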


Medical Image Computing and Computer-Assisted Intervention | 2009

Image Guidance for Spinal Facet Injections Using Tracked Ultrasound

John Moore; Colin Clarke; Daniel Bainbridge; Chris Wedlake; Andrew D. Wiles; Danielle F. Pace; Terry M. Peters

Anesthetic nerve blocks are a common therapy performed in hospitals around the world to alleviate acute and chronic pain. Tracking systems have shown considerable promise in other forms of therapy, but little has been done to apply this technology in the field of anesthesia. We are developing a guidance system for combining tracked needles with non-invasive ultrasound (US) and patient-specific geometric models. In experiments with phantoms, two augmented reality (AR) guidance systems were compared to the exclusive use of US for lumbar facet injection therapy. Anesthetists and anesthesia residents were able to place needles within 0.57 mm of the intended targets using our AR systems, compared to 5.77 mm using US alone. A preliminary cadaver study demonstrated that the system was able to accurately place radio-opaque dye on targets. The combination of real-time US with tracked tools and AR guidance has the potential to replace CT and fluoroscopic guidance, thus reducing radiation dose to patients and clinicians, as well as reducing health care costs.


Medical Imaging 2007: Visualization and Image-Guided Procedures | 2007

An augmented reality environment for image-guidance of off-pump mitral valve implantation

Cristian A. Linte; Andrew D. Wiles; Nicholas A. Hill; John Moore; Chris Wedlake; Gerard M. Guiraudon; Douglas L. Jones; Daniel Bainbridge; Terry M. Peters

Clinical research has been rapidly evolving towards the development of less invasive surgical procedures. We recently embarked on a project to improve intracardiac beating-heart interventions. Our novel approach employs new surgical technologies and support from image guidance via pre-operative and intra-operative imaging (i.e., two-dimensional echocardiography) to substitute for direct vision. Our goal was to develop a versatile system that allows safe cardiac port access and provides sufficient image guidance, with the aid of a virtual reality environment, to compensate for the absence of direct vision while delivering quality therapy to the target. Specific targets included the repair and replacement of heart valves and the repair of septal defects. The ultimate objective was to duplicate the success rate of conventional open-heart surgery, but to do so via a small incision, and to evaluate the efficacy of the procedure as it is performed. This paper describes the software and hardware components, along with the methodology for performing mitral valve replacement as one example of this approach, using ultrasound and virtual tool models to position and fasten the valve in place.


Medical Image Computing and Computer-Assisted Intervention | 2003

Laser Projection Augmented Reality System for Computer Assisted Surgery

Neil Glossop; Chris Wedlake; John Moore; Terry M. Peters; Zhanhe Wang

A new augmented reality apparatus was evaluated. The device uses scanned infrared and visible lasers to project computer-generated information, such as surgical plans and entry points for probes, directly onto the patient. In addition to projecting the plan, the device can be integrated with a 3D camera and is capable of measuring the location of projected infrared laser spots. This can be used to ensure that the display is accurate, to apply corrections to the projection path, and to assist in registration. The projection system has its own Application Programmer's Interface (API) and is a stand-alone add-on unit to any host computer system. Tests were conducted to evaluate the accuracy and repeatability of the system. We compared the locations of points projected on a flat surface with the measurements obtained from a tracked probe. The surface was rotated through 60 degrees in 5-degree increments, and the locations measured by the two devices agreed to within 2 mm. An initial host application was also developed to demonstrate the new unit. Fiducials representing vertices along a proposed craniotomy were embedded in a plastic skull, and a projection path defining the craniotomy was calculated. A feedback-based optimization of the plan was performed by comparing the camera's measurements of these coordinates with the plan. The optimized plan was projected onto the skull. On average, the projection deviated by approximately 1 mm from the plan. Applications include identification of critical anatomical structures, visualization of preplanned paths and targets, and telesurgery or teleconsultation.
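
The feedback-based optimization described above boils down to comparing the camera-measured spot positions against the planned positions and correcting the projection path. The sketch below illustrates the idea with a simple rigid-offset correction; the values and the correction rule are assumptions for illustration, not the device's actual algorithm.

```python
# Minimal sketch of a feedback correction loop for projected points
# (illustrative only; the device's actual correction method is not shown here).
import numpy as np

planned  = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 10.0], [0.0, 10.0]])  # mm
measured = planned + np.array([0.8, -0.6])     # camera-measured spots, offset

residual = measured - planned                  # per-point deviation from plan
correction = -residual.mean(axis=0)            # simple rigid shift of the path
corrected = measured + correction

mean_dev = np.linalg.norm(corrected - planned, axis=1).mean()
print(f"Mean deviation after correction: {mean_dev:.2f} mm")
```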


IEEE Transactions on Biomedical Engineering | 2010

Evaluation of Model-Enhanced Ultrasound-Assisted Interventional Guidance in a Cardiac Phantom

Cristian A. Linte; John Moore; Chris Wedlake; Terry M. Peters

Minimizing invasiveness associated with cardiac procedures has led to limited visual access to the target tissues. To address these limitations, we have developed a visualization environment that integrates interventional ultrasound (US) imaging with preoperative anatomical models and virtual representations of the surgical instruments tracked in real time. In this paper, we present a comprehensive evaluation of our model-enhanced US-guidance environment by simulating clinically relevant interventions in vitro. We have demonstrated that model-enhanced US guidance provides a clinically desired targeting accuracy better than 3 mm RMS and maintains this level of accuracy even in the case of image-to-patient misalignments that are often encountered in the clinic. These studies emphasize the benefits of integrating real-time imaging with preoperative data to enhance surgical navigation in the absence of direct vision during minimally invasive cardiac interventions.
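
One way to picture the robustness claim is to apply a small rigid image-to-patient misalignment to a target defined on the preoperative model and measure how far it moves. The sketch below does this with assumed values; it is not the paper's evaluation procedure.

```python
# Minimal sketch (assumed): effect of a small rigid image-to-patient
# misregistration on a target point defined in the preoperative model.
import numpy as np

def rigid_transform(rot_deg_z, translation):
    """4x4 transform: rotation about z (degrees) plus a translation (mm)."""
    a = np.deg2rad(rot_deg_z)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    T[:3, 3] = translation
    return T

target = np.array([20.0, 15.0, 40.0, 1.0])              # model target, mm
misalignment = rigid_transform(2.0, [1.0, -1.5, 0.5])   # small registration error

displaced = misalignment @ target
error = np.linalg.norm(displaced[:3] - target[:3])
print(f"Target displacement under misalignment: {error:.2f} mm")
```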


International Conference on Information Processing in Computer-Assisted Interventions | 2013

A Navigation Platform for Guidance of Beating Heart Transapical Mitral Valve Repair

John Moore; Michael W.A. Chu; Bob Kiaii; Daniel Bainbridge; Gerard M. Guiraudon; Chris Wedlake; Maria E. Currie; Martin Rajchl; Rajni V. Patel; Terry M. Peters

Traditional surgical approaches for repairing diseased mitral valves (MVs) have relied on placing the patient on cardiopulmonary bypass (on pump), stopping the heart and accessing the arrested heart directly. However, because this approach has the potential for adverse neurological, vascular, and immunological sequelae, less invasive beating-heart alternatives are desirable. Emerging beating-heart techniques have been developed to offer high-risk patients MV repair using ultrasound guidance alone, without stopping the heart. This paper describes the first porcine trials of the NeoChord DS1000 (Minnetonka, MN), employed to attach neochordae to an MV leaflet using the traditional ultrasound-guided protocol augmented by dynamic virtual geometric models. The distance errors of the tracked tool tip from the intended midline trajectory (5.2 ± 2.4 mm versus 16.8 ± 10.9 mm, p = 0.003), navigation times (16.7 ± 8.0 s versus 92.0 ± 84.5 s, p = 0.004), and total path lengths (225.2 ± 120.3 mm versus 1128.9 ± 931.1 mm, p = 0.003) were all significantly smaller with augmented ultrasound than with navigation under ultrasound alone, indicating a substantial improvement in the safety and simplicity of the procedure.
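
Two of the reported metrics, the total path length of the tracked tool tip and the between-condition comparison, can be reproduced in principle as shown below. The trajectory, trial values, and the use of Welch's t-test are assumptions for illustration, not the study's data or analysis script.

```python
# Minimal sketch of two of the reported metric types: total path length of the
# tracked tool tip and a two-sample comparison between guidance conditions.
import numpy as np
from scipy import stats

def path_length(positions):
    """Sum of Euclidean distances between consecutive tracked tip positions."""
    diffs = np.diff(np.asarray(positions, dtype=float), axis=0)
    return np.linalg.norm(diffs, axis=1).sum()

trajectory = [[0, 0, 0], [5, 1, 0], [9, 3, 1], [12, 4, 2]]   # mm samples
print(f"Path length: {path_length(trajectory):.1f} mm")

# Unpaired comparison of per-trial navigation times (s) between conditions.
augmented = [12.0, 18.5, 20.1, 15.3, 17.6]
us_only   = [85.2, 60.4, 150.7, 95.0, 70.3]
t, p = stats.ttest_ind(augmented, us_only, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")
```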


AE-CAI | 2013

The Role of Augmented Reality in Training the Planning of Brain Tumor Resection

Kamyar Abhari; John S. H. Baxter; Elvis C. S. Chen; Ali R. Khan; Chris Wedlake; Terry M. Peters; Roy Eagleson; Sandrine de Ribaupierre

The environment in which surgeons are trained profoundly affects their preferred method for visualizing patient images. While classical 2D viewing might be preferred by some older experts, the new generation of residents and novices has been raised navigating in 3D through video games and is accustomed to seeing 3D reconstructions of the human anatomy. In this study, we evaluate the performance of different groups of users in four different visualization modalities (2D planes, orthogonal planes, 3D reconstruction and augmented reality). We hypothesize that this system will facilitate the spatio-visual abilities of individuals in terms of assessing patient-specific data, an essential requirement of many neurosurgical applications such as tumour resection. We also hypothesize that the difference between AR and the other modalities will be greater in the novice group. Our preliminary results indicate that AR is better than or as good as the other modalities in terms of performance.


Medical Image Computing and Computer-Assisted Intervention | 2008

Dynamic Cardiac Mapping on Patient-Specific Cardiac Models

Kevin J. Wilson; Gerard M. Guiraudon; Douglas L. Jones; Cristian A. Linte; Chris Wedlake; John Moore; Terry M. Peters

Minimally invasive techniques for electrophysiological cardiac data mapping and catheter ablation therapy have been driven by advancements in computer-aided technologies, including magnetic tracking systems and virtual and augmented-reality environments. The objective of this work is to extend current cardiac mapping techniques to collect and display data in the temporal domain, while mapping on patient-specific cardiac models. This paper details novel approaches to collecting spatially tracked cardiac electrograms, registering the data with a patient-specific cardiac model, and interpreting the data directly on the model surface, with the goal of providing a more comprehensive cardiac mapping system than current systems. To validate the system, laboratory studies were conducted to assess the accuracy of navigating to both physical and virtual landmarks. Subsequent to the laboratory studies, an in-vivo porcine experiment was conducted to assess the system's overall ability to collect spatially tracked electrophysiological data and map it directly onto a cardiac model. The results from these experiments show that the new dynamic cardiac mapping system maintained high accuracy in locating physical and virtual landmarks while creating a dynamic cardiac map displayed on a dynamic cardiac surface model.
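
A minimal sketch of the display step described above, assuming the tracked electrogram samples have already been registered to the patient-specific model: each sample is assigned to the nearest model vertex so its value can be shown on the surface. The mesh, sample positions, and voltages below are synthetic.

```python
# Minimal sketch (synthetic data): paint spatially tracked electrogram samples
# onto the nearest vertices of a registered patient-specific surface model.
import numpy as np

rng = np.random.default_rng(0)
model_vertices   = rng.random((500, 3)) * 80.0   # mm, cardiac surface mesh
sample_positions = rng.random((20, 3)) * 80.0    # tracked catheter tip samples
sample_voltages  = rng.random(20)                # electrogram amplitudes

# For each sample, find the closest model vertex and store its value there.
d = np.linalg.norm(model_vertices[None, :, :] - sample_positions[:, None, :], axis=2)
nearest = d.argmin(axis=1)

surface_map = np.full(len(model_vertices), np.nan)
surface_map[nearest] = sample_voltages           # later samples overwrite ties
print(f"Vertices with mapped electrogram data: {np.isfinite(surface_map).sum()}")
```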


International Conference on Robotics and Automation | 2010

Preoperative planning of robotics-assisted minimally invasive coronary artery bypass grafting

Hamidreza Azimian; Jeremy Breetzke; Ana Luisa Trejos; Rajni V. Patel; Michael D. Naish; Terry M. Peters; John Moore; Chris Wedlake; Bob Kiaii

This paper outlines a framework for the preoperative planning of robotics-assisted minimally invasive cardiac surgery with application to coronary artery bypass grafting. The intent of the proposed framework is to improve surgical outcomes by considering the intraoperative requirements of the robotic manipulators and the anatomical geometry of the patient's chest. This includes target reachability, instrument dexterity for critical surgical tasks, and collision avoidance. Given the patient's preoperative chest computed tomography images, the planning framework aims to determine the optimal location of the access ports on the ribcage, along with the optimal pose of the robotic arms relative to the patient's anatomy. The proposed multi-objective optimality criteria consist of a measure of clearance as well as a new collective kinematic measure. The minimum distances among the robot arms provide a measure for the likelihood of collisions. The proposed kinematic measure is composed of two modified manipulability indices that are dimensionally homogeneous and, in contrast to previously used measures, are more likely to yield isotropic force and torque distributions when optimized for surgical interventions. The results of a case study illustrate the compatibility of the framework with general guidelines used by experienced surgeons for port selection. Furthermore, the framework surpasses those guidelines by ensuring the feasibility of the solutions in the sense of collision avoidance and surgical target reachability.
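
The two ingredients named above, a kinematic manipulability measure and an inter-arm clearance term, can be combined into a simple score for a candidate port/arm configuration. The sketch below uses the standard Yoshikawa manipulability measure and an arbitrary equal weighting as stand-ins for the paper's modified, dimensionally homogeneous indices; the Jacobian and distances are placeholders.

```python
# Minimal sketch (placeholder values): scoring one candidate port/arm
# configuration from a manipulability measure and the minimum inter-arm clearance.
import numpy as np

def manipulability(J):
    """Yoshikawa manipulability measure sqrt(det(J J^T)) for a robot Jacobian."""
    return np.sqrt(np.linalg.det(J @ J.T))

rng = np.random.default_rng(1)
J = rng.random((6, 7))                        # 6-DOF task, 7-joint arm (example)
arm_distances = np.array([42.0, 35.5, 51.2])  # mm, pairwise inter-arm distances

w = manipulability(J)
clearance = arm_distances.min()

# Equal-weight multi-objective score (weighting is arbitrary for illustration).
score = 0.5 * w + 0.5 * clearance
print(f"manipulability={w:.3f}, clearance={clearance:.1f} mm, score={score:.2f}")
```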

Collaboration


Dive into Chris Wedlake's collaboration.

Top Co-Authors

Terry M. Peters (University of Western Ontario)

John Moore (Robarts Research Institute)

Cristian A. Linte (Rochester Institute of Technology)

Daniel Bainbridge (University of Western Ontario)

Andrew D. Wiles (University of Western Ontario)

Douglas L. Jones (University of Western Ontario)

Bob Kiaii (Lawson Health Research Institute)

Elvis C. S. Chen (Robarts Research Institute)

Rajni V. Patel (University of Western Ontario)