Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Ozgur Guler is active.

Publications


Featured research published by Ozgur Guler.


Medical Physics | 2013

Quantitative error analysis for computer assisted navigation: a feasibility study

Ozgur Guler; M. Perwög; Florian Kral; F. Schwarm; Z. R. Bárdosi; G. Göbel; Wolfgang Freysinger

PURPOSE: The benefit of computer-assisted navigation depends on the registration process, in which patient features are correlated to preoperative imagery. The operator-induced uncertainty in localizing patient features, the user localization error (ULE), is unknown and most likely dominates the application accuracy. This initial feasibility study aims at providing first data for the ULE with a research navigation system.

METHODS: Active optical navigation was performed in CT images of a plastic skull, an anatomic specimen (both with implanted fiducials), and a volunteer with anatomical landmarks exclusively. Each object was registered ten times with 3, 5, 7, and 9 registration points. Measurements were taken at 10 targets (anatomic specimen and volunteer) and 11 targets (plastic skull). The active NDI Polaris system was used under ideal working conditions (tracking accuracy 0.23 mm root-mean-square, RMS; probe tip calibration 0.18 mm RMS). Tracking variances along the principal directions were measured as 0.18 mm², 0.32 mm², and 0.42 mm². The ULE was calculated both from predicted application accuracy with isotropic and anisotropic models and from the experimental variances.

RESULTS: The ULE determined from the variances was 0.45 mm (plastic skull), 0.60 mm (anatomic specimen), and 4.96 mm (volunteer). The predicted application accuracy did not yield consistent values for the ULE.

CONCLUSIONS: Quantitative application-accuracy data were tested against prediction models with isotropic and anisotropic noise models and revealed some discrepancies. This could potentially be because navigation and one prediction model wrongly assume isotropic noise (tracking is anisotropic), while the anisotropic noise prediction model assumes an anisotropic registration strategy (registration is isotropic in typical navigation systems). The ULE data are presumably the first quantitative values for the precision of localizing anatomical landmarks and implanted fiducials. Submillimetric localization is possible for implanted screws; anatomic landmarks are not suitable for high-precision clinical navigation.
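
For illustration, the sketch below shows one way the user localization error could be recovered from repeated localizations by variance decomposition: the tracker's known per-axis variance is subtracted from the observed per-axis variance and the remainder is attributed to the operator. This is an assumption about the computation, not the paper's code, and the synthetic data and variable names are illustrative.

```python
import numpy as np

# Per-axis tracking variances along the principal directions (mm^2), as
# reported in the abstract for the NDI Polaris under ideal conditions.
TRACKING_VAR = np.array([0.18, 0.32, 0.42])

def estimate_ule(localizations: np.ndarray) -> float:
    """Estimate the user localization error (ULE) from repeated localizations
    of one fiducial or landmark (shape: n_repeats x 3, in mm).

    Assumption: observed variance = tracking variance + user variance, so the
    user's contribution is the per-axis difference, reported as an RMS value.
    """
    observed_var = localizations.var(axis=0, ddof=1)             # per-axis sample variance
    user_var = np.clip(observed_var - TRACKING_VAR, 0.0, None)   # guard against negatives
    return float(np.sqrt(user_var.sum()))

# Synthetic example: ten repeated localizations of a single landmark.
rng = np.random.default_rng(0)
repeats = rng.normal(loc=[10.0, 20.0, 30.0], scale=0.8, size=(10, 3))
print(f"estimated ULE: {estimate_ule(repeats):.2f} mm")
```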


International Conference of the IEEE Engineering in Medicine and Biology Society | 2012

Image-guided navigation: A cost effective practical introduction using the image-guided surgery toolkit (IGSTK)

Ozgur Guler; Ziv Yaniv

Teaching the key technical aspects of image-guided interventions using a hands-on approach is a challenging task, primarily due to the high cost and lack of accessibility of imaging and tracking systems. We provide a software and data infrastructure which addresses both challenges. Our infrastructure allows students, patients, and clinicians to develop an understanding of the key technologies by using them, and possibly by developing additional components and integrating them into a simple navigation system which we provide. Our approach requires minimal hardware: LEGO blocks to construct a phantom, for which we provide CT scans, and a webcam which, when combined with our software, provides the functionality of a tracking system. A premise of this approach is that the tracking accuracy is sufficient for our purpose. We evaluate the accuracy provided by a consumer-grade webcam and show that it is sufficient for educational use. We provide an open source implementation of all the components required for basic image-guided navigation as part of the Image-Guided Surgery Toolkit (IGSTK). It has long been known that in education there is no substitute for hands-on experience; to quote Sophocles, “One must learn by doing the thing; for though you think you know it, you have no certainty, until you try.” Our work provides this missing capability in the context of image-guided navigation, enabling a wide audience to learn about and experience the use of a navigation system.
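
As a minimal sketch of the webcam-as-tracker idea (the actual IGSTK components are C++; this Python/OpenCV snippet only illustrates the principle), a single printed square marker with known dimensions, seen by a calibrated webcam, yields a 6-DOF pose via solvePnP. The camera intrinsics, marker size, and detected corner pixels below are hypothetical.

```python
import numpy as np
import cv2  # OpenCV; used here only to illustrate the webcam-as-tracker idea

# 3D corners of a 40 mm printed square marker in its own coordinate frame (mm).
MARKER_3D = np.array([[0, 0, 0], [40, 0, 0], [40, 40, 0], [0, 40, 0]], dtype=np.float64)

def marker_pose(corners_px: np.ndarray, camera_matrix: np.ndarray):
    """Recover the marker pose (3x3 rotation, translation in mm) from the four
    detected corner pixels of one webcam frame."""
    dist_coeffs = np.zeros(5)  # assume an already-undistorted, calibrated image
    ok, rvec, tvec = cv2.solvePnP(MARKER_3D, corners_px, camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("pose estimation failed")
    rotation, _ = cv2.Rodrigues(rvec)  # axis-angle -> rotation matrix
    return rotation, tvec

# Hypothetical webcam intrinsics and detected marker corners (pixels).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
corners = np.array([[300, 220], [380, 222], [378, 300], [298, 298]], dtype=np.float64)
R, t = marker_pose(corners, K)
print("marker position relative to the camera (mm):", t.ravel())
```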


Medical Image Computing and Computer-Assisted Intervention | 2014

Improved Screw Placement for Slipped Capital Femoral Epiphysis (SCFE) Using Robotically-Assisted Drill Guidance

Bamshad Azizi Koutenaei; Ozgur Guler; Emmanuel Wilson; Ramesh U. Thoranaghatte; Matthew E. Oetgen; Nassir Navab; Kevin Cleary

Slipped Capital Femoral Epiphysis (SCFE) is a common hip displacement condition in adolescents. In the standard treatment, the surgeon uses intra-operative fluoroscopic imaging to plan the screw placement and the drill trajectory. The accuracy, duration, and efficacy of this procedure are highly dependent on surgeon skill, and longer procedure times result in a higher radiation dose to both patient and surgeon. A robotic system to guide the drill trajectory could help to reduce screw placement errors and procedure time by reducing the number of passes and confirmatory fluoroscopic images needed to verify accurate positioning of the drill guide along a planned trajectory. Therefore, with the long-term goals of improving screw placement accuracy and reducing procedure time and intra-operative radiation dose, our group is developing an image-guided robotic surgical system to assist the surgeon with pre-operative path planning and intra-operative drill guide placement.
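
As a rough illustration of the planning step described above, the sketch below maps an entry point and target point planned in image coordinates into the robot frame through a 4x4 rigid registration transform and derives the drill axis. The transform, point values, and frame names are hypothetical and are not the group's actual interface.

```python
import numpy as np

def plan_to_robot(entry_img, target_img, T_robot_from_image):
    """Map a drill trajectory planned on intra-operative images (entry and
    target points in mm) into the robot frame via a 4x4 rigid registration."""
    def transform(T, p):
        return (T @ np.append(p, 1.0))[:3]          # homogeneous point transform
    entry_r = transform(T_robot_from_image, np.asarray(entry_img, dtype=float))
    target_r = transform(T_robot_from_image, np.asarray(target_img, dtype=float))
    direction = target_r - entry_r
    return entry_r, direction / np.linalg.norm(direction)

# Hypothetical registration: identity rotation with a translation offset.
T = np.eye(4)
T[:3, 3] = [100.0, 0.0, 50.0]
entry, axis = plan_to_robot([10.0, 20.0, 30.0], [15.0, 60.0, 35.0], T)
print("drill entry (robot frame, mm):", entry, "drill axis:", axis)
```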


Proceedings of SPIE | 2013

Interactive initialization for 2D/3D intra-operative registration using the Microsoft Kinect

Ren Hui Gong; Ozgur Guler; Ziv Yaniv

All 2D/3D anatomy-based rigid registration algorithms are iterative, requiring an initial estimate of the 3D data pose. Current initialization methods have limited applicability in the operating room setting, due to the constraints imposed by this environment or due to insufficient accuracy. In this work we use the Microsoft Kinect device to allow the surgeon to interactively initialize the registration process. A Kinect sensor is used to simulate the mouse-based operations of a conventional manual initialization approach, obviating the need for physical contact with an input device. Different gestures from both arms are detected by the sensor in order to set or switch the required working contexts. 3D hand motion provides the six degree-of-freedom controls for manipulating the pre-operative data in 3D space. We evaluated our method for both X-ray/CT and X-ray/MR initialization using three publicly available reference data sets. Results show that, with initial target registration errors of 117.7 ± 28.9 mm, a user is able to achieve final errors of 5.9 ± 2.6 mm within 158 ± 65 sec using the Kinect-based approach, compared to 4.8 ± 2.0 mm and 88 ± 60 sec when using the mouse for interaction. Based on these results we conclude that this method is sufficiently accurate for initialization of X-ray/CT and X-ray/MR registration in the OR.
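
A minimal sketch of the 6-DOF manipulation described above, under the assumption that the tracked hand displacement maps to a translation and the hand's orientation change maps to a rotation applied to the pre-operative data pose; Kinect skeleton tracking and gesture handling are omitted, and all values are illustrative.

```python
import numpy as np

def update_pose(T_current, hand_delta_mm, rot_axis, rot_angle_rad):
    """One interaction step: translate the pre-operative data by the tracked
    hand displacement and rotate it about an axis derived from the hand's
    orientation change (Rodrigues' rotation formula)."""
    axis = np.asarray(rot_axis, dtype=float)
    axis /= np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    R = np.eye(3) + np.sin(rot_angle_rad) * K + (1.0 - np.cos(rot_angle_rad)) * (K @ K)
    T_step = np.eye(4)
    T_step[:3, :3] = R
    T_step[:3, 3] = hand_delta_mm
    return T_step @ T_current   # apply the step in the world frame

# Start from the identity pose and apply one small hand movement.
pose = np.eye(4)
pose = update_pose(pose, hand_delta_mm=[2.0, -1.0, 0.5],
                   rot_axis=[0.0, 1.0, 0.0], rot_angle_rad=np.deg2rad(3.0))
print(np.round(pose, 3))
```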


Medical Image Computing and Computer-Assisted Intervention | 2015

Inertial Measurement Unit for Radiation-Free Navigated Screw Placement in Slipped Capital Femoral Epiphysis Surgery

Bamshad Azizi Koutenaei; Ozgur Guler; Emmanuel Wilson; Matthew E. Oetgen; Patrick Grimm; Nassir Navab; Kevin Cleary

Slipped Capital Femoral Epiphysis (SCFE) is a common pathologic hip condition in adolescents. In the standard treatment, a surgeon relies on multiple intra-operative fluoroscopic X-ray images to plan the screw placement and to guide a drill along the intended trajectory. More complex cases may require more images and thereby a higher radiation dose to both patient and surgeon. We introduce a novel technique using an Inertial Measurement Unit (IMU) for recovering and visualizing the orthopedic tool trajectory in two orthogonal X-ray images in real time. The proposed technique improves screw placement accuracy and reduces the number of required fluoroscopic X-ray images without changing the current workflow. We present results from a phantom study using 20 bones to perform drilling and screw placement tasks. While dramatically reducing the number of required fluoroscopic images from 20 to 4, the results also show improved accuracy compared to the manual SCFE approach.
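
The sketch below illustrates one way an IMU-measured tool orientation could be projected into two orthogonal fluoroscopic views for overlay, assuming the IMU frame has already been aligned with the patient frame and that the AP and lateral views correspond to dropping one axis each. These conventions are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def project_tool_axis(R_world_from_imu):
    """Project the drill axis measured by the IMU (assumed to be the IMU's
    z-axis) into an AP view (x-z plane) and a lateral view (y-z plane)."""
    axis_world = R_world_from_imu @ np.array([0.0, 0.0, 1.0])
    ap = axis_world[[0, 2]]        # anterior-posterior image: drop the y-axis
    lateral = axis_world[[1, 2]]   # lateral image: drop the x-axis
    return ap / np.linalg.norm(ap), lateral / np.linalg.norm(lateral)

# Hypothetical IMU orientation: a 15 degree tilt about the y-axis.
theta = np.deg2rad(15.0)
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0, 1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
ap_dir, lateral_dir = project_tool_axis(R)
print("AP overlay direction:", ap_dir, "lateral overlay direction:", lateral_dir)
```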


Medical Physics | 2013

Interactive initialization of 2D/3D rigid registration

Ren Hui Gong; Ozgur Guler; Mustafa Kurkluoglu; John F. Lovejoy; Ziv Yaniv


Archive | 2014

Method and system for wound assessment and management

Kyle Wu; Peng Cheng; Peter C.W. Kim; Ozgur Guler


Journal of the American College of Surgeons | 2014

Mobile Wound Assessment using Novel Computer Vision Methods

Kyle Wu; Ozgur Guler; Peng Cheng; Peter C.W. Kim


Archive | 2014

Method and system for assessment and treatment of a wound

Kyle Wu; Peter C.W. Kim; Patrick Cheng; Ozgur Guler


Collaboration


Dive into Ozgur Guler's collaboration.

Top Co-Authors

Kyle Wu
Children's National Medical Center

Matthew E. Oetgen
Children's National Medical Center

Peter C.W. Kim
Children's National Medical Center

Ziv Yaniv
Children's National Medical Center

Bamshad Azizi Koutenaei
Children's National Medical Center

Peng Cheng
Children's National Medical Center

Ren Hui Gong
Children's National Medical Center

John F. Lovejoy
Children's National Medical Center