Benjamin Frisch
Technische Universität München
Publications
Featured research published by Benjamin Frisch.
Computer Assisted Radiology and Surgery | 2015
Oliver Zettinig; Amit Shah; Christoph Hennersperger; Matthias Eiber; Christine Kroll; Hubert Kübler; Tobias Maurer; Fausto Milletari; Julia Rackerseder; Christian Schulte zu Berge; Enno Storz; Benjamin Frisch; Nassir Navab
Purpose: Transrectal ultrasound (TRUS)-guided random prostate biopsy is, in spite of its low sensitivity, the gold standard for the diagnosis of prostate cancer. The recent advent of PET imaging using a novel dedicated radiotracer, ⁶⁸Ga-labeled prostate-specific membrane antigen (PSMA), combined with MRI provides improved pre-interventional identification of suspicious areas. This work proposes a multimodal fusion image-guided biopsy framework that combines PET-MRI images with TRUS, using automatic segmentation and registration, and offering real-time guidance.
Methods: The prostate TRUS images are automatically segmented with a Hough transform-based random forest approach. The registration is based on the Coherent Point Drift algorithm to align surfaces elastically and to propagate the deformation field, calculated from thin-plate splines, to the whole gland.
Results: The method, which has minimal requirements and temporal overhead in the existing clinical workflow, is evaluated in terms of surface distance and landmark registration error with respect to the clinical ground truth. Evaluations on agar–gelatin phantoms and clinical data of 13 patients confirm the validity of this approach.
Conclusion: The system is able to successfully map suspicious regions from PET/MRI to the interventional TRUS image.
IEEE Transactions on Medical Imaging | 2017
Christoph Hennersperger; Bernhard Fuerst; Salvatore Virga; Oliver Zettinig; Benjamin Frisch; Thomas Neff; Nassir Navab
International Conference on Robotics and Automation | 2016
Oliver Zettinig; Bernhard Fuerst; Risto Kojcev; Marco Esposito; Mehrdad Salehi; Wolfgang Wein; Julia Rackerseder; Edoardo Sinibaldi; Benjamin Frisch; Nassir Navab
IEEE Transactions on Medical Imaging | 2016
Bernhard Fuerst; Julian Sprung; Francisco Pinto; Benjamin Frisch; Thomas Wendler; Hervé Simon; Laurent Mengus; Nynke S. van den Berg; Henk G. van der Poel; Fijs W. B. van Leeuwen; Nassir Navab
Medical Image Computing and Computer-Assisted Intervention | 2015
Marco Esposito; Benjamin Busam; Christoph Hennersperger; Julia Rackerseder; An Lu; Nassir Navab; Benjamin Frisch
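The PET/MRI–TRUS fusion work above aligns prostate surfaces elastically with Coherent Point Drift and then propagates the resulting deformation field to the whole gland via thin-plate splines. As a rough illustration of the propagation step only (not the authors' implementation; the function names and NumPy formulation here are my own), a 3-D thin-plate-spline fit and warp can be sketched as:

```python
import numpy as np

def tps_fit(src, dst):
    """Fit a 3-D thin-plate spline mapping control points src onto dst."""
    n = len(src)
    # Radial kernel phi(r) = r, the thin-plate kernel in three dimensions.
    K = np.linalg.norm(src[:, None, :] - src[None, :, :], axis=-1)
    P = np.hstack([np.ones((n, 1)), src])  # affine part: 1, x, y, z
    A = np.zeros((n + 4, n + 4))
    A[:n, :n] = K
    A[:n, n:] = P
    A[n:, :n] = P.T
    b = np.zeros((n + 4, 3))
    b[:n] = dst
    params = np.linalg.solve(A, b)
    return params[:n], params[n:]  # kernel weights, affine coefficients

def tps_warp(pts, centers, weights, affine):
    """Propagate the surface deformation to arbitrary (e.g. interior) points."""
    K = np.linalg.norm(pts[:, None, :] - centers[None, :, :], axis=-1)
    return K @ weights + np.hstack([np.ones((len(pts), 1)), pts]) @ affine
```

Given surface correspondences produced by the CPD alignment, `tps_warp` maps any interior point, such as a PET-positive biopsy target, through the same smooth deformation.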
Computer Assisted Radiology and Surgery | 2016
Risto Kojcev; Bernhard Fuerst; Oliver Zettinig; Javad Fotouhi; Sing Chun Lee; Benjamin Frisch; Russell H. Taylor; Edoardo Sinibaldi; Nassir Navab
Robotic ultrasound has the potential to assist and guide physicians during interventions. In this work, we present a set of methods and a workflow to enable autonomous MRI-guided ultrasound acquisitions. Our approach uses a structured-light 3D scanner for patient-to-robot and image-to-patient calibration, which in turn is used to plan 3D ultrasound trajectories. These MRI-based trajectories are followed autonomously by the robot and are further refined online using automatic MRI/US registration. Despite the low spatial resolution of structured-light scanners, the initially planned acquisition path can be followed with an accuracy of 2.46 ± 0.96 mm. This leads to a good initialization of the MRI/US registration: the 3D-scan-based alignment for planning and acquisition shows an accuracy (distance between planned ultrasound and MRI) of 4.47 mm, and 0.97 mm after an online update of the calibration based on a closed-loop registration.
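The calibration chain in the abstract above (structured-light scan relating patient to robot, MRI relating image to patient) amounts to composing homogeneous transforms so that an MRI-planned trajectory can be expressed in robot coordinates. A minimal sketch, with entirely hypothetical transforms and names:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def to_robot_frame(T_robot_scan, T_scan_mri, points_mri):
    """Chain robot <- scanner <- MRI, then map MRI-planned points to robot coordinates."""
    T = T_robot_scan @ T_scan_mri
    homog = np.hstack([points_mri, np.ones((len(points_mri), 1))])
    return (T @ homog.T).T[:, :3]
```

The online MRI/US registration described in the paper would then refine `T` continuously during the acquisition.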
Medical Image Analysis | 2016
Nassir Navab; Christoph Hennersperger; Benjamin Frisch; Bernhard Fürst
While intraoperative imaging is commonly used to guide surgical interventions, automatic robotic support for image-guided navigation has not yet been established in clinical routine. In this paper, we propose a novel visual servoing framework that combines, for the first time, full image-based 3D ultrasound registration with a real-time servo-control scheme. Paired with multi-modal fusion to a pre-interventional plan such as an annotated needle insertion path, it thus allows tracking a target anatomy, continuously updating the plan as the target moves, and keeping a needle guide aligned for accurate manual insertion. The presented system includes a motorized 3D ultrasound transducer mounted on a force-controlled robot and a GPU-based image processing toolkit. The tracking accuracy of our framework is validated on a geometric agar/gelatin phantom using a second robot, achieving average positioning errors of 0.42–0.44 mm. With total compounding and registration runtimes of around 550 ms, real-time performance comes within reach. We also present initial results on a spine phantom, demonstrating the feasibility of our system for lumbar spine injections.
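The closed loop in the abstract above — register the live 3D ultrasound volume, update the target, re-position the needle guide — reduces, in its simplest form, to a proportional servo update. A toy sketch (the gain, tolerance, and stubbed-out "registration as direct target measurement" are illustrative assumptions, not the paper's controller):

```python
import numpy as np

def servo_to_target(guide_pos, measure_target, gain=0.5, tol=1e-3, max_iter=100):
    """Move the needle guide a fraction of the remaining error toward the
    registered target position on each control cycle."""
    for _ in range(max_iter):
        error = measure_target() - guide_pos
        if np.linalg.norm(error) < tol:
            break
        guide_pos = guide_pos + gain * error
    return guide_pos
```

In the real system, `measure_target` stands in for the image-based 3D US registration, and the update would be issued as a pose command to the force-controlled robot.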
International Conference on Computer Vision | 2015
Benjamin Busam; Marco Esposito; Simon Che'Rose; Nassir Navab; Benjamin Frisch
In this paper, we present the use of a drop-in gamma probe for intra-operative Single-Photon Emission Computed Tomography (SPECT) imaging in the scope of minimally invasive robot-assisted interventions. The probe is designed to be inserted and reside inside the abdominal cavity during the intervention. It is grasped during the procedure using a robotic laparoscopic gripper, enabling full six-degrees-of-freedom handling by the surgeon. We demonstrate the first deployment of the tracked probe for intra-operative in-patient robotic SPECT, enabling augmented-reality image guidance. The hybrid mechanical- and image-based in-patient probe tracking is shown to have an accuracy of 0.2 mm. The overall system performance is evaluated and tested with a phantom for gynecological sentinel lymph node interventions and compared to ground-truth data, yielding a mean reconstruction accuracy of 0.67 mm.
Medical Image Computing and Computer-Assisted Intervention | 2014
José Gardiazabal; Marco Esposito; Philipp Matthies; Asli Okur; Jakob Vogel; Silvan Kraft; Benjamin Frisch; Tobias Lasser; Nassir Navab
Sentinel lymph node (sLN) biopsy mostly requires an invasive surgical intervention to remove sLNs under radioguidance. We present an alternative method where live ultrasound is combined with live robotic gamma imaging to provide real-time anatomical and nuclear guidance of punch biopsies. The robotic arm holding a gamma camera is equipped with a system for inside-out tracking to directly retrieve the relative position of the US transducer with respect to itself. Based on this, the system cooperatively positions the gamma camera parallel to the US imaging plane selected by the physician for real-time multi-modal visualization. We validate the feasibility of this approach with a dedicated gelatine/agar biopsy phantom and show that lymph nodes separated by at least 10 mm can be distinguished.
Workshop on Clinical Image-Based Procedures | 2014
Amit Shah; Oliver Zettinig; Tobias Maurer; Cristina Precup; Christian Schulte zu Berge; Jakob Weiss; Benjamin Frisch; Nassir Navab
Purpose: Precise needle placement is an important task during several medical procedures. Ultrasound imaging is often used to guide the needle toward the target region in soft tissue. This task remains challenging due to the dependence on image quality, the limited field of view, a moving target, and a moving needle. In this paper, we present a novel dual-robot framework for robotic needle insertions under robotic ultrasound guidance.
Method: We integrated force-controlled ultrasound image acquisition, registration of preoperative and intraoperative images, vision-based robot control, and target localization, in combination with a novel needle tracking algorithm. The framework allows robotic needle insertion to target a preoperatively defined region of interest while enabling real-time visualization and adaptive trajectory planning to provide safe and quick interactions. We assessed the framework by considering both static and moving targets embedded in water and tissue-mimicking gelatin.
Results: The presented dual-robot tracking algorithms allow for accurate needle placement, namely to target the region of interest with an error around 1 mm.
Conclusion: To the best of our knowledge, we show the first use of two independent robots, one for imaging, the other for needle insertion, that are simultaneously controlled using image processing algorithms. Experimental results show the feasibility and demonstrate the accuracy and robustness of the process.
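The needle-tracking step above must recover the needle axis and tip from candidate needle pixels in the ultrasound image. One common, simple formulation of that sub-problem (not necessarily the paper's algorithm; the sign is resolved with an assumed known insertion direction) is a PCA line fit:

```python
import numpy as np

def fit_needle_axis(points, insertion_dir):
    """Fit a line to candidate needle pixels and estimate the tip.

    points: (N, D) array of candidate pixel coordinates.
    insertion_dir: rough known insertion direction, used only to
    resolve the sign ambiguity of the principal axis.
    """
    mean = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - mean)
    direction = vt[0]  # dominant direction of the point cloud
    if direction @ insertion_dir < 0:
        direction = -direction
    # Tip = candidate point farthest along the insertion direction.
    t = (points - mean) @ direction
    tip = mean + t.max() * direction
    return direction, tip
```

A robust variant (e.g. RANSAC around this fit) would be needed in practice to reject speckle and reverberation artifacts.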