Publication


Featured research published by Reza Seifabadi.


International Conference on Robotics and Automation | 2016

A continuum manipulator with phase changing alloy

Farshid Alambeigi; Reza Seifabadi; Mehran Armand

A new type of cable-driven continuum manipulator (CM) is presented, in which the stiffness of the device along its body length can be controlled using the thermomechanical properties of a phase changing alloy. The liquid phase of the alloy is used for achieving high dexterity and the solid phase for high stiffness. Joule heating and water cooling are used for transitioning the phase changing alloy between stiff and compliant states. Single-segment and two-segment working prototypes of the CM are demonstrated. The mechanical and thermodynamic features of these prototypes are discussed and their physical performance is investigated. Advantages of the presented design with phase changing alloy include significantly improved dexterity, high payload-to-weight ratio, controllable stiffness, energy efficiency, and a large lumen.
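
The stiffness-switching principle described above can be illustrated with a simple bang-bang thermal controller that drives the alloy across its melting point. The thresholds and interfaces below are assumed placeholders, not the authors' hardware.

```python
# Minimal sketch (not from the paper): bang-bang control of one CM segment's
# stiffness state by driving a phase-changing alloy across its melting point.
# Temperature thresholds, sensor, and actuator names are hypothetical.

MELT_POINT_C = 47.0   # assumed melting point of a low-melting-point alloy
HYSTERESIS_C = 2.0    # margin to avoid chattering around the transition

def segment_command(target_state: str, temp_c: float) -> str:
    """Return the thermal actuator command for one segment.

    target_state: 'compliant' (liquid alloy, dexterous) or 'stiff' (solid alloy).
    temp_c: current alloy temperature from an embedded sensor.
    """
    if target_state == "compliant":
        # Joule-heat until the alloy is safely above its melting point.
        return "heater_on" if temp_c < MELT_POINT_C + HYSTERESIS_C else "hold"
    # Otherwise stiffen: water-cool until safely below the melting point.
    return "coolant_on" if temp_c > MELT_POINT_C - HYSTERESIS_C else "hold"
```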


IEEE International Conference on Biomedical Robotics and Biomechatronics | 2014

A prototype body-mounted MRI-compatible robot for needle guidance in shoulder arthrography

Reza Monfaredi; Reza Seifabadi; Iulian Iordachita; Raymond W. Sze; Nabile M. Safdar; Karun Sharma; Stanley T. Fricke; Axel Krieger; Kevin Cleary

A novel compact and lightweight patient-mounted MRI-compatible robot has been designed for MRI image-guided interventions. This robot is intended to enable MRI-guided needle placement as done in shoulder arthrography. The robot could make needle placement more accurate and simplify the current workflow by converting the traditional two-stage arthrography procedure (fluoroscopy-guided needle insertion followed by a diagnostic MRI scan) into a one-stage procedure performed entirely in the MRI suite. The robot has 4 degrees of freedom (DOF), two for orienting the needle and two for positioning it. The mechanical design was based on several criteria, including rigidity, MRI compatibility, compactness, sterilizability, and adjustability. The proposed workflow is discussed and initial MRI compatibility experiments are presented. The results show that artifacts in the region of interest are minimal and that MRI images of the shoulder were not adversely affected by placing the robot on a human volunteer.


IEEE/ASME Transactions on Mechatronics | 2017

Robotic System for MRI-Guided Focal Laser Ablation in the Prostate

Yue Chen; Alexander Squires; Reza Seifabadi; Sheng Xu; Harsh K. Agarwal; Marcelino Bernardo; Peter A. Pinto; Peter L. Choyke; Bradford J. Wood; Zion Tsz Ho Tse

MRI-conditional robotic platforms have proved to be an effective approach for image-guided interventions. In this study, a computer-assisted, pneumatically actuated robot is designed, built, and tested for MRI-guided prostate cancer focal laser ablation (FLA). The robotic manipulator provides two active planar degrees of freedom (DoFs) using a customized CoreXY frame and one passive rotational DoF. A remote insertion mechanism improves the surgical workflow by keeping the patient inside the scanner during needle insertion. The robotic manipulator is tested in a 3T MR scanner to evaluate its MR compatibility, and the results demonstrate that the signal-to-noise ratio (SNR) variation is less than 8%. The in-scanner template positioning accuracy test demonstrates that the manipulator achieves high targeting accuracy, with a mean error of 0.46 mm and a standard deviation of 0.25 mm. Phantom studies have shown that the needle insertion accuracy of the manipulator is within 2 mm (mean = 1.7 mm, standard deviation = 0.2 mm).
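
The planar stage described above is built on a CoreXY belt arrangement. As a point of reference (not the paper's specific implementation or calibration), the standard CoreXY kinematics couples two motors to the two planar axes roughly as in this sketch; sign conventions vary between builds.

```python
# Minimal sketch of standard CoreXY belt kinematics: both motors contribute to
# both planar axes, so carriage motion is a sum/difference of belt motions.

def corexy_forward(delta_a: float, delta_b: float) -> tuple[float, float]:
    """Map the two belt/motor displacements to planar carriage motion."""
    dx = 0.5 * (delta_a + delta_b)
    dy = 0.5 * (delta_a - delta_b)
    return dx, dy

def corexy_inverse(dx: float, dy: float) -> tuple[float, float]:
    """Map a desired planar carriage motion to the two belt displacements."""
    delta_a = dx + dy
    delta_b = dx - dy
    return delta_a, delta_b
```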


IEEE Sensors Journal | 2017

Fiber-Optic Force Sensors for MRI-Guided Interventions and Rehabilitation: A Review

Hao Su; Iulian Iordachita; Junichi Tokuda; Nobuhiko Hata; Xuan Liu; Reza Seifabadi; Sheng Xu; Bradford J. Wood; Gregory S. Fischer

Magnetic resonance imaging (MRI) provides both anatomical imaging with excellent soft tissue contrast and functional imaging (fMRI) of physiological parameters. The last two decades have seen increasing interest in MRI-guided minimally invasive interventions and in fMRI for rehabilitation and neuroscience research. Alongside the drive to use MRI for imaging feedback during interventions and for monitoring brain activity in neuroscience studies, there has been a sustained effort to develop force sensors compatible with the MRI environment, with the goals of enhanced interventional safety and accuracy, improved efficacy, and better rehabilitation outcomes. This paper summarizes the fundamental principles, state-of-the-art developments, and challenges of fiber-optic force sensors for MRI-guided interventions and rehabilitation. It provides an overview of MRI-compatible fiber-optic force sensors based on different sensing principles, including light intensity modulation, wavelength modulation, and phase modulation. Extensive design prototypes are reviewed to illustrate the detailed implementation of these principles. Advantages and disadvantages of the sensor designs are compared and analyzed. A perspective on the future development of fiber-optic sensors, which may have additional broad clinical applications, is also presented. Future surgical interventions and rehabilitation will rely on intelligent force sensors to provide situational awareness that augments or complements human perception in these procedures.
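
To illustrate the wavelength-modulation principle reviewed above, the following sketch converts a fiber Bragg grating (FBG) wavelength shift into strain and axial force using the standard relation Δλ/λ ≈ (1 − p_e)·ε. The material constants are typical assumed values, not parameters of any sensor in the review.

```python
# Minimal illustrative sketch of wavelength-modulation (FBG) force sensing.
# Constants below are generic assumptions for a bare silica fiber in tension.

PHOTOELASTIC_COEFF = 0.22      # typical effective photo-elastic coefficient p_e
FIBER_YOUNGS_MODULUS = 70e9    # Pa, typical fused silica
FIBER_CROSS_SECTION = 1.2e-8   # m^2, ~125 um cladding diameter

def fbg_strain(lambda_bragg_nm: float, lambda_measured_nm: float) -> float:
    """Axial strain from the relative Bragg wavelength shift,
    dLambda / Lambda = (1 - p_e) * strain (temperature effects ignored)."""
    d_lambda = lambda_measured_nm - lambda_bragg_nm
    return d_lambda / (lambda_bragg_nm * (1.0 - PHOTOELASTIC_COEFF))

def fbg_force(lambda_bragg_nm: float, lambda_measured_nm: float) -> float:
    """Axial force assuming a bare fiber in pure tension: F = E * A * strain."""
    strain = fbg_strain(lambda_bragg_nm, lambda_measured_nm)
    return FIBER_YOUNGS_MODULUS * FIBER_CROSS_SECTION * strain
```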


The Journal of Urology | 2018

Comparison of Elastic and Rigid Registration during Magnetic Resonance Imaging/Ultrasound Fusion-Guided Prostate Biopsy: A Multi-Operator Phantom Study

Graham R. Hale; Marcin Czarniecki; Alexis Cheng; Jonathan Bloom; Reza Seifabadi; Samuel Gold; Kareem Rayn; Vikram K. Sabarwal; Sherif Mehralivand; Peter L. Choyke; Baris Turkbey; Brad J. Wood; Peter A. Pinto

Purpose: The relative value of rigid versus elastic registration during magnetic resonance imaging/ultrasound fusion guided prostate biopsy has been poorly studied. We compared registration errors (the distance between a region of interest and fiducial markers) between rigid and elastic registration during fusion guided prostate biopsy using a prostate phantom model. Materials and Methods: Four gold fiducial markers visible on magnetic resonance imaging and ultrasound were placed throughout 1 phantom prostate model. The phantom underwent magnetic resonance imaging and the fiducial markers were labeled as regions of interest. An experienced user and a novice user of fusion guided prostate biopsy targeted the regions of interest and then the corresponding fiducial markers on ultrasound after rigid and then elastic registration. Registration errors were compared. Results: A total of 224 registration error measurements were recorded. Overall, elastic registration did not provide significantly improved registration error over rigid registration (mean ± SD 4.87 ± 3.50 vs 4.11 ± 2.09 mm, p = 0.05). However, lesions near the edge of the phantom showed increased registration errors with elastic registration (5.70 ± 3.43 vs 3.23 ± 1.68 mm, p = 0.03). Compared to the novice user, the experienced user showed decreased registration error with rigid registration (3.25 ± 1.49 vs 4.98 ± 2.10 mm, p <0.01) and elastic registration (3.94 ± 2.61 vs 6.07 ± 4.16 mm, p <0.01). Conclusions: We found no overall difference in registration errors between rigid and elastic registration, but rigid registration decreased the registration error for targets near the prostate edge. Additionally, operator experience reduced registration errors regardless of the registration method. Therefore, elastic registration algorithms cannot replace attention to detail during the registration process or the use of anatomical landmarks to confirm accurate registration when beginning the procedure and before targeting each region of interest.
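
The registration error reported above is the distance between a targeted region of interest and its fiducial marker; a minimal sketch of the metric and a two-sample comparison, under an assumed data layout rather than the study's software, might look like this.

```python
# Minimal sketch: per-target registration error and rigid-vs-elastic comparison.
# Data layout (Nx3 point sets in mm) is an assumption for illustration.

import numpy as np
from scipy import stats

def registration_errors(targeted_xyz: np.ndarray, fiducial_xyz: np.ndarray) -> np.ndarray:
    """Euclidean distance (mm) between each targeted ROI and its fiducial."""
    return np.linalg.norm(targeted_xyz - fiducial_xyz, axis=1)

def compare_methods(rigid_err: np.ndarray, elastic_err: np.ndarray) -> dict:
    """Mean +/- SD per method and a two-sample t-test between the error sets."""
    t_stat, p_value = stats.ttest_ind(rigid_err, elastic_err)
    return {
        "rigid_mm": (rigid_err.mean(), rigid_err.std(ddof=1)),
        "elastic_mm": (elastic_err.mean(), elastic_err.std(ddof=1)),
        "p_value": float(p_value),
    }
```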


The Journal of Urology | 2017

MP52-13 TRANSPERINEAL MR-GUIDED PROSTATE NEEDLE INTERVENTIONS USING A PATIENT-SPECIFIC TEMPLATE

Dordaneh Sugano; Sheng Xu; Reza Seifabadi; Ivane Bakhutashvili; Neil Glossop; Peter L. Choyke; Peter A. Pinto; Reto Bale



Proceedings of SPIE | 2017

Enabling image fusion for a CT guided needle placement robot

Reza Seifabadi; Sheng Xu; Fereshteh Aalamifar; Gnanasekar Velusamy; Kaliyappan Puhazhendi; Bradford J. Wood

Purpose: This study presents the development and integration of hardware and software that enable ultrasound (US) and computed tomography (CT) fusion for an FDA-approved CT-guided needle placement robot. Registering real-time US images to a previously acquired intraoperative CT image provides additional anatomic information during needle insertion, making it possible to target hard-to-see lesions, avoid critical structures invisible on CT, track target motion, and better monitor the ablation treatment zone in relation to the tumor location. Method: A passive encoded mechanical arm was developed for the robot to hold and track an abdominal US transducer. This 4 degrees-of-freedom (DOF) arm is designed to attach to the robot end-effector. The arm is locked by default and is released by the press of a button. The arm is designed such that the needle is always in plane with the US image. The articulated arm was calibrated to improve its accuracy. Custom software (OncoNav, NIH) was developed to fuse real-time US images to the previously acquired CT. Results: The accuracy of the end-effector before and after passive arm calibration was 7.07 mm ± 4.14 mm and 1.74 mm ± 1.60 mm, respectively. The accuracy of the US image to arm calibration was 5 mm. The feasibility of US-CT fusion using the proposed hardware and software was demonstrated in a commercial abdominal phantom. Conclusions: Calibration significantly improved the accuracy of the arm in US image tracking. Fusion of US to CT using the proposed hardware and software was feasible.
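
A common building block for this kind of US-to-CT calibration and fusion is least-squares rigid point registration over corresponding fiducials. The sketch below shows the standard SVD-based (Arun/Kabsch) solution and an RMS residual, as an illustration rather than the OncoNav implementation.

```python
# Minimal sketch: least-squares rigid registration of two (N, 3) point sets,
# e.g. fiducials localized in US and CT coordinates, plus an RMS residual.

import numpy as np

def rigid_registration(src: np.ndarray, dst: np.ndarray):
    """Find rotation R and translation t minimizing ||R @ p + t - q||."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

def fiducial_registration_error(src, dst, R, t) -> float:
    """RMS residual (same units as the input, e.g. mm) after the transform."""
    residual = (R @ src.T).T + t - dst
    return float(np.sqrt((residual ** 2).sum(axis=1).mean()))
```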


Proceedings of SPIE | 2017

Motorized fusion guided prostate biopsy: phantom study

Reza Seifabadi; Sheng Xu; Fereshteh Aalamifar; Peter A. Pinto; Bradford J. Wood

Purpose: Fusion of magnetic resonance imaging (MRI) with intraoperative real-time ultrasound (US) during prostate biopsy has significantly improved the sensitivity of transrectal ultrasound (TRUS) guided cancer detection. Currently, both the sweep of the TRUS probe used to build the 3D volume for fusion and the TRUS probe manipulation for needle guidance are done manually. A motorized, joystick-controlled probe holder was custom fabricated that can potentially reduce inter-operator variability, standardize needle placement, and improve the repeatability and uniformity of targeting, which may affect the learning curve after clinical deployment of this emerging approach. Method: A 2-DOF motorized probe holder was designed to provide translation and rotation of a triplane end-firing TRUS probe for prostate biopsy. The probe holder is joystick controlled and can assist manipulation of the probe during needle insertion as well as in acquiring a smoother 2D-to-3D US sweep from which the 3D US volume for fusion is built. A commercial MRI-US fusion platform was used. Three targets were specified on the MR image of a commercial prostate phantom. After performing the registration, two operators performed targeting, once manually and once with the assistance of the motorized probe holder. They repeated these tasks 5 times, resulting in a total of 30 targeting events. Completion time and mechanical error, i.e., the distance of the target from the needle trajectory in the software user interface, were measured. Repeatability in reaching a given target in a systematic and consistent way was assessed using a scatter plot showing all targets in the US coordinate system. The Pearson product-moment correlation coefficient (PPMCC) was used to demonstrate probe steadiness during targeting. Results: The completion time was 25±17 sec, 25±24 sec, and 27±15 sec for free-hand and 24±10 sec, 22.5±10 sec, and 37±10 sec for motorized insertion, for targets 1, 2, and 3, respectively. The mechanical error was 0.75±0.4 mm, 0.45±0.4 mm, and 0.55±0.4 mm for the free-hand approach and 1.0±0.57 mm, 0.45±0.4 mm, and 0.35±0.25 mm for the motorized approach, for targets 1, 2, and 3, respectively. The PPMCC remained at approximately 1.0 for the motorized approach while varying between 0.9 and 1.0 for the free-hand approach. Conclusions: Motorized fusion guided prostate biopsy in a phantom study was feasible and comparable to the free-hand manual approach in terms of targeting accuracy and speed, while being superior in terms of repeatability and steadiness.
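
The steadiness metric mentioned above, the Pearson product-moment correlation coefficient, can be computed as in the following sketch; which probe-pose signals are correlated is an assumption here, not the paper's exact definition.

```python
# Minimal sketch: PPMCC of two equal-length 1-D signals, e.g. a probe pose
# trace during targeting versus a reference trace (assumed choice of signals).

import numpy as np

def ppmcc(x: np.ndarray, y: np.ndarray) -> float:
    """Pearson product-moment correlation coefficient of x and y."""
    x_c, y_c = x - x.mean(), y - y.mean()
    return float((x_c @ y_c) / np.sqrt((x_c @ x_c) * (y_c @ y_c)))
```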


Medical Image Computing and Computer-Assisted Intervention | 2016

Ultrasound Tomosynthesis: A New Paradigm for Quantitative Imaging of the Prostate

Fereshteh Aalamifar; Reza Seifabadi; Marcelino Bernardo; Ayele H. Negussie; Baris Turkbey; Maria J. Merino; Peter A. Pinto; Arman Rahmim; Bradford J. Wood; Emad M. Boctor

Biopsy under B-mode transrectal ultrasound (TRUS) is the gold standard for prostate cancer diagnosis. However, B-mode US shows only the boundary of the prostate, so biopsy is performed in an effectively blind fashion, resulting in many false negatives. Although MRI or TRUS-MRI fusion is more sensitive and specific, it may not be readily available across a broad population and may be cost prohibitive. In this paper, a limited-angle transmission US methodology, here called US tomosynthesis (USTS), is proposed for prostate imaging. This enables quantitative imaging of the prostate, such as generation of a speed of sound (SOS) map, which theoretically may improve detection, localization, or characterization of cancerous prostate tissue. Prostate USTS can be enabled by adding an abdominal probe aligned with the transrectal probe using a robotic arm. In this paper, we elaborate on the proposed methodology and then develop a setup and technique to enable ex vivo USTS imaging of the human prostate immediately after prostatectomy. Custom hardware and software were developed and implemented. A mock ex vivo prostate and lesions were made by filling a mold cavity with water and adding a plastisol lesion. Times of flight were picked using the proposed center-of-mass method and corrected manually. The SOS map reconstructed with a difference expectation-maximization method performed most accurately, with 2.69%, 0.23%, and 0.06% bias in estimating the SOS of plastisol, water, and mold, respectively. Although USTS requires further ex vivo validation, it has the potential to open a new window in quantitative low-cost US imaging of the prostate, which may meet a public health need.
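
Two of the building blocks described above, the center-of-mass time-of-flight pick and the per-ray speed-of-sound estimate, can be sketched as follows (illustrative assumptions, not the authors' implementation).

```python
# Minimal sketch of two USTS building blocks: a center-of-mass pick of the
# arrival time from a received A-line envelope, and the straight-ray
# speed-of-sound estimate it feeds.

import numpy as np

def center_of_mass_tof(envelope: np.ndarray, fs_hz: float) -> float:
    """Arrival time (s) as the intensity-weighted centroid of the envelope."""
    t = np.arange(envelope.size) / fs_hz
    w = np.abs(envelope)
    return float((t * w).sum() / w.sum())

def straight_ray_sos(path_length_m: float, tof_s: float) -> float:
    """Average speed of sound (m/s) along one transmitter-receiver ray."""
    return path_length_m / tof_s
```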


Proceedings of SPIE | 2016

A motorized ultrasound system for MRI-ultrasound fusion guided prostatectomy

Reza Seifabadi; Sheng Xu; Peter A. Pinto; Bradford J. Wood

Purpose: This study presents MoTRUS, a motorized transrectal ultrasound system that enables remote navigation of a transrectal ultrasound (TRUS) probe during da Vinci assisted prostatectomy. MoTRUS not only provides a stable platform for the ultrasound probe but also allows the physician to navigate it remotely while seated at the da Vinci console. This study also presents a phantom feasibility study of intraoperative MRI-US image fusion, the goal being to bring preoperative MR images into the operating room for better visualization of the gland, its boundaries, and the surrounding nerves. Method: A two degree-of-freedom probe holder was developed to insert and rotate a bi-plane transrectal ultrasound transducer. A custom joystick enables remote navigation of MoTRUS. Safety features were included to avoid inadvertent risks to the patient. Custom software was developed to fuse preoperative MR images to intraoperative ultrasound images acquired by MoTRUS. Results: Remote TRUS probe navigation with MoTRUS was evaluated during prostatectomy in a consented patient. It took 10 min to set up the system in the OR. MoTRUS provided imaging capability comparable to manual probe handling while adding remote navigation and stable imaging. No complications were observed. Image fusion was evaluated on a commercial prostate phantom, using electromagnetic tracking for the fusion. Conclusions: Motorized navigation of the TRUS probe during prostatectomy is safe and feasible. Remote navigation gives the physician more precise and easier control of the ultrasound image while removing the burden of manual manipulation of the probe. Image fusion improved visualization of the prostate and its boundaries in a phantom study.

Collaboration


Dive into Reza Seifabadi's collaboration.

Top Co-Authors

Bradford J. Wood, National Institutes of Health
Sheng Xu, National Institutes of Health
Peter A. Pinto, National Institutes of Health
Ayele H. Negussie, National Institutes of Health
Peter L. Choyke, National Institutes of Health
L. Jiang, National Institutes of Health
Venkatesh Krishnasamy, National Institutes of Health
Baris Turkbey, National Institutes of Health