
Publication


Featured research published by Fred S. Azar.


Medical Physics | 2008

Comparison of diffuse optical tomography of human breast with whole‐body and breast‐only positron emission tomography

Soren D. Konecky; Regine Choe; Alper Corlu; Kijoon Lee; R. I. Wiener; Shyam Srinivas; Janet Saffer; Richard Freifelder; Joel S. Karp; Nassim Hajjioui; Fred S. Azar; Arjun G. Yodh

We acquired and compared three-dimensional tomographic breast images of three female patients with suspicious masses using diffuse optical tomography (DOT) and positron emission tomography (PET). Co-registration of the DOT and PET images was facilitated by a mutual-information maximization algorithm. We also compared DOT and whole-body PET images of 14 patients with breast abnormalities. Positive correlations were found between fluorodeoxyglucose (18F-FDG) uptake and both total hemoglobin concentration and tissue scattering as measured by DOT. In light of these observations, we suggest potential benefits of combining PET and DOT for the characterization of breast lesions.
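Mutual-information maximization, the co-registration criterion mentioned in this abstract, can be illustrated with a minimal histogram-based estimator. This is a generic sketch, not the authors' implementation; a registration loop would search over candidate transforms of one image and keep the transform that maximizes this score.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Estimate mutual information between two co-sampled images
    from their joint intensity histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()                # joint probability
    px = pxy.sum(axis=1, keepdims=True)      # marginal of img_a
    py = pxy.sum(axis=0, keepdims=True)      # marginal of img_b
    nz = pxy > 0                             # avoid log(0)
    # MI is the KL divergence between the joint and the product of marginals.
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
```

An image compared with itself yields a high score; two unrelated images yield a score near zero, which is what makes the quantity useful as an alignment objective.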


Journal of Biomedical Optics | 2007

Standardized platform for coregistration of nonconcurrent diffuse optical and magnetic resonance breast images obtained in different geometries

Fred S. Azar; Kijoon Lee; Ali Khamene; Regine Choe; Alper Corlu; Soren D. Konecky; Frank Sauer; Arjun G. Yodh

We present a novel methodology for combining breast image data obtained at different times, in different geometries, and by different techniques. We combine data based on diffuse optical tomography (DOT) and magnetic resonance imaging (MRI). The software platform integrates advanced multimodal registration and segmentation algorithms, requires minimal user experience, and employs computationally efficient techniques. The resulting superposed 3-D tomographs facilitate tissue analyses based on structural and functional data derived from both modalities, and readily permit enhancement of DOT data reconstruction using MRI-derived a-priori structural information. We demonstrate the multimodal registration method using a simulated phantom, and we present initial patient studies that confirm that tumorous regions in a patient breast found by both imaging modalities exhibit significantly higher total hemoglobin concentration (THC) than surrounding normal tissues. The average THC in the tumorous regions is one to three standard deviations larger than the overall breast average THC for all patients.


Medicine Meets Virtual Reality | 2003

An Augmented Reality system for MRI-guided needle biopsies.

Ali Khamene; Frank K. Wacker; Sebastian Vogt; Fred S. Azar; Michael Wendt; Frank Sauer; Jonathan S. Lewin

A navigation system can increase the speed and accuracy of MR-guided interventions that use scanners with closed high-field magnets. We report on first needle placement experiments performed with an Augmented Reality (AR) navigation system. AR visualization provides very intuitive guidance, resulting in a faster procedure. The accuracy of the needle placement depends on the registration accuracy of the system. In the present trials, the needle was placed to within 1 mm of the target center; however, in a small number of cases substantially larger errors occurred, most likely caused by needle bending.


Medical Imaging 2005: Visualization, Image-Guided Procedures, and Display | 2005

Registration of high-resolution 3D atrial images with electroanatomical cardiac mapping: evaluation of registration methodology

Yiyong Sun; Fred S. Azar; Chenyang Xu; Gal Hayam; Assaf Preiss; Norbert Rahn; Frank Sauer

Registration of atrial high-resolution CT and MR images with a cardiac mapping system can provide real-time electrical activation information, catheter tracking, and recording of lesion positions. The cardiac mapping and navigation system comprises a miniature passive magnetic field sensor, an external ultralow magnetic field emitter (location pad), and a processing unit (CARTO, Biosense Webster). We developed a progressive methodology for both interactively and automatically registering high-resolution 3D atrial images (MR or CT) with the corresponding electrophysiological (EP) points of 3D electro-anatomical (EA) maps. This methodology consists of four types of registration algorithms, ranging from landmark-based to surface-based registration, and was evaluated through phantom and patient studies. In the phantom study, we obtained a CT scan of a transparent heart phantom and then used the CARTO system to visually pick a number of points inside the phantom. After segmenting the atrium into a 3D surface, we registered it to the measured EA map and compared the results to the manual EA point measurements. In the 13-patient study, the four types of registration were evaluated: visual alignment; landmark registration (three EA points); surface-based registration (all EA points); and local surface-based registration (a subset of the EA points, with one specific point given a higher weight for a better “local registration”). Surface-based registration proved clearly superior to visual alignment. This new registration methodology may help create a more visually interactive workflow for EP procedures, with more accurate EA map acquisitions. This may improve ablation accuracy in atrial fibrillation (AFib) procedures, decrease the dependency on fluoroscopy, and reduce the radiation dose delivered to the patient.
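As a toy illustration of the landmark-based end of the registration spectrum described above, a least-squares rigid alignment of three or more paired points can be computed with the classic Kabsch/Procrustes solution. This is a generic sketch under that standard formulation, not the CARTO system's actual algorithm.

```python
import numpy as np

def landmark_rigid_registration(src, dst):
    """Least-squares rigid transform (R, t) mapping src landmarks onto dst.

    src, dst: (N, 3) arrays of paired 3D landmarks, N >= 3.
    Returns rotation matrix R and translation vector t with dst ~ R @ src + t.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t
```

Surface-based methods such as ICP iterate this same closed-form step over changing point correspondences, which is why the landmark case is the natural starting point of the progression the abstract describes.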


International Conference on Data Engineering | 2006

Experiment Management with Metadata-based Integration for Collaborative Scientific Research

Fusheng Wang; Peiya Liu; John Pearson; Fred S. Azar; Gerald Madlmayr

Scientific research in many fields is increasingly a collaborative effort across multiple institutions and disciplines. Scientific researchers need not only an effective system to manage their data, results, and the experiments that generate the results, but also a platform to integrate, share, and search these across multiple institutions, enabling researchers to reuse experiments, pool expertise, and validate approaches. In this paper, we present SciPort, a system for experiment management and integration for collaborative scientific research. SciPort’s architecture uses i) a general transformation-based data model to represent and link experiment processes; ii) hierarchical data classification across multiple institutions according to research programs’ goals and organization; iii) a metadata-centric representation that concisely captures the context of experiments; and iv) virtual data integration through centralized metadata integration. The system is built to be open source, and its metadata-based representation and integration provide a unified framework and tool set to manage and share experiments for scientific research communities.


Medical Imaging 2004: Visualization, Image-Guided Procedures, and Display | 2004

User performance analysis of different image-based navigation systems for needle placement procedures

Fred S. Azar; Nathalie Perrin; Ali Khamene; Sebastian Vogt; Frank Sauer

We present a user performance analysis of four navigation systems based on different visualization schemes: 2D, 3D, stereoscopy on a monitor, and a stereo head-mounted display (HMD). We developed a well-defined user workflow, which starts with the selection of a safe and efficient needle path, followed by the placement, insertion, and removal of the needle. The needle procedure was performed on a foam-based phantom, targeting a virtual lesion while avoiding virtual critical structures. The phantom’s and needle’s positions and orientations were optically tracked in real time. 28 users each performed a total of 20 needle placements on five phantom configurations using the four visualization schemes. Based on digital measurements and on qualitative user surveys, we computed the following parameters: accuracy and duration of the procedure, user progress, efficiency, confidence, and judgment. The results show that all systems are roughly equivalent at reaching the center of the target; however, the HMD- and 2D-based systems performed better at avoiding the surrounding structures. The needle procedures were performed in a shorter amount of time using the HMD- and 3D-based systems. With appropriate user training, procedure time for the 2D-based system decreased significantly.


Medical Image Computing and Computer-Assisted Intervention | 2003

Local 3D Reconstruction and Augmented Reality Visualization of Free-Hand Ultrasound for Needle Biopsy Procedures

Ali Khamene; Sebastian Vogt; Fred S. Azar; Tobias Sielhorst; Frank Sauer; Heinrich Niemann

We have developed a 3D ultrasound augmented reality (AR) workspace in which the user wears a video-see-through head-mounted display to observe both 2D ultrasound (US) images and local 3D ultrasound volumes in situ, i.e. in their actual location, overlaid onto a stereoscopic video view. In this unified platform, the user can reconstruct a local three-dimensional volume from a series of scanned two-dimensional US B-planes. In-situ visualization of the rendered three-dimensional volumes facilitates understanding of the object’s three-dimensional structure. The user has full control over acquisition, reconstruction, and visualization of the volumetric data through an interactive, intuitive augmented reality user interface. Furthermore, the system can track a hand-held tool (e.g. a biopsy needle); a graphically enhanced needle improves the perception of the needle’s position and orientation with respect to the lesion target in the rendered volume. In phantom experiments, our AR system has proven to effectively facilitate ultrasound-guided needle biopsies. We have also verified biopsy results by reconstructing “post-operative” volumes with the needle left in place.


International Conference on Computer Vision | 2007

A Unified and Efficient Approach for Free-form Deformable Registration

Ali Khamene; Fred S. Azar; Loren Arthur Schwarz; Darko Zikic; Nassir Navab; Eike Rietzel

We propose a novel numerical approach for solving the free-form deformable registration problem. The central idea is to utilize the well-understood techniques from variational deformable registration. We demonstrate that it is possible to formulate the free-form deformable registration problem as the optimization of an energy functional, as in the dense deformation case. This energy functional comprises an image distance term and a regularization term, both of which are functions of the free-form deformation control points. We then set up a semi-backward (implicit) partial differential equation that optimizes the established energy functional. In addition to being mathematically justified, this approach provides both accuracy and speed. Our evaluation on synthetic and real, two- and three-dimensional data demonstrates its accuracy and computational effectiveness.
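The kind of energy functional described above, balancing an image distance term against a regularization term over the control points, can be sketched as follows. The notation is assumed for illustration and is not taken from the paper.

```latex
% Hypothetical sketch of a free-form deformation registration energy.
% I_f: fixed image, I_m: moving image, c: FFD control points,
% u_c: B-spline displacement field parameterized by c, \lambda: trade-off weight.
E(\mathbf{c}) =
  \underbrace{\int_{\Omega} \big( I_f(\mathbf{x})
      - I_m(\mathbf{x} + \mathbf{u}_{\mathbf{c}}(\mathbf{x})) \big)^2 \, d\mathbf{x}}_{\text{image distance}}
  \; + \;
  \lambda \underbrace{\int_{\Omega} \lVert \nabla \mathbf{u}_{\mathbf{c}}(\mathbf{x}) \rVert_F^2 \, d\mathbf{x}}_{\text{regularization}}
```

Since the displacement field depends on the control points only through the B-spline basis, the gradient of E with respect to c is well defined, which is what allows a PDE-style optimization scheme of the sort the abstract describes.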


Optical Methods in Drug Discovery and Development | 2005

Multimodal information integration and visualization: optical imaging and MRI

Fred S. Azar; Mazen ElBawab; Ali Khamene; Kijoon Lee; Regine Choe; Alper Corlu; Soren D. Konecky; Arjun G. Yodh; Frank Sauer

We have developed a software platform for multimodal integration and visualization of diffuse optical tomography (DOT) and magnetic resonance imaging (MRI) of breast cancer. The image visualization platform allows multimodal 3D visualization and manipulation of datasets, including a variety of 3D rendering techniques and the ability to simultaneously control multiple fields of view. This platform enables quantitative and qualitative analysis of structural and functional diagnostic data, using both conventional and molecular imaging. The functional parameters, together with morphological parameters from MRI, can be suitably combined and correlated with the absolute diagnosis from histopathology. Fusion of the multimodal datasets may eventually lead to a significant improvement in the sensitivity and specificity of breast cancer detection, and may also allow a priori structural information derived from MRI to be incorporated into the reconstruction of diffuse optical tomography images. We present early results of image visualization and registration on multimodal breast cancer data from DOT and MRI.


Proceedings of SPIE | 2007

Joint analysis of non-concurrent magnetic resonance imaging and diffuse optical tomography of breast cancer

Fred S. Azar; Kijoon Lee; Regine Choe; Alper Corlu; Soren D. Konecky; Arjun G. Yodh

We have developed a novel method for combining non-concurrent MR and DOT data, which integrates advanced multimodal registration and segmentation algorithms within a well-defined workflow. The method requires little user interaction, is computationally efficient for practical applications, and enables joint MR/DOT analysis. The method presents additional advantages: more flexibility than integrated MR/DOT imaging systems; the ability to independently develop a standalone DOT system without the stringent limitations imposed by the MRI device environment; enhanced sensitivity and specificity for breast tumor detection; combined analysis of structural and functional data; and enhancement of DOT data reconstruction through the use of MR-derived a priori structural information. We have conducted an initial patient study that asks an important question: how can functional information on a tumor obtained from DOT data be combined with the anatomy of that tumor derived from MRI data? The study confirms that tumor areas in the patient breasts exhibit significantly higher total hemoglobin concentration (THC) than their surroundings. The results show significant intra-patient THC variations and justify the use of our normalized difference measure, defined as the distance from the average THC inside the breast to the average THC inside the tumor volume, in units of the THC standard deviation inside the breast. This method contributes to the long-term goal of enabling standardized direct comparison of MRI and DOT and facilitating validation of DOT imaging methods in clinical studies.
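The normalized difference measure described in this abstract can be sketched directly from its definition. Function and variable names here are illustrative, not taken from the paper.

```python
import numpy as np

def normalized_thc_difference(breast_thc, tumor_mask):
    """Normalized difference measure as defined in the abstract:
    distance from the breast-average THC to the tumor-average THC,
    in units of the THC standard deviation over the breast.

    breast_thc: 1-D array of THC values over the breast volume.
    tumor_mask: boolean array marking voxels inside the tumor volume.
    """
    mu_breast = breast_thc.mean()
    sigma_breast = breast_thc.std()
    mu_tumor = breast_thc[tumor_mask].mean()
    return (mu_tumor - mu_breast) / sigma_breast
```

Expressing the tumor contrast in standard deviations makes values comparable across patients with different absolute THC levels, which is the point of normalizing by the intra-breast variation.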

Collaboration


Dive into Fred S. Azar's collaborations.

Top Co-Authors

Xavier Intes
Rensselaer Polytechnic Institute

Arjun G. Yodh
University of Pennsylvania

Kijoon Lee
University of Pennsylvania

Alper Corlu
University of Pennsylvania

Soren D. Konecky
University of Pennsylvania

Vida Kianzad
Beth Israel Deaconess Medical Center