Publications


Featured research published by J Alaly.


Physics in Medicine and Biology | 2006

Dose response explorer: an integrated open-source tool for exploring and modelling radiotherapy dose–volume outcome relationships

I. El Naqa; Gita Suneja; P.E. Lindsay; A Hope; J Alaly; Milos Vicic; Jeffrey D. Bradley; A Apte; Joseph O. Deasy

Radiotherapy treatment outcome models are a complicated function of treatment, clinical and biological factors. Our objective is to provide clinicians and scientists with an accurate, flexible and user-friendly software tool to explore radiotherapy outcomes data and build statistical tumour control or normal tissue complication models. The software tool, called the dose response explorer system (DREES), is based on Matlab, and uses a named-field structure array data type. DREES/Matlab in combination with another open-source tool (CERR) provides an environment for analysing treatment outcomes. DREES provides many radiotherapy outcome modelling features, including (1) fitting of analytical normal tissue complication probability (NTCP) and tumour control probability (TCP) models, (2) combined modelling of multiple dose-volume variables (e.g., mean dose, max dose, etc.) and clinical factors (age, gender, stage, etc.) using multi-term regression modelling, (3) manual or automated selection of logistic or actuarial model variables using bootstrap statistical resampling, (4) estimation of uncertainty in model parameters, (5) performance assessment of univariate and multivariate analyses using Spearman's rank correlation and chi-square statistics, boxplots, nomograms, Kaplan-Meier survival plots, and receiver operating characteristic curves, and (6) graphical capabilities to visualize NTCP or TCP prediction versus selected variable models using various plots. DREES provides clinical researchers with a tool customized for radiotherapy outcome modelling. DREES is freely distributed. We expect to continue developing DREES based on user feedback.
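
The kind of multi-term logistic modelling with bootstrap uncertainty estimation that DREES automates can be sketched in a few lines. DREES itself is Matlab-based; the dataset, variable names, and model size below are hypothetical and are not part of the publication.

```python
# Minimal sketch of a multi-term logistic outcome model with bootstrap
# confidence intervals; data and variables are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical outcomes dataset: one row per patient.
n = 120
mean_dose = rng.uniform(10, 40, n)   # Gy
age = rng.uniform(40, 80, n)         # years
# Simulated binary complication outcome with a dose effect.
p_true = 1.0 / (1.0 + np.exp(-0.15 * (mean_dose - 25)))
outcome = rng.binomial(1, p_true)

X = np.column_stack([mean_dose, age])

# Multi-term logistic NTCP-style model.
model = LogisticRegression().fit(X, outcome)

# Bootstrap resampling to estimate uncertainty in the fitted coefficients.
boot_coefs = []
for _ in range(200):
    idx = rng.integers(0, n, n)
    boot_coefs.append(LogisticRegression().fit(X[idx], outcome[idx]).coef_[0])
boot_coefs = np.array(boot_coefs)

print("coefficients:", model.coef_[0])
print("bootstrap 95% CI per term:",
      np.percentile(boot_coefs, [2.5, 97.5], axis=0).T)
```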


Acta Oncologica | 2010

Datamining approaches for modeling tumor control probability

Issam El Naqa; Joseph O. Deasy; Yi Mu; Ellen Huang; Andrew Hope; Patricia Lindsay; A Apte; J Alaly; Jeffrey D. Bradley

Background. Tumor control probability (TCP) to radiotherapy is determined by complex interactions between tumor biology, tumor microenvironment, radiation dosimetry, and patient-related variables. The complexity of these heterogeneous variable interactions constitutes a challenge for building predictive models for routine clinical practice. We describe a datamining framework that can unravel the higher order relationships among dosimetric dose-volume prognostic variables, interrogate various radiobiological processes, and generalize to unseen data when applied prospectively. Material and methods. Several datamining approaches are discussed that include dose-volume metrics, equivalent uniform dose, mechanistic Poisson model, and model building methods using statistical regression and machine learning techniques. Institutional datasets of non-small cell lung cancer (NSCLC) patients are used to demonstrate these methods. The performance of the different methods was evaluated using bivariate Spearman rank correlations (rs). Over-fitting was controlled via resampling methods. Results. Using a dataset of 56 patients with primary NSCLC tumors and 23 candidate variables, we estimated GTV volume and V75 to be the best model parameters for predicting TCP using statistical resampling and a logistic model. Using these variables, the support vector machine (SVM) kernel method provided superior performance for TCP prediction with an rs=0.68 on leave-one-out testing compared to logistic regression (rs=0.4), Poisson-based TCP (rs=0.33), and the cell kill equivalent uniform dose model (rs=0.17). Conclusions. The prediction of treatment response can be improved by utilizing datamining approaches, which are able to unravel important non-linear complex interactions among model variables and have the capacity to predict on unseen data for prospective clinical applications.
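
A minimal sketch of the model-comparison loop described in the abstract (leave-one-out testing scored with Spearman's rank correlation), using a synthetic dataset; the features gtv_volume and v75 merely stand in for the paper's candidate variables, and this is not the authors' code.

```python
# Compare a logistic model and an RBF-kernel SVM for outcome prediction,
# scored by Spearman rank correlation on leave-one-out predictions.
import numpy as np
from scipy.stats import spearmanr
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n = 56
gtv_volume = rng.lognormal(mean=3.5, sigma=0.6, size=n)  # cc, hypothetical
v75 = rng.uniform(0, 100, size=n)                        # % volume >= 75 Gy
X = np.column_stack([gtv_volume, v75])
p = 1 / (1 + np.exp(-(0.03 * v75 - 0.01 * gtv_volume - 0.5)))
y = rng.binomial(1, p)                                   # local control (1/0)

def loo_predictions(model):
    preds = np.empty(n)
    for train, test in LeaveOneOut().split(X):
        preds[test] = model.fit(X[train], y[train]).predict_proba(X[test])[:, 1]
    return preds

for name, model in [("logistic", LogisticRegression()),
                    ("SVM (RBF kernel)", SVC(kernel="rbf", probability=True))]:
    rs, _ = spearmanr(loo_predictions(model), y)
    print(f"{name}: leave-one-out Spearman rs = {rs:.2f}")
```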


Medical Physics | 2006

TU‐D‐224C‐05: Validation of a Linear Accelerator Source Model and Commissioning Process for Routine Clinical Monte Carlo Calculations

J Cui; K Zakaryan; J Alaly; Milos Vicic; M Wiesmeyer; Joseph O. Deasy

Purpose: Many source models for Monte Carlo treatment planning include parameters which are difficult or ambiguous to determine. We developed and tested a straightforward source model and commissioning process for clinical Monte Carlo calculations. Method and Materials: Our commissioning process of fitting treatment planning system data includes fitting the photon spectrum, electron contamination, penumbra fluence blurring, jaw leakage, and the flattening filter effect. The energy spectrum is fit using a modified Fatigue Life Distribution multiplied by a Fermi cutoff. Electron contamination is modeled separately using an exponential function, as suggested by Fippel. Penumbral blurring is modeled using a Gaussian filter. Leakage radiation is modeled as a low-intensity wide-field monoenergetic source. The flattening filter effect is modeled by multiplying the optimized fluence by a Gaussian reduction. The penumbra and the flattening filter are applied to the fluence map. We tested our methodology on doses produced by a Varian accelerator for 6 MV and 18 MV photons and 5×5, 10×10, and 20×20 cm2 field sizes. Results: We found that nine published photon spectra of Varian, Elekta, and Siemens linear accelerators, ranging in energy from 4 MV to 25 MV, could be modeled by the Fatigue Life Distribution with a Fermi cutoff. The agreement between the TPS doses and the commissioned MC doses was within 2%. Off-axis energy spectrum softening was not needed. Conclusion: We have developed a straightforward, yet flexible source modeling system. The commissioning process affords a high degree of automation with an unambiguous determination of the relevant parameters. Commissioning of clinical Monte Carlo treatment planning systems is facilitated by using a source model which is only as complicated as necessary to accurately simulate dose distributions. Conflict of Interest: This research was partially supported by NIH grant R01 CA90445 and a grant from Sun Nuclear, Corp.
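
A small sketch of the spectrum parameterization named in the abstract, assuming a fatigue-life (Birnbaum-Saunders) shape multiplied by a Fermi-style cutoff; the parameter values are illustrative, not fitted accelerator data.

```python
# Toy photon-spectrum model: fatigue-life distribution times a Fermi cutoff.
import numpy as np
from scipy.stats import fatiguelife

def photon_spectrum(energy_mev, shape=0.8, scale=2.0, e_max=6.0, width=0.2):
    """Relative photon fluence vs. energy for a nominal 6 MV beam (toy values)."""
    base = fatiguelife.pdf(energy_mev, c=shape, scale=scale)
    fermi_cutoff = 1.0 / (1.0 + np.exp((energy_mev - e_max) / width))
    return base * fermi_cutoff

energies = np.linspace(0.1, 7.0, 200)                      # MeV
spectrum = photon_spectrum(energies)
spectrum /= spectrum.sum() * (energies[1] - energies[0])   # approx. unit area
print("spectrum peaks near %.2f MeV" % energies[np.argmax(spectrum)])
```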


Medical Physics | 2006

SU‐FF‐T‐345: On the Suitability of Radiographic Film for Low Density Material Dosimetry and Photon Algorithm Verification

M Wiesmeyer; J Cui; K Zakaryan; J Alaly; E Klein; Sasa Mutic; D Low; Joseph O. Deasy

Purpose: To assess potential errors in radiographic film dosimetry in low-density materials and to compare film measurements to dose estimates of a commercial convolution/superposition photon (CSP) dose calculation algorithm. Method and Materials: A standard film phantom was modified by replacing water-equivalent slabs (30 HU) in its central portion with very low-density material (−960 HU) to produce a lung slab phantom. Experiments were performed irradiating this phantom with 6 and 18 MV photons and field sizes of 2×2, 5×5, and 10×10 cm with 13 films placed between slabs. With unprocessed film in place, the phantom was then imaged in a computed tomography scanner and Monte Carlo (MC) and CSP calculations were done for each field size and energy combination. The phantom was then rescanned without film and dose was recalculated using MC to estimate the effect of the film in the prior MC calculations. Results: Measurements and MC calculations demonstrated field-size- and energy-dependent dose perturbations at film planes in the low-density material (up to 20% of maximum dose). In the phantom with film, central axis measurements and MC calculations matched within about 3%. The CSP algorithm was not perturbed by the film and overestimated dose in the low-density region. Relying on film measurements alone would indicate a maximum overestimate of about 17% for 6 MV beams and 30% for 18 MV beams for the 2×2 cm fields. The filmless MC calculations show the true error to be about 6–9% higher. Conclusion: The error in CSP calculations will be underestimated if film is used as a dosimeter in very low-density materials. The use of somewhat denser lung-equivalent materials (e.g., −700 HU) would likely result in reduced, but still significant, error estimates. Supported by NIH grant R01 CA85181 and a grant from Sun Nuclear, Corp.


Medical Physics | 2006

TU‐D‐ValA‐07: Prioritized Prescription‐Goal Treatment Planning for IMRT: The Effect of Constraint Moderation

Jan J. Wilkens; J Alaly; Joseph O. Deasy

Purpose: Determining the ‘best’ optimization parameters in IMRT planning is typically a time-consuming trial-and-error process with no unambiguous termination point. Recently we and others proposed using a goal-programming approach which better captures the prioritization of clinical goals without introducing ambiguous user-input parameters. We consider here the effect of adding ‘slip’ to this method, which allows for slight degradations in metric performance compared to the maximum achievable. Method and Materials: In the first phase of the optimization process, only the highest-order goals are considered (target coverage and dose-limiting normal structures). In subsequent phases (levels), the achievements of the previous step are turned into hard constraints and lower-order goals are optimized subject to these constraints. Slip factors were introduced to allow for slight violations of the constraints. Linear as well as quadratic goal terms were evaluated for performance as well as dosimetric ‘steerability.’ The resulting constraints can also be expressed as linear or quadratic equations. Results: Focusing on head and neck cases, we present several examples of treatment plans using prioritized optimization. These are compared to conventional IMRT plans in terms of dosimetric properties and optimization speed. The main advantages of the new optimization method are (1) its ability to generate plans that meet the clinical goals/prescriptions without tuning any weighting factors or dose-volume constraints, and (2) the ability to conveniently include more terms which represent elements such as beam weight smoothness. Lower-level goals can be optimized to the achievable limit without compromising higher-order goals. Modest slip factors improved overall performance. Conclusion: The prioritized prescription-goal planning method including slip factors allows for a more intuitive and human-time-efficient way of dealing with conflicting goals compared to the conventional trial-and-error method of varying weighting factors. This research was supported by NIH grant R01 CA90445 and a grant from TomoTherapy, Inc.
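
A toy sketch of prioritized (lexicographic) optimization with a slip factor on the level-1 achievement; the two-beamlet influence matrix, prescription, and OAR limit are made-up numbers, and this is not the authors' solver.

```python
# Two-level prioritized optimization with a slip factor on level-1 achievement.
import numpy as np
from scipy.optimize import minimize

# Toy influence matrix: dose per unit beamlet weight to two target voxels and
# one OAR voxel (hypothetical numbers).
A = np.array([[1.0, 0.2],
              [0.3, 1.0],
              [0.6, 0.5]])
target_rows, oar_rows = [0, 1], [2]
prescription, oar_limit = 60.0, 20.0
slip = 0.03   # allow a 3% degradation of the level-1 achievement

def dose(w):
    return A @ np.asarray(w)

# Level 1: target coverage plus the dose-limiting normal-structure goal.
def f1(w):
    d = dose(w)
    under = np.maximum(prescription - d[target_rows], 0.0)
    over = np.maximum(d[oar_rows] - oar_limit, 0.0)
    return np.sum(under ** 2) + np.sum(over ** 2)

# Level 2: push the OAR dose as low as possible.
def f2(w):
    return np.sum(dose(w)[oar_rows] ** 2)

bounds = [(0.0, None)] * A.shape[1]
res1 = minimize(f1, x0=[1.0, 1.0], bounds=bounds)

# Level 2 is optimized subject to the (slightly slipped) level-1 achievement.
slip_con = {"type": "ineq", "fun": lambda w: (1.0 + slip) * res1.fun - f1(w)}
res2 = minimize(f2, x0=res1.x, bounds=bounds, constraints=[slip_con])

print("level-1 value after level 2: %.2f (limit %.2f)"
      % (f1(res2.x), (1.0 + slip) * res1.fun))
print("OAR dose: %.1f" % dose(res2.x)[oar_rows][0])
```

Without the slip constraint the level-2 step could undo the level-1 gains; with it, the OAR objective only improves within the small tolerance allowed on the higher-priority goal.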


Medical Physics | 2006

SU‐FF‐T‐358: PlanCheck: A System for Routine Clinical Comparison of IMRT Treatment Plans with Monte Carlo Recalculations

K Zakaryan; J Cui; J Alaly; I. El Naqa; J Simon; W Simon; D Low; Joseph O. Deasy

Purpose: Current IMRT QA methods are cumbersome and are not comprehensive. The purpose of the PlanCheck dose recalculation system is to provide independent verification that MLC leaf-sequences generated by a commercial treatment planning system will result in an acceptable dose. Method and Materials: The PlanCheck beam commissioning process was developed for medical linear accelerators and includes modeling of the photon energy spectra, off-axis softening, electron contamination, flattening filter and penumbra blurring. The Monte Carlo beam parameters are derived by fitting to the treatment planning dose in water and to the measured dose. The system regenerates the dose for each treatment and for the whole planned dose using the Monte Carlo engine, based on beam-sequence DICOM/RTOG information imported into PlanCheck. The comparison metrics, including dose-volume histogram comparisons, report the validation quality and dose agreement. Results: Monte Carlo commissioning was tested for a Varian linear accelerator (Clinac 2100) for 2×2 cm2, 5×5 cm2, 10×10 cm2 and 20×20 cm2 open fields in water for 6 MV, 10 MV and 18 MV photon beams. The profiles and comparison results show good agreement with Eclipse (Varian) open field dose in water. IMRT treatment plans from systems such as XiO (CMS), Eclipse (Varian) and Pinnacle (Philips) were tested with PlanCheck for dose agreement with the Monte Carlo dose and found to show adequate agreement. Conclusion: PlanCheck Monte Carlo calculations show good overall agreement with treatment planning results except for regions with complex heterogeneities. Sun Nuclear Corporation is currently developing this product for intensive commercial use. The system dose engine is currently being integrated with a 64-bit/16-node calculation cluster, which we expect will make the typical IMRT plan calculation time 30 minutes or less for the total planned 3D dose. This research was partially supported by NIH grant R01 CA90445 and a grant from Sun Nuclear, Corp.
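
A minimal sketch of a DVH-style agreement metric of the sort the abstract mentions; the voxel dose arrays are random stand-ins for TPS and Monte Carlo recalculated doses, and the metric shown is just one simple choice.

```python
# Compare two dose calculations for the same structure via cumulative DVHs.
import numpy as np

def cumulative_dvh(dose_voxels, dose_levels):
    """Fraction of structure volume receiving at least each dose level."""
    return np.array([(dose_voxels >= d).mean() for d in dose_levels])

rng = np.random.default_rng(2)
tps_dose = rng.normal(60, 3, size=5000)            # hypothetical TPS voxel doses (Gy)
mc_dose = tps_dose + rng.normal(0, 1, size=5000)   # hypothetical MC recalculation

levels = np.linspace(0, 80, 161)
dvh_tps = cumulative_dvh(tps_dose, levels)
dvh_mc = cumulative_dvh(mc_dose, levels)

# One simple agreement score: maximum absolute DVH difference (volume fraction).
print("max |dDVH| =", np.abs(dvh_tps - dvh_mc).max())
```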


Medical Physics | 2005

MO‐D‐T‐6E‐02: Dose‐Response Explorer: An Open‐Source‐Code Matlab‐Based Tool for Modeling Treatment Outcome as a Function of Predictive Factors

Gita Suneja; I. El Naqa; J Alaly; P.E. Lindsay; A Hope; Joseph O. Deasy

Purpose: Radiotherapy treatment outcome models are a complicated function of treatment parameters and/or clinical factors. Our objective is to provide clinicians and scientists with an accurate, flexible, and user-friendly tool to explore radiotherapy outcome models with different factors leading to tumor control or normal tissue complications. We refer to this tool as the dose response explorer (DREX). Method and Materials: DREX, based on Matlab named-field structures, provides tools for multi-term logistic regression modeling, correlation calculations, and graphical comparisons between model predictions and observations. A GUI-driven interface was constructed using Matlab tools. Named-field structures in Matlab support development of very human-readable databases. Results: The DREX tool provides the NTCP or TCP analyst with multiple features which include: (1) combination of multiple dose-volume variables (mean dose, max dose, Vx (percentage volume receiving x Gy), Dx (dose to x percentage volume), EUD (equivalent uniform dose), etc.) and clinical factors (age, gender, ethnicity, etc.), (2) model analysis using logistic regression, (3) performance assessment using Spearman's rank correlation and receiver operating characteristic (ROC) curves, and (4) graphical capability to visualize NTCP or TCP prediction versus selected variable models using contour and histogram plots. DREX has been in constant use in our research group for the last nine months. Conclusion: We developed user-friendly software to explore and model radiotherapy dose-response correlations. DREX facilitates convenient study of different treatment and clinical factors which may correlate with complication or control. We believe that the DREX software combined with CERR archiving would provide the clinical researcher with convenient tools to accrue and model radiotherapy outcomes data. DREX will be freely distributed via the web. We expect to continue developing DREX, including adding methods to automatically select model terms, find the optimal model size, and estimate parameter uncertainties.
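
A short sketch of the dose-volume metrics listed in the abstract (Vx, Dx, and a generalized EUD), computed from a structure's voxel doses; the dose array and the volume-effect parameter a are hypothetical.

```python
# Dose-volume metrics for a single structure given its voxel doses (Gy).
import numpy as np

def v_x(dose_voxels, x_gy):
    """Percentage of the structure volume receiving at least x Gy."""
    return 100.0 * np.mean(dose_voxels >= x_gy)

def d_x(dose_voxels, x_percent):
    """Minimum dose (Gy) received by the hottest x% of the structure volume."""
    return np.percentile(dose_voxels, 100.0 - x_percent)

def eud(dose_voxels, a):
    """Generalized equivalent uniform dose with volume-effect parameter a."""
    return np.mean(dose_voxels ** a) ** (1.0 / a)

doses = np.random.default_rng(3).normal(20.0, 8.0, size=10000).clip(min=0)
print("V20 = %.1f%%, D5 = %.1f Gy, EUD(a=4) = %.1f Gy"
      % (v_x(doses, 20), d_x(doses, 5), eud(doses, 4)))
```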


Medical Physics | 2005

TU‐C‐T‐617‐06: Importance of Pre‐Fraction Helical CT Isocenter Verification in Extracranial Stereotactic Radiosurgery

A Hope; J Alaly; J Liu; Joseph O. Deasy; Jeffrey D. Bradley; Robert E. Drzymala

Purpose: To quantify the impact of pre-fraction helical CT isocenter verification vs. setup based on planning CT in fractionated extracranial stereotactic radiosurgery. Method and Materials: Treatment plans (Elekta PrecisePlan) and pre-fraction isocenter verification helical CT scans for 12 patients (40 fractions) were recovered from treatment plan archives. All structures were contoured by a single physician at the time of treatment. Each plan was imported into a customizable treatment plan analysis suite (CERR). Using CERR, pre-fraction isocenter verification CT scans were fused with the original treatment plan using the external body frame as a reference. The original planned dose distribution was then translated from the original treatment plan isocenter to the pre-fraction verification isocenter in each fraction. Dose and volume parameters for pertinent structures were automatically extracted using both registration methods (planned or pre-fraction scans) for the original treatment plan and for all subsequent fractions. All patients were treated using the pre-fraction verified isocenter rather than pre-calculated body frame fiducials, as per our institutional policies. Results: GTV volumes on pre-fraction CTs varied from the original planned GTV volume (64%–203%, mean = 101.8 ± 26.5%), largely due to helical sampling of a mobile target. Using the external body frame as the only setup reference would have resulted in geographic misses (<80% coverage of 95% of GTV) in 7/40 (17.5%) fractions. Pre-fraction isocenter verification resulted in improved D95 GTV coverage (88–102%, mean = 99.3 ± 2.4%) with no geographic misses. Conclusion: The current RTOG protocol (0236) evaluating extracranial stereotactic radiosurgery does not require pre-fraction CT tumor position verification. Our institutional policy is to verify isocenter/tumor position prior to each fraction via CT. Although helical scanning artifacts are present, pre-fraction CT-based isocenter verification may provide more consistent tumor coverage than setup to planned body frame fiducials. Conflict of Interest: Support for this research was provided in part by Elekta, Inc.
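
A rough sketch of the dose-translation check described above: shift the planned dose grid by a measured isocenter offset and recompute GTV D95. The dose grid, GTV mask, and offset below are synthetic placeholders, and the image fusion step is reduced to a simple grid shift.

```python
# Shift a planned dose grid by an isocenter offset and compare GTV D95.
import numpy as np
from scipy.ndimage import shift as nd_shift

# Synthetic planned dose: a spherical high-dose region on a coarse grid.
grid = np.indices((40, 40, 40))
center = np.array([20, 20, 20])[:, None, None, None]
r = np.sqrt(((grid - center) ** 2).sum(axis=0))
planned_dose = 60.0 * np.exp(-(np.maximum(r - 5, 0) ** 2) / 8.0)  # Gy

gtv_mask = r <= 5   # synthetic GTV

def d95(dose, mask):
    """Dose received by at least 95% of the structure."""
    return np.percentile(dose[mask], 5)

# Hypothetical per-fraction isocenter offset (in voxels) from the verification CT.
offset = (1.5, -2.0, 0.5)
shifted_dose = nd_shift(planned_dose, offset, order=1)

print("planned D95 = %.1f Gy, D95 at verified isocenter = %.1f Gy"
      % (d95(planned_dose, gtv_mask), d95(shifted_dose, gtv_mask)))
```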


Medical Physics | 2005

SU‐CC‐J‐6C‐08: Automated Beam Direction Selection for IMRT Based On Geometrical Concepts of Viewability and Orthogonality

V Clark; J Alaly; K Zakarian; Joseph O. Deasy

Purpose: To develop a fast and reliable algorithm to automatically select beam angles for intensity-modulated radiation therapy (IMRT) treatment planning. We hypothesized that such an algorithm could be developed based on purely geometrical concepts as a pre-planning step. Method and Materials: We define any point in the target as viewable by a given beam if a ray from that beam, which passes through the point, does not intersect any critical structures (i.e., the point can be ‘viewed’ by that beam). The beam angle selection problem can then be generalized to the multi-set cover problem, a difficult (NP-hard) computer science problem. We also consider orthogonality, which captures beam non-overlap outside the target. We hypothesize that increasing 3-viewability (fraction of points viewable by at least 3 beams) and increasing orthogonality typically results in higher quality IMRT dose distributions. This was tested by comparing, for 30 random sets of 5 beam angles, the objective function based on 3-viewability and orthogonality vs. the final objective function value output of a weighted quadratic IMRT optimization. Our beam angle selection algorithm extends a greedy set cover algorithm and aims to find a set of coplanar angles that will make a maximum fraction of target points at least k-viewable while also maximizing orthogonality. Results: For an IMRT case where beam angle selection impacted IMRT objective function values, there was a strong correlation between the viewability-orthogonality objective function value and the final objective function value of a weighted quadratic IMRT optimization (Spearman correlation coefficient 0.63, p<0.0004). Our proposed algorithm determines beam angles for a typical plan in 2–4 minutes on a 2 GHz PC. Conclusion: We conclude that the purely geometrical concepts of viewability and orthogonality can be used as a basis for automatically and efficiently selecting IMRT beam angles using the class of algorithms proposed.
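
A toy sketch of a greedy, set-cover-style selection step driven by k-viewability; the viewability matrix is random placeholder data and the orthogonality term is omitted for brevity.

```python
# Greedy selection of beam angles that maximize k-viewability of target points.
import numpy as np

rng = np.random.default_rng(4)
n_angles, n_points, k, n_beams = 36, 200, 3, 5

# viewable[a, p] is True if target point p is viewable from candidate angle a
# (i.e., the ray from that angle through p misses all critical structures).
viewable = rng.random((n_angles, n_points)) < 0.5

def k_viewability(selected, k):
    """Fraction of target points seen by at least k of the selected angles."""
    counts = viewable[selected].sum(axis=0)
    return np.mean(counts >= k)

selected = []
for _ in range(n_beams):
    remaining = [a for a in range(n_angles) if a not in selected]
    # Greedy: add the angle giving the largest gain in k-viewability.
    best = max(remaining, key=lambda a: k_viewability(selected + [a], k))
    selected.append(best)

print("chosen angles (indices):", selected)
print("3-viewability: %.2f" % k_viewability(selected, k))
```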


Medical Physics | 2005

SU-FF-T-320: Simple Acoustic Beam Model for Thermoradiotherapy Implemented in An Open Source Treatment Planning Research System

K Zakarian; J Alaly; Petr Novák; Joseph O. Deasy; Eduardo G. Moros

Purpose: Several clinical trials have shown that hyperthermia can significantly increase both local tumor control rates and the duration of local control, significantly improving the quality of radiation treatment. No existing treatment planning systems provide tools for planning and analyzing thermal and radiation doses simultaneously in the same volume of interest. We propose to modify our Matlab-based open-source-code system CERR (Computational Environment for Radiotherapy Research) for the development of thermoradiotherapy treatment planning (radium.wustl.edu/cerr). Method and Materials: We implemented a simple exponentially attenuated acoustic beam model in CERR, accounting for reflection, transmission and refraction of the primary beams. The model applies to sub-volumes that are assumed "homogeneous" (air, soft tissue, bone), that make up a composite "heterogeneous" total computational volume, and that account for interface phenomena, i.e. reflection, transmission and refraction of the primary beams at impedance-mismatched interfaces between sub-volumes. Results: We calculated the SAR (specific absorption rate) for a single acoustic beam at 3.5 MHz for a chest wall breast plan. The field size was 12 cm × 12 cm. Calculation of a single beam takes approximately 60 seconds for a plan size of 512×512×131 voxels. The power deposition of this beam for the CERR plan is shown. An attenuation profile for the beam is shown. The model correctly shows zero SAR values outside the beam and in the lung areas. Conclusion: We propose tools to display SAR, temperatures, thermal doses, hybrid thermo-radiotherapy doses, etc., simultaneously, along with calculation of volume histograms for the various dose parameters. Significant advances in clinical thermoradiotherapy have been hampered by the lack of advanced treatment planning systems. We are embarking on a long-term project to develop a CERR-based system for superficial ultrasound hyperthermia that includes implementation and validation of complex acousto-thermal numerical models. The system will be freely distributed to the hyperthermia research community for IRB-approved research.
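
A minimal sketch of an exponentially attenuated plane-wave SAR profile for a single homogeneous soft-tissue path, assuming all attenuated energy is absorbed locally; the attenuation coefficient is a placeholder, and the interface effects (reflection, transmission, refraction) handled by the CERR model are not included.

```python
# SAR vs. depth for a plane wave in a single homogeneous medium.
import numpy as np

def sar_depth_profile(depth_cm, mu_per_cm=0.4, i0_w_per_cm2=1.0, rho_g_per_cm3=1.0):
    """SAR (W/g) vs. depth, assuming intensity I(z) = I0*exp(-mu*z) and that
    all attenuated energy is absorbed locally (SAR = mu*I/rho)."""
    intensity = i0_w_per_cm2 * np.exp(-mu_per_cm * np.asarray(depth_cm))
    return mu_per_cm * intensity / rho_g_per_cm3

depths = np.linspace(0.0, 8.0, 9)   # cm
for d, s in zip(depths, sar_depth_profile(depths)):
    print(f"depth {d:.0f} cm: SAR = {s:.3f} W/g")
```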

Collaboration


Dive into J Alaly's collaborations.

Top Co-Authors

Joseph O. Deasy, Memorial Sloan Kettering Cancer Center
A Hope, Washington University in St. Louis
I. El Naqa, Washington University in St. Louis
K Zakarian, Washington University in St. Louis
Jeffrey D. Bradley, Washington University in St. Louis
P.E. Lindsay, Washington University in St. Louis
A Apte, Washington University in St. Louis
D Low, Washington University in St. Louis
J Cui, Washington University in St. Louis
Milos Vicic, Washington University in St. Louis