
Publications

Featured research published by Neal B. Gallagher.


Journal of Process Control | 1996

The process chemometrics approach to process monitoring and fault detection

Barry M. Wise; Neal B. Gallagher

Chemometrics, the application of mathematical and statistical methods to the analysis of chemical data, is finding ever widening applications in the chemical process environment. This article reviews the chemometrics approach to chemical process monitoring and fault detection. These approaches rely on the formation of a mathematical/statistical model that is based on historical process data. New process data can then be compared with models of normal operation in order to detect a change in the system. Typical modelling approaches rely on principal components analysis, partial least squares and a variety of other chemometric methods. Applications where the ordered nature of the data is taken into account explicitly are also beginning to see use. This article reviews the state-of-the-art of process chemometrics and current trends in research and applications.
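The monitoring scheme this abstract describes is commonly implemented by fitting a PCA model to historical normal-operation data and then tracking Hotelling's T2 and Q (squared prediction error) statistics for new samples. The sketch below illustrates that generic idea with NumPy; the synthetic data, the number of components, and the function names are illustrative assumptions, not material from the paper.

```python
# Minimal sketch of PCA-based process monitoring (a generic illustration,
# not the authors' exact implementation): fit a PCA model on historical
# "normal operation" data, then compute Hotelling T^2 and Q (squared
# prediction error) statistics for new samples.
import numpy as np

def fit_pca_monitor(X_normal, n_components=3):
    """Build a PCA model (mean, scale, loadings, score variances) from normal data."""
    mean = X_normal.mean(axis=0)
    std = X_normal.std(axis=0, ddof=1)
    Xs = (X_normal - mean) / std                      # autoscale
    U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
    P = Vt[:n_components].T                           # loadings (variables x PCs)
    score_var = (S[:n_components] ** 2) / (X_normal.shape[0] - 1)
    return {"mean": mean, "std": std, "P": P, "score_var": score_var}

def monitor(model, x_new):
    """Return (T2, Q) statistics for one new sample."""
    xs = (x_new - model["mean"]) / model["std"]
    t = model["P"].T @ xs                             # scores
    T2 = np.sum(t ** 2 / model["score_var"])          # Hotelling T^2
    residual = xs - model["P"] @ t
    Q = residual @ residual                           # squared prediction error
    return T2, Q

# Illustrative example with synthetic "historical" data
rng = np.random.default_rng(0)
X_hist = rng.normal(size=(200, 10))
pca_model = fit_pca_monitor(X_hist, n_components=3)
print(monitor(pca_model, rng.normal(size=10)))
```

In practice, control limits for T2 and Q would be derived from the normal-operation data and a new sample exceeding either limit would be flagged as a possible fault.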


Journal of Chemometrics | 1999

A comparison of principal component analysis, multiway principal component analysis, trilinear decomposition and parallel factor analysis for fault detection in a semiconductor etch process

Barry M. Wise; Neal B. Gallagher; Stephanie Watts Butler; Daniel D. White; Gabriel G. Barna

Multivariate statistical process control (MSPC) tools have been developed for monitoring a Lam 9600 TCP metal etcher at Texas Instruments. These tools are used to determine if the etch process is operating normally or if a system fault has occurred. Application of these methods is complicated because the etch process data exhibit a large amount of normal systematic variation. Variations due to faults of process concern can be relatively minor in comparison. The Lam 9600 used in this study is equipped with several sensor systems including engineering variables (e.g. pressure, gas flow rates and power), spatially resolved optical emission spectroscopy (OES) of the plasma and a radio‐frequency monitoring (RFM) system to monitor the power and phase relationships of the plasma generator. A variety of analysis methods and data preprocessing techniques have been tested for their sensitivity to specific system faults. These methods have been applied to data from each of the sensor systems separately and in combination. The performance of the methods on a set of benchmark fault detection problems is presented and the strengths and weaknesses of the methods are discussed, along with the relative advantages of each of the sensor systems.


IFAC Proceedings Volumes | 1997

Development and Benchmarking of Multivariate Statistical Process Control Tools for a Semiconductor Etch Process: Improving Robustness through Model Updating

Neal B. Gallagher; Barry M. Wise; Stephanie Watts Butler; Daniel D. White; Gabriel G. Barna

Multivariate Statistical Process Control tools have been developed for monitoring and fault detection on a Lam 9600 Metal Etcher. Application of these methods is complicated because the process data exhibits large amounts of normal variation that is continuous on some time scales and discontinuous on others. Variations due to faults can be minor in comparison. Several models based on principal components analysis and variants which incorporate methods for model updating have been tested for long term robustness and sensitivity to known faults. Model performance was assessed with about six months' worth of process data and a set of benchmark fault detection problems.
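Model updating can take many forms; one common scheme (an assumption here, not necessarily one of the variants tested in the paper) is to exponentially weight the mean and covariance toward recent normal data and re-derive the PCA loadings from the updated covariance, as sketched below. The forgetting factor and array sizes are placeholders.

```python
# Minimal sketch of one generic PCA model-updating scheme (an assumption,
# not the paper's method): exponentially weighted update of the mean and
# covariance, followed by re-computation of the loadings.
import numpy as np

def update_model(mean, cov, x_new, forgetting=0.99):
    """Exponentially weighted update of mean and covariance with one new sample."""
    new_mean = forgetting * mean + (1.0 - forgetting) * x_new
    dx = (x_new - new_mean).reshape(-1, 1)
    new_cov = forgetting * cov + (1.0 - forgetting) * (dx @ dx.T)
    return new_mean, new_cov

def loadings_from_cov(cov, n_components=3):
    """PCA loadings as the leading eigenvectors of the covariance matrix."""
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]
    return eigvecs[:, order[:n_components]], eigvals[order[:n_components]]

# Illustrative usage on synthetic data
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
mean, cov = X.mean(axis=0), np.cov(X, rowvar=False)
for x in rng.normal(size=(20, 5)):        # stream of new "normal" samples
    mean, cov = update_model(mean, cov, x)
P, leading_variances = loadings_from_cov(cov)
print(leading_variances)
```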


Computers & Chemical Engineering | 1996

Application of multi-way principal components analysis to nuclear waste storage tank monitoring

Neal B. Gallagher; Barry M. Wise; Charles W. Stewart

Multivariate statistical process control (MSPC) techniques for batch processes have been extended to monitoring a semi-batch process by focussing on periodic process set point changes. This procedure can be extended to continuous processes that have repeated upsets or perturbations. The MSPC technique was demonstrated for a nuclear waste storage tank that undergoes periodic agitation from a mixing pump. The procedure described here used multi-way principal components analysis to develop a statistical model of the process based on historical data. The model can be used to determine if changes have occurred in the system. At present this procedure is used off-line for monitoring but it could be implemented on-line.
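The multi-way PCA procedure described above is typically implemented by unfolding a three-way data array into a two-dimensional matrix and fitting an ordinary PCA model to the normal batches. The sketch below shows that generic unfold-and-fit pattern on synthetic data; the array shapes and the monitoring statistic are illustrative assumptions, not the tank data or the paper's code.

```python
# Minimal sketch of multi-way PCA for batch/semi-batch monitoring: unfold a
# 3-way array (batches x time x variables) batch-wise, fit PCA on normal
# batches, and score a new batch by its squared prediction error.
import numpy as np

def unfold(batch_array):
    """Unfold (batches, time, variables) -> (batches, time*variables)."""
    n_batches = batch_array.shape[0]
    return batch_array.reshape(n_batches, -1)

def fit_mpca(X3, n_components=2):
    X = unfold(X3)
    mean = X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return {"mean": mean, "P": Vt[:n_components].T}

def batch_spe(model, new_batch):
    """Squared prediction error of a single new batch trajectory."""
    x = new_batch.reshape(-1) - model["mean"]
    t = model["P"].T @ x
    residual = x - model["P"] @ t
    return float(residual @ residual)

# Illustrative synthetic example: 30 normal batches, 50 time points, 4 variables
rng = np.random.default_rng(2)
normal_batches = rng.normal(size=(30, 50, 4))
mpca = fit_mpca(normal_batches)
print(batch_spe(mpca, rng.normal(size=(50, 4))))
```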


IFAC Proceedings Volumes | 1995

The Process Chemometrics Approach to Process Monitoring and Fault Detection

Barry M. Wise; Neal B. Gallagher

Chemometrics, the application of mathematical and statistical methods to the analysis of chemical data, is finding ever widening applications in the chemical process environment. This article reviews the chemometrics approach to chemical process monitoring and fault detection. These approaches rely on the formation of a mathematical/statistical model that is based on historical process data. Process data can then be compared with models of normal operation in order to detect a change in the system. Typical modeling approaches rely on principal components analysis, partial least squares and a variety of other chemometric methods. Applications where the ordered nature of the data is taken into account explicitly are also beginning to see use. This article reviews the state-of-the-art of process chemometrics and current trends in research and applications.


Analytica Chimica Acta | 2003

Estimation of trace vapor concentration-pathlength in plumes for remote sensing applications from hyperspectral images

Neal B. Gallagher; Barry M. Wise; David M. Sheen

A novel approach for quantification of chemical vapor effluents in stack plumes using infrared hyperspectral imaging is presented and examined. The algorithms use a novel application of the extended mixture model to provide estimates of background clutter in the on-plume pixel. These estimates are then used iteratively to improve the quantification. The final step in the algorithm employs either an extended least-squares (ELS) or generalized least-squares (GLS) procedure. It was found that the GLS weighting procedure generally performed better than ELS, but they performed similarly when the analyte spectra had relatively narrow features. The algorithms require estimates of the atmospheric radiance and transmission from the target plume to the imaging spectrometer and an estimate of the plume temperature. However, estimates of the background temperature and emissivity are not required, which is a distinct advantage. The algorithm effectively provides a local estimate of the clutter, and an error analysis shows that it can provide superior quantification over approaches that model the background clutter in a more global sense. It was also found that the estimation error depended strongly on the net analyte signal for each analyte, and this quantity is scenario-specific.
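The final GLS step has the familiar weighted least-squares form c = (S'WS)^-1 S'Wr, where W is the inverse of an estimated clutter covariance. The sketch below shows only that estimator on synthetic spectra; the iterative background estimation and atmospheric terms from the paper are omitted, and all variable names and shapes are assumptions.

```python
# Minimal sketch of generalized least-squares (GLS) estimation of
# concentration-pathlengths, with the clutter covariance supplying the weight.
# The background/atmosphere handling of the paper is not reproduced here.
import numpy as np

def gls_estimate(r, S, clutter_cov):
    """
    Estimate concentration-pathlengths c from an observed residual spectrum r and
    analyte spectra S (wavelengths x analytes): c = (S' W S)^-1 S' W r, W = cov^-1.
    """
    W = np.linalg.inv(clutter_cov)
    return np.linalg.solve(S.T @ W @ S, S.T @ W @ r)

# Illustrative synthetic example
rng = np.random.default_rng(3)
n_wavelengths, n_analytes = 120, 2
S = rng.normal(size=(n_wavelengths, n_analytes))      # stand-in analyte spectra
true_c = np.array([0.5, 1.5])
clutter_cov = 0.1 * np.eye(n_wavelengths)             # stand-in clutter covariance
r = S @ true_c + rng.multivariate_normal(np.zeros(n_wavelengths), clutter_cov)
print(gls_estimate(r, S, clutter_cov))                # should be close to true_c
```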


Chemometrics and Intelligent Laboratory Systems | 1995

A comparison of neural networks, non-linear biased regression and a genetic algorithm for dynamic model identification

Barry M. Wise; Bradley R. Holt; Neal B. Gallagher; Samuel Lee

A variety of non-linear modeling techniques were applied to a single input/single output dynamic model identification problem. Results of the tests show that the prediction error of an artificial neural network with direct linear feed through terms is nearly as good as or better than that of the other methods when tested on new data. However, non-linear models with nearly equal and occasionally better performance can be developed (including the selection of the model form and order) with a genetic algorithm (GA) in far less computer time. The GA derived models have the additional advantage of being more parsimonious and can be reparameterized, if need be, extremely rapidly. The non-linear biased regression techniques tested typically had larger, though possibly acceptable, prediction errors. These model structures offer the advantage of low computational requirements and reproducibility, i.e. the same model is produced each time for a given data set.
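As a rough illustration of GA-based structure selection (not the paper's algorithm or its non-linear model forms), the sketch below uses a small genetic algorithm to choose the lag orders of a linear ARX model for a synthetic single-input/single-output data set. The population size, mutation rate, and data-generating process are all assumptions.

```python
# Minimal sketch of a genetic algorithm selecting ARX model structure
# (lag orders) for SISO identification; a generic illustration only.
import numpy as np

rng = np.random.default_rng(4)

def arx_prediction_error(u, y, na, nb):
    """Least-squares ARX fit y[k] = sum a_i*y[k-i] + sum b_j*u[k-j]; return RMS error."""
    k0 = max(na, nb)
    rows = [np.concatenate([y[k - na:k][::-1], u[k - nb:k][::-1]])
            for k in range(k0, len(y))]
    Phi, target = np.array(rows), y[k0:]
    theta, *_ = np.linalg.lstsq(Phi, target, rcond=None)
    return float(np.sqrt(np.mean((target - Phi @ theta) ** 2)))

def genetic_order_search(u, y, pop_size=12, generations=15, max_order=6):
    pop = rng.integers(1, max_order + 1, size=(pop_size, 2))      # each row = (na, nb)
    for _ in range(generations):
        fitness = np.array([arx_prediction_error(u, y, na, nb) for na, nb in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]       # keep best half
        children = parents[rng.integers(0, len(parents), size=pop_size - len(parents))].copy()
        mutate = rng.random(children.shape) < 0.2                 # random mutation
        children[mutate] = rng.integers(1, max_order + 1, size=mutate.sum())
        pop = np.vstack([parents, children])
    return tuple(min(pop, key=lambda g: arx_prediction_error(u, y, *g)))

# Illustrative synthetic SISO data generated by a stable second-order process
u = rng.normal(size=400)
y = np.zeros(400)
for k in range(2, 400):
    y[k] = 1.2 * y[k - 1] - 0.5 * y[k - 2] + 0.3 * u[k - 1] + 0.01 * rng.normal()
print(genetic_order_search(u, y))
```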


IFAC Proceedings Volumes | 1997

Development and Benchmarking of Multivariate Statistical Process Control Tools for a Semiconductor ETCH Process: Impact of Measurement Selection and Data Treatment on Sensitivity

Barry M. Wise; Neal B. Gallagher; Stephanie Watts Butler; Daniel D. White; Gabriel G. Barna

Multivariate Statistical Process Control (MSPC) tools have been developed for monitoring a Lam 9600 TCP Metal Etcher at Texas Instruments. These tools are used to determine if the etch process is operating normally or if a system fault has occurred. Application of these methods is complicated because the etch process data exhibits a large amount of normal systematic variation. Variations due to faults of process concern can be relatively minor in comparison. The Lam 9600 used in this study is equipped with several sensor systems including engineering variables (e.g. pressure, gas flow rates and power), spatially resolved Optical Emission Spectroscopy (OES) of the plasma and a Radio Frequency Monitoring (RFM) system to monitor the power and phase relationships of the plasma generator. A variety of analysis methods and data preprocessing techniques have been tested for their sensitivity to specific system faults. These methods have been applied to data from each of the sensor systems separately and in combination. The performance of the methods on a set of benchmark fault detection problems will be presented and the strengths and weaknesses of the methods will be discussed, along with the relative advantages of each of the sensor systems.


Applied Spectroscopy | 2003

Error Analysis for Estimation of Trace Vapor Concentration Pathlength in Stack Plumes

Neal B. Gallagher; Barry M. Wise; David M. Sheen

Near-infrared hyperspectral imaging is finding utility in remote sensing applications such as detection and quantification of chemical vapor effluents in stack plumes. Optimizing the sensing system or quantification algorithms is difficult because reference images are rarely well characterized. The present work uses a radiance model for a down-looking scene and a detailed noise model for dispersive and Fourier transform spectrometers to generate well-characterized synthetic data. These data were used with a classical least-squares-based estimator in an error analysis to obtain estimates of different sources of concentration-pathlength quantification error in the remote sensing problem. Contributions to the overall quantification error were the sum of individual error terms related to estimating the background, atmospheric corrections, plume temperature, and instrument signal-to-noise ratio. It was found that the quantification error depended strongly on errors in the background estimate and second-most on instrument signal-to-noise ratio. Decreases in net analyte signal (e.g., due to low analyte absorbance or increasing the number of analytes in the plume) led to increases in the quantification error as expected. These observations have implications for instrument design and strategies for quantification. The outlined approach could be used to estimate detection limits or perform variable selection for given sensing problems.
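The overall workflow, synthetic spectra with known truth passed through a least-squares estimator to tabulate quantification error, can be mimicked with a simple Monte Carlo study. The sketch below does this for a plain CLS estimator at a few signal-to-noise ratios; the radiance and instrument noise models are deliberately simplified placeholders rather than the detailed models used in the paper.

```python
# Minimal sketch of a Monte Carlo error analysis for a classical least-squares
# (CLS) estimator: perturb synthetic spectra with noise at several SNRs and
# tabulate the per-analyte estimation error. All models here are simplified.
import numpy as np

rng = np.random.default_rng(5)

def cls_estimate(r, S):
    """Ordinary (classical) least-squares estimate: c = (S'S)^-1 S' r."""
    return np.linalg.solve(S.T @ S, S.T @ r)

n_wavelengths, n_analytes, n_trials = 150, 3, 500
S = rng.normal(size=(n_wavelengths, n_analytes))      # stand-in analyte spectra
true_c = np.array([1.0, 0.5, 0.2])
signal = S @ true_c

for snr in (10.0, 50.0, 200.0):
    noise_sigma = np.linalg.norm(signal) / (np.sqrt(n_wavelengths) * snr)
    errors = []
    for _ in range(n_trials):
        r = signal + noise_sigma * rng.normal(size=n_wavelengths)
        errors.append(cls_estimate(r, S) - true_c)
    rmse = np.sqrt(np.mean(np.square(errors), axis=0))
    print(f"SNR {snr:6.0f}: per-analyte RMSE {rmse}")
```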


Analytica Chimica Acta | 2003

Classical Least Squares Transformations of Sensor Array Pattern Vectors into Vapor Descriptors. Simulation of Arrays of Polymer-Coated Surface Acoustic Wave Sensors with Mass-Plus-Volume Transduction Mechanisms

Jay W. Grate; Barry M. Wise; Neal B. Gallagher

A new method of processing multivariate response data to extract chemical information has been developed. Sensor array response patterns are transformed into a vector containing values for solvation parameter descriptors of the detected vapor’s properties. These results can be obtained by using a method similar to classical least squares (CLS), and equations have been derived for mass- or volume-transducing sensors. Polymer-coated acoustic wave devices are an example of mass-transducing sensors. However, some acoustic wave sensors, such as polymer-coated surface acoustic wave (SAW) devices give responses resulting from both mass-loading and decreases in modulus. The latter effect can be modeled as a volume effect. In this paper, we derive solutions for obtaining descriptor values from arrays of mass-plus-volume-transducing sensors. Simulations were performed to investigate the effectiveness of these solutions and compared with solutions for purely mass-transducing sensor arrays. It is concluded that this new method of processing sensor array data can be applied to SAW sensor arrays even when the modulus changes contribute to the responses. The simulations show that good estimations of vapor descriptors can be obtained by using a closed form estimation approach that is similar to the closed form solution for purely mass-transducing sensor arrays. Estimations can be improved using a nonlinear least squares optimization method. The results also suggest ways to design SAW arrays to obtain the best results, either by minimizing the volume sensitivity or matching the volume sensitivities in the array.
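For the purely mass-transducing case, the transformation amounts to a least-squares inversion of a linear model relating descriptor values to the response pattern. The sketch below shows that closed-form step on a synthetic coefficient matrix; the mass-plus-volume extension and the actual solvation parameter coefficients from the paper are not reproduced here.

```python
# Minimal sketch of a CLS-style transformation of a sensor array response
# pattern into vapor descriptor values, assuming a linear model y = P d.
# The coefficient matrix and responses below are synthetic placeholders.
import numpy as np

def descriptors_from_pattern(y, P):
    """Estimate descriptor vector d from response pattern y under the model y = P d."""
    d, *_ = np.linalg.lstsq(P, y, rcond=None)
    return d

# Illustrative example: 8-sensor array, 5 solvation parameter descriptors
rng = np.random.default_rng(6)
n_sensors, n_descriptors = 8, 5
P = rng.normal(size=(n_sensors, n_descriptors))       # sensor coefficient matrix (synthetic)
true_d = rng.normal(size=n_descriptors)               # "true" vapor descriptors
y = P @ true_d + 0.01 * rng.normal(size=n_sensors)    # measured pattern with noise
print(descriptors_from_pattern(y, P))                 # should be close to true_d
```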

Collaboration


Dive into Neal B. Gallagher's collaborations.

Top Co-Authors

Barry M. Wise (University of Washington)

Jay W. Grate (Pacific Northwest National Laboratory)

David M. Sheen (Pacific Northwest National Laboratory)

E.B. Martin (University of Newcastle)