Publication


Featured research published by Stephen P. Gurden.


Chemometrics and Intelligent Laboratory Systems | 2000

Generalized contribution plots in multivariate statistical process monitoring

Johan A. Westerhuis; Stephen P. Gurden; Age K. Smilde

This paper discusses contribution plots for both the D-statistic and the Q-statistic in multivariate statistical process control of batch processes. Contributions of process variables to the D-statistic are generalized to any type of latent variable model with or without orthogonality constraints. The calculation of contributions to the Q-statistic is discussed. Control limits for both types of contributions are introduced to show the relative importance of a contribution compared to the contributions of the corresponding process variables in the batches obtained under normal operating conditions. The contributions are introduced for off-line monitoring of batch processes, but can easily be extended to on-line monitoring and to continuous processes, as is shown in this paper.
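The contribution idea for the Q-statistic can be illustrated with a short sketch: after fitting a latent variable model to normal-operating-condition (NOC) data, the squared residual of each process variable is its contribution to Q. The Python snippet below is a minimal illustration using random placeholder data and a plain PCA model; it is not the authors' generalized formulation for the D-statistic.

```python
import numpy as np

rng = np.random.default_rng(0)
X_noc = rng.normal(size=(50, 10))   # placeholder NOC data: 50 batches x 10 process variables
x_new = rng.normal(size=10)         # new observation to be monitored

# Mean-centre with NOC statistics and fit a 3-component PCA via SVD
mu = X_noc.mean(axis=0)
Xc = X_noc - mu
P = np.linalg.svd(Xc, full_matrices=False)[2][:3].T   # loadings: variables x components

# Reconstruct the new observation from the latent model
t_new = (x_new - mu) @ P
residual = (x_new - mu) - t_new @ P.T

q_contrib = residual ** 2           # per-variable contributions to the Q-statistic
Q = q_contrib.sum()                 # overall Q-statistic
```

Plotting q_contrib as a bar chart, together with limits derived from the NOC residuals, gives the kind of contribution plot the paper discusses.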


Chemical Engineering Science | 2002

Critical evaluation of approaches for on-line batch process monitoring

Eric N.M. van Sprang; Henk-Jan Ramaker; Johan A. Westerhuis; Stephen P. Gurden; Age K. Smilde

Since the introduction of batch process monitoring using component models in 1992, different approaches for statistical batch process monitoring have been suggested in the literature. This paper gives the first evaluation of five of these proposed approaches. The differences and similarities between the approaches are highlighted, and the derivation of control charts for each approach is discussed. A control chart should give fast and reliable detection of disturbances in the process. These features are evaluated for each approach by means of two performance indices: first, the action signal time for various disturbed batches is tested; secondly, the probability of a false warning in a control chart is computed. In order to evaluate the five approaches, five different data sets are studied: one simulation of a batch process, three batch processes obtained from industry and one laboratory spectral data set. The results obtained for the performance indices are summarised and discussed, and recommendations helpful for practical use are given.
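As a rough illustration of the two performance indices, the sketch below computes an action signal time (the first out-of-control sample of a disturbed batch) and a false-alarm rate on in-control data. The monitoring statistic, control limit and data are placeholders, not those used in the paper.

```python
import numpy as np

def action_signal_time(stat, limit):
    """Index of the first out-of-control sample, or None if the chart never signals."""
    out = np.flatnonzero(np.asarray(stat) > limit)
    return int(out[0]) if out.size else None

def false_alarm_rate(noc_stats, limit):
    """Fraction of in-control samples that (wrongly) exceed the control limit."""
    return float((np.asarray(noc_stats) > limit).mean())

rng = np.random.default_rng(1)
limit = 9.0                                           # illustrative control limit
# A disturbed batch: in-control behaviour for 40 samples, then a shift
disturbed = np.concatenate([rng.chisquare(3, 40), rng.chisquare(3, 20) + 8])
print(action_signal_time(disturbed, limit))           # detection speed
print(false_alarm_rate(rng.chisquare(3, 1000), limit))  # reliability on NOC data
```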


Chemometrics and Intelligent Laboratory Systems | 2001

A comparison of multiway regression and scaling methods

Stephen P. Gurden; Johan A. Westerhuis; Rasmus Bro; Age K. Smilde

Recent years have seen an increase in the number of regression problems for which the predictor and/or response arrays have orders higher than two, i.e. multiway data. Examples are found in industrial batch process analysis, chemical calibration using second-order instrumentation and quantitative structure–activity relationships (QSAR). As these types of problems increase in complexity in terms of both the dimensions and the underlying structures of the data sets, the number of options with respect to different types of scaling and regression models also increases. In this article, three methods for multiway regression are compared: unfold partial least squares (PLS), multilinear PLS and multiway covariates regression (MCovR). The three methods differ in either the structural model imposed on the data or the way in which the model components are calculated. Three methods of scaling multiway arrays are also compared, along with the option of applying no scaling. Three data sets drawn from industrial processes are used in the comparison. The general conclusion is that, in terms of predictive ability, the type of data and scaling used is more important than the type of regression model used. The models do differ, however, in terms of interpretability.
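Unfold PLS, the simplest of the three regression methods compared, can be sketched as follows: the three-way array is unfolded batch-wise into a two-way matrix, one of the possible scalings is applied, and an ordinary PLS regression is fitted. The array sizes, response variable and choice of autoscaling below are illustrative assumptions, not the industrial data sets of the study.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
X3 = rng.normal(size=(30, 8, 25))    # placeholder: 30 batches x 8 variables x 25 time points
y = rng.normal(size=30)              # hypothetical quality variable to be predicted

# Batch-wise unfolding: each batch becomes one long row (variables x time)
X = X3.reshape(30, -1)

# One of several possible scalings: autoscale each unfolded column
X = (X - X.mean(axis=0)) / X.std(axis=0)

pls = PLSRegression(n_components=3).fit(X, y)
y_hat = pls.predict(X).ravel()
```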


Journal of Chemometrics | 2000

Standardized Q-statistic for improved sensitivity in the monitoring of residuals in MSPC

Johan A. Westerhuis; Stephen P. Gurden; Age K. Smilde

This paper presents the standardized Q-statistic for monitoring residuals of latent variable models in multivariate statistical process control (MSPC). Before the summation of the squared residuals, they are scaled according to their expected variation obtained from normal operating conditions (NOC) data. Data from a simulated batch process and from an industrial batch process are used to show that this scaling improves the sensitivity of the Q-statistic considerably. The standardized Q-statistic is introduced for the off-line monitoring of batch processes, but it can also be used for the monitoring of continuous processes as well as for the on-line monitoring of batch processes.
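A minimal numpy sketch of the underlying idea, assuming a simple PCA model and synthetic NOC data: the per-variable residual standard deviations are estimated from the NOC data and used to scale the residuals of a new observation before squaring and summing.

```python
import numpy as np

rng = np.random.default_rng(3)
X_noc = rng.normal(size=(60, 12))                      # placeholder NOC data
mu = X_noc.mean(axis=0)
Xc = X_noc - mu
P = np.linalg.svd(Xc, full_matrices=False)[2][:4].T    # 4-component PCA loadings

# Residuals of the NOC data give the expected per-variable residual spread
E_noc = Xc - Xc @ P @ P.T
sigma = E_noc.std(axis=0, ddof=1)

def standardized_Q(x):
    """Sum of squared residuals, each scaled by its expected NOC variation."""
    e = (x - mu) - ((x - mu) @ P) @ P.T
    return float(np.sum((e / sigma) ** 2))

print(standardized_Q(rng.normal(size=12)))
```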


Journal of Process Control | 2002

Improved monitoring of batch processes by incorporating external information

Henk-Jan Ramaker; E.N.M. van Sprang; Stephen P. Gurden; Johan A. Westerhuis; Age K. Smilde

This paper gives an overview of statistical process monitoring, with emphasis on batch processes and the steps that can be taken to improve it by incorporating external information. First, the general concept of statistical process monitoring of batches is explained. This concept has already been shown to be successful, as evidenced by the number of reported industrial applications. The performance of statistical process monitoring of batch processes can be enhanced by incorporating external information. Two types of external information can be distinguished: batch-run-specific and process-specific information. Various examples of both types are given. Several ideas of how to incorporate the external information in model development are discussed. The concept of incorporating process-specific information is highlighted by an example of a grey model, which is applied to a biochemical batch process that is spectroscopically monitored.


Journal of Microscopy | 2004

Quantitative analysis and classification of AFM images of human hair

Stephen P. Gurden; V. F. Monteiro; E. Longo; Márcia M. C. Ferreira

The surface topography of human hair, as defined by the outer layer of cellular sheets, termed cuticles, largely determines the cosmetic properties of the hair. The condition of the cuticles is of great cosmetic importance, but also has the potential to aid diagnosis in the medical and forensic sciences. Atomic force microscopy (AFM) has been demonstrated to offer unique advantages for analysis of the hair surface, mainly due to the high image resolution and the ease of sample preparation. This article presents an algorithm for the automatic analysis of AFM images of human hair. The cuticular structure is characterized using a series of descriptors, such as step height, tilt angle and cuticle density, allowing quantitative analysis and comparison of different images. The usefulness of this approach is demonstrated by a classification study. Thirty-eight AFM images were measured, consisting of (a) untreated and bleached hair samples, and (b) samples from the root and distal ends of the hair fibre. The multivariate classification technique partial least squares discriminant analysis (PLS-DA) is used to test the ability of the algorithm to characterize the images according to the properties of the hair samples. Most of the images (86%) were found to be classified correctly.
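The classification step can be sketched as a PLS-DA implemented as a PLS regression on a dummy-coded class variable, applied to a table of per-image descriptors. The descriptor values and class labels below are invented placeholders, not the measured AFM data from the study.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
X = rng.normal(size=(38, 5))          # 38 images x 5 cuticle descriptors (placeholder values)
y = rng.integers(0, 2, size=38)       # dummy labels, e.g. 0 = untreated, 1 = bleached

# PLS-DA: regress the dummy-coded class on the descriptors,
# then threshold the prediction at 0.5 to assign a class.
pls = PLSRegression(n_components=2).fit(X, y.astype(float))
y_pred = (pls.predict(X).ravel() > 0.5).astype(int)
accuracy = (y_pred == y).mean()
```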


Analytica Chimica Acta | 2000

Calibration and detailed analysis of second-order flow injection analysis data with rank overlap

Marlon M. Reis; Stephen P. Gurden; Age K. Smilde; Márcia M. C. Ferreira

With the current popularity of second-order (or hyphenated) instruments, there now exists a number of chemometric techniques for the so-called second-order calibration problem, i.e. that of quantifying an analyte of interest in the presence of one (or more) unknown interferent(s). Second-order instruments produce data of varying complexity, one particular phenomenon sometimes encountered being that of rank overlap (or rank deficiency), where the overall rank of the data is not equal to the sum of the ranks of the contributing species. The purpose of the present work is to evaluate the performance of two second-order calibration methods, a least squares-based and an eigenvalue-based solution, in terms of their quantitative ability and stability, as applied to flow injection analysis (FIA) data which exhibits rank overlap. In the presence of high collinearity in the data, the least squares method is found to give a more stable solution. Two-mode component analysis (TMCA) is used to investigate the reasons for this difference in terms of the chemical properties of the species analysed. The success of second-order calibration of this data is found to depend strongly on the collinearity between the acidic and basic time profiles and the reproducibility of the pH gradient in the FIA channel, both of which are shown to be related to the pKa values of the species.


Applied Spectroscopy | 2003

Near-infrared spectroscopic monitoring of a series of industrial batch processes using a bilinear grey model

Eric N.M. van Sprang; Henk-Jan Ramaker; Johan A. Westerhuis; Age K. Smilde; Stephen P. Gurden; Dietrich Wienke

A good process understanding is the foundation for process optimization, process monitoring, end-point detection, and estimation of the end-product quality. Good process measurements and the construction of process models contribute to a better process understanding. To improve process knowledge it is common to build process models. These models are often based on first principles such as kinetic rates or mass balances, and are also known as hard or white models. White models are generally applicable but often give only a reasonable fit to real process data. Other commonly used types of models are empirical or black-box models, such as regression and neural nets. Black-box models typically fit the data well but lack a chemically meaningful interpretation. Grey models are combinations of white and black-box models, and aim to combine the advantages of both. In a qualitative case study of monitoring industrial batches using near-infrared (NIR) spectroscopy, it is shown that grey models are a good tool for detecting batch-to-batch variations and an excellent tool for process diagnosis compared to common spectroscopic monitoring tools.
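A rough sketch of the grey-model idea, with placeholder kinetics and spectra rather than the industrial process from the study: a simple first-principles (white) prediction is combined with a data-driven (black) PLS model fitted to the part of the target that the white model leaves unexplained.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
t = np.linspace(0, 1, 50)
conc_white = np.exp(-3 * t)                        # white part: assumed first-order decay
spectra = rng.normal(size=(50, 40))                # hypothetical NIR spectra over the batch
target = conc_white + 0.05 * rng.normal(size=50)   # measured quantity to be predicted

# Black part: PLS model for the residual the white model cannot explain
residual = target - conc_white
black = PLSRegression(n_components=2).fit(spectra, residual)

# Grey prediction = white (first principles) + black (data-driven correction)
grey_prediction = conc_white + black.predict(spectra).ravel()
```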


Chemometrics and Intelligent Laboratory Systems | 1998

The introduction of process chemometrics into an industrial pilot plant laboratory

Stephen P. Gurden; E.B. Martin; A.J. Morris

Process chemometrics is the application of multivariate statistical methods to industrial process data characterised by a large number of correlated process measurements. In this paper, we aim to show how multivariate techniques have been used in a pilot plant environment with the objective of increasing the general understanding of the process despite having access to limited data. The use of process trajectory plots to follow the operation of the plant is discussed, along with statistical indicators for the detection and diagnosis of process disturbances. The effect of process conditions on product quality is analysed using cross-correlation with latent variables, and significant process variables and time-delay structures are identified. The experience and process understanding gained by the pilot plant staff has enabled them to propose the installation of new sensors and analysers based upon sound business benefits.


Analyst | 1995

Resolution of mid-infrared spectra by factor analysis using spherical projections: influence of noise, spectral similarity, wavelength resolution and mixture composition on success of the method

Stephen P. Gurden; Richard G. Brereton; John A. Groves

A new method for the resolution and recovery of mid-infrared spectra by factor analysis is described. The key to the method is to determine a few 'composition-one' points in a set of mixture spectra, where one component uniquely absorbs. The method involves filtering the data using Savitzky–Golay filters, performing principal components analysis, eliminating composition-zero (noise) points, normalizing the scores (projection onto the surface of a hypersphere), determining the best N composition-one points for each compound, and finally factor rotation and recovery of the spectra. The method is evaluated using two criteria of success, namely the number of true composition-one points recovered and the correlation between the true and recovered spectra. The influence of spectral similarity, spectral resolution, component concentration, noise levels and cut-off threshold is investigated on two separate simulated datasets. Finally, the method is shown to work on a real dataset.
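The first steps of the procedure can be sketched in Python, assuming a matrix of mixture spectra with wavelengths along the columns: Savitzky–Golay smoothing, PCA, removal of near-zero (composition-zero) points and projection of the remaining wavelength points onto the unit hypersphere. The selection of composition-one points and the final factor rotation are omitted, and all dimensions and parameters are illustrative assumptions.

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(6)
spectra = rng.normal(size=(20, 400))        # placeholder: 20 mixture spectra x 400 wavelengths

# Smooth each spectrum along the wavelength axis
smoothed = savgol_filter(spectra, window_length=11, polyorder=3, axis=1)

# PCA via SVD on the mean-centred spectra
Xc = smoothed - smoothed.mean(axis=0)
Vt = np.linalg.svd(Xc, full_matrices=False)[2]
wl_vectors = Vt[:2].T                       # one 2-component vector per wavelength point

# Drop near-zero (composition-zero) points, then project the rest onto the unit hypersphere
norms = np.linalg.norm(wl_vectors, axis=1, keepdims=True)
keep = norms.ravel() > 1e-8
projected = wl_vectors[keep] / norms[keep]
```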

Collaboration


Dive into Stephen P. Gurden's collaborations.

Top Co-Authors

John A. Groves

Health and Safety Executive


Marlon M. Reis

State University of Campinas


Frank van der Meulen

Delft University of Technology
