
Publication


Featured research published by Geert Gins.


Computers & Chemical Engineering | 2012

Dynamic model-based fault diagnosis for (bio)chemical batch processes

Pieter Van den Kerkhof; Geert Gins; Jef Vanlaer; Jan Van Impe

To ensure constant and satisfactory product quality, close monitoring of batch processes is an absolute requirement in the (bio)chemical industry. Principal Component Analysis (PCA)-based techniques exploit historical databases for fault detection and diagnosis. In this paper, the fault detection and diagnosis performance of Batch Dynamic PCA (BDPCA) and Auto-Regressive PCA (ARPCA) is compared with Multi-way PCA (MPCA). Although these methods have been studied before, their performance is often compared on only a few validation batches. Additionally, the focus is typically on fast fault detection, while correct fault identification is often considered of lesser importance. In this paper, MPCA, BDPCA, and ARPCA are benchmarked on an extensive dataset of a simulated penicillin fermentation. The detection speed, false alarm rate, and correctness of the fault diagnosis are all taken into account. The results indicate increased detection speed when using ARPCA as opposed to MPCA and BDPCA, at the cost of fault classification accuracy.
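
As a rough illustration of this kind of PCA-based batch monitoring, the sketch below builds an MPCA-style model on synthetic, batch-wise unfolded data and flags a drifting sensor through the squared prediction error (SPE). The data generator, the number of components, and the empirical 95% control limit are illustrative assumptions, not values from the paper.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic normal-operation batches: low-rank structure plus noise,
# arranged as (batches, time points x sensors) after batch-wise unfolding.
B, T, S = 50, 100, 5
X = rng.normal(size=(B, 3)) @ rng.normal(size=(3, T * S)) + 0.1 * rng.normal(size=(B, T * S))

# Center and scale each unfolded variable, then fit the MPCA model.
mu, sd = X.mean(axis=0), X.std(axis=0)
pca = PCA(n_components=3).fit((X - mu) / sd)

def spe(batch):
    """Squared prediction error (Q statistic) of one unfolded batch."""
    xs = (batch.reshape(1, -1) - mu) / sd
    resid = xs - pca.inverse_transform(pca.transform(xs))
    return float((resid ** 2).sum())

# Empirical 95% control limit from the training batches (purely illustrative).
limit = np.percentile([spe(x) for x in X], 95)

# A faulty batch: sensor 2 drifts away from its normal trajectory after time 60.
fault = X[0].reshape(T, S).copy()
fault[60:, 2] += np.linspace(0, 3, T - 60)
print("SPE =", spe(fault), "limit =", limit, "fault detected:", spe(fault) > limit)
```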


Computers & Chemical Engineering | 2013

Quality assessment of a variance estimator for Partial Least Squares prediction of batch-end quality

Jef Vanlaer; Geert Gins; Jan Van Impe

This paper studies batch-end quality prediction using Partial Least Squares (PLS). The applicability of the zeroth-order approximation of Faber and Kowalski (1997) for estimation of the PLS prediction variance is critically assessed. The estimator was originally developed for spectroscopy calibration and its derivation involves a local linearization under specific assumptions, followed by a further approximation. Although the assumptions do not hold for batch process monitoring in general, they are not violated for the selected case study. Based on extensive Monte Carlo simulations, the influence of noise variance, number of components and number of training batches on the bias and variability of the variance estimation is investigated. The results indicate that the zeroth-order approximation is too restrictive for batch process data. The development of a variance estimator based on a full local linearization is required to obtain more reliable variance estimations for the development of prediction intervals.
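
The Monte Carlo reference used to assess such an estimator can be mimicked with a short sketch: a PLS model is refitted on many noisy realizations of a synthetic training set, and the empirical variance of the prediction for one new observation is computed. The zeroth-order estimator itself is not reproduced here; the dimensions, noise levels, and number of latent variables are arbitrary illustration choices.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)

# Noise-free "true" training data and one new observation to be predicted.
n_batches, n_features = 40, 200
X_true = rng.normal(size=(n_batches, n_features))
y_true = X_true @ rng.normal(size=n_features)
x_new = rng.normal(size=(1, n_features))

# Refit the PLS model on many noisy realizations of the same data set and
# collect the prediction for the new observation each time.
noise_sd, n_mc, preds = 0.5, 200, []
for _ in range(n_mc):
    Xn = X_true + rng.normal(scale=noise_sd, size=X_true.shape)   # input noise
    yn = y_true + rng.normal(scale=noise_sd, size=y_true.shape)   # output noise
    pls = PLSRegression(n_components=4).fit(Xn, yn)
    preds.append(np.ravel(pls.predict(x_new))[0])

print("empirical prediction variance:", np.var(preds))
```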


Computers & Chemical Engineering | 2014

The RAYMOND simulation package — Generating RAYpresentative MONitoring Data to design advanced process monitoring and control algorithms

Geert Gins; Jef Vanlaer; Pieter Van den Kerkhof; Jan Van Impe

This work presents the RAYMOND simulation package for generating RAYpresentative MONitoring Data. RAYMOND is a free MATLAB package and can simulate a wide range of processes; a number of widely-used benchmark processes are available, but user-defined processes can easily be added. Its modular design results in large flexibility with respect to the simulated processes: input fluctuations resulting from upstream variability can be introduced, sensor properties (measurement noise, resolution, range, etc.) can be freely specified, and various (custom) control strategies can be implemented. Furthermore, process variability (biological variability or non-ideal behavior) can be included, as can process-specific disturbances. In two case studies, the importance of including non-ideal behavior for monitoring and control of batch processes is illustrated. Hence, it should be included in benchmarks to better assess the performance and robustness of advanced process monitoring and control algorithms.
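
RAYMOND itself is a MATLAB package, so the short Python sketch below only illustrates the modular idea of combining a process model, configurable sensor properties, and batch-to-batch variability; the classes, signal, and parameter values are entirely hypothetical.

```python
import numpy as np

class Sensor:
    """Adds measurement noise and quantization (resolution) to a true signal."""
    def __init__(self, noise_sd=0.05, resolution=0.01, rng=None):
        self.noise_sd, self.resolution = noise_sd, resolution
        self.rng = rng or np.random.default_rng()

    def measure(self, true_value):
        noisy = true_value + self.rng.normal(scale=self.noise_sd)
        return round(noisy / self.resolution) * self.resolution

class FirstOrderProcess:
    """A toy first-order process with batch-to-batch variability on its gain."""
    def __init__(self, gain=2.0, tau=10.0, variability=0.1, rng=None):
        rng = rng or np.random.default_rng()
        self.gain = gain * (1 + variability * rng.normal())   # process variability
        self.tau, self.x = tau, 0.0

    def step(self, u, dt=1.0):
        self.x += dt / self.tau * (self.gain * u - self.x)
        return self.x

# One simulated batch: constant input, noisy and quantized measurements.
process, sensor = FirstOrderProcess(), Sensor()
trajectory = [sensor.measure(process.step(u=1.0)) for _ in range(50)]
print(trajectory[:5])
```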


Industrial Conference on Data Mining | 2011

Prediction of batch-end quality for an industrial polymerization process

Geert Gins; Bert Pluymers; Ilse Smets; Jairo Espinosa; Jan Van Impe

In this paper, an inferential sensor for the final viscosity of an industrial batch polymerization reaction is developed using multivariate statistical methods. This inferential sensor tackles one of the main problems of chemical batch processes: the lack of reliable online quality estimates. In a data preprocessing step, all batches are brought to equal lengths and significant batch events are aligned via dynamic time warping. Next, the optimal input measurements and optimal model order of the inferential multiway partial least squares (MPLS) model are selected. Finally, a full batch model is trained and successfully validated. Additionally, intermediate models capable of predicting the final product quality after only 50% or 75% batch progress are developed. All models provide accurate estimates of the final polymer viscosity.
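
A minimal sketch of such an intermediate model, assuming synthetic and already-aligned batch trajectories: a PLS model is trained on only the measurements available at 50% batch progress to predict a stand-in for the final quality. The data, the number of latent variables, and the train/validation split are illustrative.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)

# Synthetic aligned batches (batches, time points, sensors) and a final quality value.
B, T, S = 60, 100, 4
batches = rng.normal(size=(B, T, S)).cumsum(axis=1)
quality = batches[:, -1, :].sum(axis=1) + 0.1 * rng.normal(size=B)

# Intermediate model: use only the measurements available at 50% batch progress.
half = T // 2
X50 = batches[:, :half, :].reshape(B, -1)
model_50 = PLSRegression(n_components=5).fit(X50[:40], quality[:40])

pred = np.ravel(model_50.predict(X50[40:]))
rmse = np.sqrt(np.mean((pred - quality[40:]) ** 2))
print("validation RMSE at 50% batch progress:", rmse)
```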


IFAC Proceedings Volumes | 2005

Activated Sludge Image Analysis Data Classification: an LS-SVM Approach

Geert Gins; Ilse Smets; R Jenné; J.F. Van Impe

In this paper, a classifier is proposed and trained to distinguish between bulking and non-bulking situations in an activated sludge wastewater treatment plant, based on available image analysis information and with the goal of predicting and monitoring filamentous bulking. After selecting appropriate activated sludge parameters (filament length, floc fractal dimension, and floc roundness), an LS-SVM approach is used to train a classification function. This classification function is shown to achieve satisfactory performance in validation.
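
As an illustration, the sketch below implements the standard LS-SVM classifier (dual formulation solved as a single linear system) in NumPy on synthetic stand-ins for the selected image analysis features; the kernel width, regularization constant, and data are assumptions for the example, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def rbf_kernel(A, B, sigma=1.5):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# Synthetic stand-ins for (filament length, floc fractal dimension, floc roundness):
# non-bulking samples labelled -1, bulking samples labelled +1.
X = np.vstack([rng.normal(0, 1, size=(50, 3)), rng.normal(2, 1, size=(50, 3))])
y = np.hstack([-np.ones(50), np.ones(50)])

# Solve the LS-SVM linear system for the bias b and the support values alpha.
gamma = 10.0
Omega = (y[:, None] * y[None, :]) * rbf_kernel(X, X)
M = np.block([[np.zeros((1, 1)), y[None, :]],
              [y[:, None], Omega + np.eye(len(y)) / gamma]])
sol = np.linalg.solve(M, np.concatenate([[0.0], np.ones(len(y))]))
b, alpha = sol[0], sol[1:]

def predict(X_new):
    """Classify new samples as bulking (+1) or non-bulking (-1)."""
    return np.sign(rbf_kernel(X_new, X) @ (alpha * y) + b)

print("training accuracy:", (predict(X) == y).mean())
```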


IFAC Proceedings Volumes | 2009

Online batch-end quality estimation: does laziness pay off?

Geert Gins; Jef Vanlaer; J.F. Van Impe

This paper compares two methodologies for obtaining batch-end quality predictions by means of a partial least squares (PLS) model based on incomplete observations. The first method for dealing with the unknown nature of future measurements is to employ Intermediate Models (IMs). The second approach is Trimmed Scores Regression (TSR), which uses an additional regression model to compensate for the missing measurements. Both methodologies are tested on penicillin fermentation and industrial polymerization data. The case studies indicate that IMs and TSR yield comparable results, with IMs providing more accurate batch-end quality predictions during the initial stages of the batch. After optimizing the input variables of the full PLS model, however, the performance of TSR is comparable to that of IMs from the very start of the batch. Because the identification of multiple IMs is labor-intensive, it is concluded that TSR is a valid alternative when frequent online estimates of the final batch quality are requested: the small loss in accuracy (if any) is outweighed by the significant reduction in required workload. Hence, in the investigated cases, laziness generally pays off.
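
A rough sketch of the TSR route, under simplifying assumptions (synthetic unfolded data, manual centering, scale=False, and an ordinary least-squares fit for the extra regression): the scores of an incomplete batch are estimated from the measurements available so far and then mapped to a batch-end quality estimate.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)

# Synthetic unfolded batch data (batches x variables) and a batch-end quality value.
B, T, S = 60, 50, 4
X = rng.normal(size=(B, T * S))
y = X @ rng.normal(size=T * S) * 0.05 + rng.normal(scale=0.1, size=B)

# Center manually so the score algebra below stays transparent.
Xc, yc = X - X.mean(0), y - y.mean()
pls = PLSRegression(n_components=4, scale=False).fit(Xc, yc)

# Trimmed scores: project only the measurements known at 50% batch progress,
# then regress the full scores on the trimmed scores (the extra TSR model).
known = (T // 2) * S
T_full = pls.transform(Xc)
T_trim = Xc[:, :known] @ pls.x_rotations_[:known, :]
B_tsr = np.linalg.lstsq(T_trim, T_full, rcond=None)[0]

def predict_tsr(x_partial):
    """Batch-end quality estimate from the first `known` measurements only."""
    t_hat = (x_partial - X.mean(0)[:known]) @ pls.x_rotations_[:known, :] @ B_tsr
    return (t_hat @ pls.y_loadings_.T + y.mean()).item()

print("TSR estimate:", predict_tsr(X[0, :known]), "true quality:", y[0])
```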


International Conference on Data Mining | 2006

Data alignment via dynamic time warping as a prerequisite for batch-end quality prediction

Geert Gins; Jairo Espinosa; Ilse Smets; Wim Van Brempt; Jan Van Impe

In this work, a 4-phase dynamic time warping procedure is implemented to align measurement profiles from an existing chemical batch reactor process, making all batch measurement profiles equal in length while also matching the major events occurring during each batch run. This data alignment is the first step towards constructing an inferential batch-end quality sensor capable of predicting three quality variables before batch completion using a multivariate statistical partial least squares model. This inferential sensor provides online quality predictions, allowing corrective actions to be performed when the quality of the polymerization product does not meet specifications, saving valuable production time and reducing operating costs.
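
The alignment idea can be illustrated with a basic dynamic-programming DTW implementation; the 4-phase weighting used in this work is not reproduced, and the two synthetic profiles simply stand in for batch measurement trajectories of unequal length.

```python
import numpy as np

def dtw_path(ref, query):
    """Optimal warping path aligning `query` to `ref` (absolute-difference local cost)."""
    n, m = len(ref), len(query)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(ref[i - 1] - query[j - 1])
            D[i, j] = cost + min(D[i - 1, j - 1], D[i - 1, j], D[i, j - 1])
    # Trace the optimal path back from the end of both profiles.
    path, (i, j) = [(n - 1, m - 1)], (n, m)
    while (i, j) != (1, 1):
        steps = [(D[i - 1, j - 1], i - 1, j - 1),
                 (D[i - 1, j],     i - 1, j),
                 (D[i, j - 1],     i,     j - 1)]
        _, i, j = min(steps)
        path.append((i - 1, j - 1))
    return path[::-1]

# Two profiles of the same event with different lengths and timing.
ref = np.sin(np.pi * np.linspace(0, 1, 100))
query = np.sin(np.pi * np.linspace(0, 1, 80) ** 1.3)

# Warp the query onto the reference time axis so both profiles have equal length.
aligned = np.empty_like(ref)
for i, j in dtw_path(ref, query):
    aligned[i] = query[j]
print(len(query), "samples warped onto", len(aligned), "reference time points")
```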


Industrial Conference on Data Mining | 2016

Extending Process Monitoring to Simultaneous False Alarm Rejection and Fault Identification (FARFI)

Geert Gins; Sam Wuyts; Sander Van den Zegel; Jan Van Impe

A new framework for extending Statistical Process Monitoring (SPM) to simultaneous False Alarm Rejection and Fault Identification (FARFI) is presented in this paper. This is motivated by the possibly large negative impact on product quality, process safety, and profitability resulting from incorrect control actions induced by false alarms—especially for batch processes. The presented FARFI approach adapts the classification model already used for fault identification to simultaneously perform false alarm rejection by adding normal operation as an extra data class. As no additional models are introduced, the complexity of the overall SPM system is not increased.
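
A minimal sketch of the FARFI idea, assuming a generic classifier (k-nearest neighbors) and two synthetic fault classes: because normal operation is included as an extra class, any alarm that the classifier assigns to the normal class is rejected as a false alarm, while the remaining classes provide the fault identification.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(5)

# Synthetic two-dimensional monitoring features for normal operation and two faults.
noc = rng.normal([0.0, 0.0], 0.5, size=(100, 2))
fault1 = rng.normal([3.0, 0.0], 0.5, size=(50, 2))
fault2 = rng.normal([0.0, 3.0], 0.5, size=(50, 2))

X = np.vstack([noc, fault1, fault2])
labels = np.array(["normal"] * 100 + ["fault 1"] * 50 + ["fault 2"] * 50)
clf = KNeighborsClassifier(n_neighbors=5).fit(X, labels)

def handle_alarm(observation):
    """Called whenever the monitoring charts raise an alarm."""
    verdict = clf.predict(observation.reshape(1, -1))[0]
    return "false alarm rejected" if verdict == "normal" else "identified as " + verdict

print(handle_alarm(np.array([0.2, -0.1])))   # alarm on a normal point -> rejected
print(handle_alarm(np.array([2.9, 0.3])))    # genuine fault -> identified
```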


International Conference on Data Mining | 2012

The influence of input and output measurement noise on batch-end quality prediction with partial least squares

Jef Vanlaer; Pieter Van den Kerkhof; Geert Gins; Jan Van Impe

In this paper, the influence of measurement noise on batch-end quality prediction by Partial Least Squares (PLS) is discussed. Realistic computer-generated data of an industrial process for penicillin production are used to investigate the influence of both input and output noise on model input and model order selection, and online and offline prediction of the final penicillin concentration. Techniques based on PLS show a large potential in assisting human operators in their decisions, especially for batch processes where close monitoring is required to achieve satisfactory product quality. However, many (bio)chemical companies are still reluctant to implement these monitoring techniques since, among other things, little is known about the influence of measurement noise characteristics on their performance. The results of this study indicate that PLS predictions are only slightly worsened by the presence of measurement noise. Moreover, for the considered case study, model predictions are better than offline quality measurements.
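
The kind of noise study described here can be mimicked on synthetic data: the same training set is corrupted with increasing input and output noise levels, and the resulting PLS validation error is compared; all dimensions, noise levels, and the number of latent variables are arbitrary illustration choices.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(6)

# Noise-free synthetic data: 60 training and 20 validation observations.
X = rng.normal(size=(80, 150))
y = X @ rng.normal(size=150)

for noise_sd in (0.0, 0.1, 0.5, 1.0):
    Xn = X + rng.normal(scale=noise_sd, size=X.shape)               # input noise
    yn = y + rng.normal(scale=noise_sd * y.std(), size=y.shape)     # output noise
    pls = PLSRegression(n_components=5).fit(Xn[:60], yn[:60])
    rmse = np.sqrt(np.mean((np.ravel(pls.predict(Xn[60:])) - y[60:]) ** 2))
    print("relative noise level", noise_sd, "-> validation RMSE", round(rmse, 3))
```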


IFAC Proceedings Volumes | 2012

Extending discrete batch-end quality optimization to online implementation

Geert Gins; Jef Vanlaer; P. Van den Kerkhof; J.F. Van Impe

This paper investigates whether the batch optimization methodology recently proposed by McCready [2011] can be extended to full online batch optimization and control. McCready [2011] requires full factorial experiments for (i) the times where control actions are allowed and (ii) the manipulated variables; this is feasible for the three considered decision times. For true online batch control and optimization, however, control actions are required every few time points, which would result in a very large number of required experiments. In this paper, it is investigated whether all possible control action times must be included in the training data. Based on two case studies, it is concluded that accurate online estimates of the final batch quality are obtained even when the manipulated variables change at times not included in the training set, provided these changes occur between the change times of the training batches. This is a valuable result for industrial acceptance because it implies that fewer experiments are required for accurate model identification.

Collaboration


Dive into Geert Gins's collaborations.

Top Co-Authors

Jan Van Impe (Catholic University of Leuven)
Jef Vanlaer (Katholieke Universiteit Leuven)
Pieter Van den Kerkhof (Katholieke Universiteit Leuven)
Ilse Smets (Katholieke Universiteit Leuven)
J.F. Van Impe (Katholieke Universiteit Leuven)
R. Jenné (Katholieke Universiteit Leuven)
Sam Wuyts (Katholieke Universiteit Leuven)
E.N. Banadda (Katholieke Universiteit Leuven)
P. Van den Kerkhof (Katholieke Universiteit Leuven)
Jan Degrève (Katholieke Universiteit Leuven)