Jef Vanlaer
Katholieke Universiteit Leuven
Publications
Featured research published by Jef Vanlaer.
Computers & Chemical Engineering | 2012
Pieter Van den Kerkhof; Geert Gins; Jef Vanlaer; Jan Van Impe
To ensure constant and satisfactory product quality, close monitoring of batch processes is an absolute requirement in the (bio)chemical industry. Principal Component Analysis (PCA)-based techniques exploit historical databases for fault detection and diagnosis. In this paper, the fault detection and diagnosis performance of Batch Dynamic PCA (BDPCA) and Auto-Regressive PCA (ARPCA) is compared with that of Multi-way PCA (MPCA). Although these methods have been studied before, their performance is often compared on only a few validation batches. Additionally, the focus is usually on fast fault detection, while correct fault identification is considered of lesser importance. In this paper, MPCA, BDPCA, and ARPCA are benchmarked on an extensive dataset of a simulated penicillin fermentation. The detection speed, false alarm rate, and correctness of the fault diagnosis are all taken into account. The results indicate increased detection speed when using ARPCA as opposed to MPCA and BDPCA, at the cost of fault classification accuracy.
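As an illustration of the kind of PCA-based monitoring these methods build on, the sketch below fits a PCA model to synthetic stand-in data for normal operation and flags a deviating batch with Hotelling's T² statistic. All data, dimensions, and the fault magnitude are invented for illustration; this is not the paper's code or the simulated fermentation dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: 50 normal batches, each unfolded into a
# row of 20 measurements (multiway-PCA-style batch-wise unfolding).
X_train = rng.normal(size=(50, 20))

# Center and scale each column, as is standard before PCA.
mu, sigma = X_train.mean(axis=0), X_train.std(axis=0)
Xs = (X_train - mu) / sigma

# PCA via SVD; retain k principal components.
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
k = 3
P = Vt[:k].T                            # loadings (20 x k)
lam = (S[:k] ** 2) / (Xs.shape[0] - 1)  # variance captured per component

def t2_statistic(x):
    """Hotelling's T^2 of one new observation in the PC subspace."""
    t = P.T @ ((x - mu) / sigma)        # scores of the scaled observation
    return float(np.sum(t ** 2 / lam))

normal_batch = rng.normal(size=20)
faulty_batch = normal_batch + 5.0       # simulated sensor offset fault
```

A complete monitoring scheme would pair T² with the squared prediction error (SPE) in the residual space and derive control limits from the training batches; the simple comparison of a normal and a faulty batch above only shows the mechanics.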
Computers & Chemical Engineering | 2013
Jef Vanlaer; Geert Gins; Jan Van Impe
This paper studies batch-end quality prediction using Partial Least Squares (PLS). The applicability of the zeroth-order approximation of Faber and Kowalski (1997) for estimating the PLS prediction variance is critically assessed. The estimator was originally developed for spectroscopy calibration, and its derivation involves a local linearization under specific assumptions, followed by a further approximation. Although these assumptions do not hold for batch process monitoring in general, they are not violated for the selected case study. Based on extensive Monte Carlo simulations, the influence of the noise variance, the number of components, and the number of training batches on the bias and variability of the variance estimate is investigated. The results indicate that the zeroth-order approximation is too restrictive for batch process data. A variance estimator based on a full local linearization is required to obtain more reliable variance estimates for the construction of prediction intervals.
Computers & Chemical Engineering | 2014
Geert Gins; Jef Vanlaer; Pieter Van den Kerkhof; Jan Van Impe
This work presents the RAYMOND simulation package for generating RAYpresentative MONitoring Data. RAYMOND is a free MATLAB package that can simulate a wide range of processes; a number of widely used benchmark processes are available, and user-defined processes can easily be added. Its modular design results in great flexibility with respect to the simulated processes: input fluctuations resulting from upstream variability can be introduced, sensor properties (measurement noise, resolution, range, etc.) can be freely specified, and various (custom) control strategies can be implemented. Furthermore, process variability (biological variability or non-ideal behavior) can be included, as can process-specific disturbances. Two case studies illustrate the importance of including non-ideal behavior when monitoring and controlling batch processes. Hence, such behavior should be included in benchmarks to better assess the performance and robustness of advanced process monitoring and control algorithms.
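The kind of sensor properties described above (noise, resolution, range) can be mimicked in a few lines. The function below is a hypothetical stand-in written for illustration, not RAYMOND's actual MATLAB interface: it adds Gaussian noise, quantises to a finite resolution, and saturates at the sensor range.

```python
import numpy as np

def sensor(signal, noise_sd=0.1, resolution=0.05, lo=0.0, hi=10.0, rng=None):
    """Toy sensor model (hypothetical, not RAYMOND's API): additive Gaussian
    noise, finite resolution (quantisation), and a saturating range."""
    if rng is None:
        rng = np.random.default_rng(0)
    noisy = np.asarray(signal) + rng.normal(scale=noise_sd, size=np.shape(signal))
    quantised = np.round(noisy / resolution) * resolution
    return np.clip(quantised, lo, hi)

true_signal = np.linspace(-1.0, 11.0, 121)   # exceeds the range on both ends
measured = sensor(true_signal)
```

Running a benchmark through such a sensor layer makes simulated monitoring data look far more like plant data than noise-free trajectories do.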
IFAC Proceedings Volumes | 2009
Geert Gins; Jef Vanlaer; J.F. Van Impe
This paper compares two methodologies for obtaining batch-end quality predictions by means of a partial least squares (PLS) model based on incomplete observations. The first method for dealing with the unknown nature of future measurements is to employ Intermediate Models (IMs). The second approach is Trimmed Scores Regression (TSR), which uses an additional regression model to compensate for the missing measurements. Both methodologies are tested on penicillin fermentation and industrial polymerization data. The case studies indicate that IMs and TSR yield comparable results, with IMs providing more accurate batch-end quality predictions during the initial stages of the batch. By optimizing the input variables of the full PLS model, the performance of TSR becomes comparable to that of IMs right from the start of the batch. Because the identification of multiple IMs is labor-intensive, it is concluded that TSR is a valid alternative when frequent online estimates of the final batch quality are requested: the small loss in accuracy (if any) does not outweigh the significant reduction in required workload. Hence, in the investigated cases, laziness generally pays off.
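The core of TSR can be sketched compactly: project the known part of a running batch on the corresponding rows of the loading matrix to obtain "trimmed" scores, then map these to full-model scores with an auxiliary least-squares regression identified on complete training batches. The sketch below uses an invented two-component latent-variable dataset and a plain PCA-style model rather than the papers' full PLS setup.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical training set: 60 complete batches, 12 unfolded measurements
# each, of which only the first 5 are known for a running batch.
n, p, known = 60, 12, 5
scores_true = rng.normal(size=(n, 2))
P = np.linalg.qr(rng.normal(size=(p, 2)))[0]   # orthonormal loadings (12 x 2)
X = scores_true @ P.T + 0.05 * rng.normal(size=(n, p))

# Trimmed scores: project the known part on the known loading rows.
P1 = P[:known]
tau_star = X[:, :known] @ P1

# TSR: least-squares map from trimmed scores to full-model scores.
T_full = X @ P                                  # scores from complete data
B, *_ = np.linalg.lstsq(tau_star, T_full, rcond=None)

def tsr_scores(x_known):
    """Estimate full-batch scores from the first `known` measurements."""
    return (x_known @ P1) @ B

x_new = scores_true[0] @ P.T                    # a noise-free complete batch
est = tsr_scores(x_new[:known])
full = x_new @ P
naive = x_new[:known] @ P1                      # raw trimmed scores
```

On this test batch the regression corrects most of the systematic shrinkage of the raw trimmed scores, which is exactly the compensation TSR provides.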
International Conference on Data Mining | 2012
Jef Vanlaer; Pieter Van den Kerkhof; Geert Gins; Jan Van Impe
In this paper, the influence of measurement noise on batch-end quality prediction by Partial Least Squares (PLS) is discussed. Realistic computer-generated data of an industrial process for penicillin production are used to investigate the influence of both input and output noise on model input and model order selection, and on online and offline prediction of the final penicillin concentration. PLS-based techniques show great potential for assisting human operators in their decisions, especially for batch processes, where close monitoring is required to achieve satisfactory product quality. However, many (bio)chemical companies are still reluctant to implement these monitoring techniques because, among other things, little is known about the influence of measurement noise characteristics on their performance. The results of this study indicate that PLS predictions are only slightly worsened by the presence of measurement noise. Moreover, for the considered case study, the model predictions are better than offline quality measurements.
IFAC Proceedings Volumes | 2012
Geert Gins; Jef Vanlaer; P. Van den Kerkhof; J.F. Van Impe
This paper investigates whether the batch optimization methodology recently proposed by McCready [2011] can be extended to full online batch optimization and control. McCready [2011] requires full factorial experiments over (i) the times at which control actions are allowed and (ii) the manipulated variables; this is feasible for the three decision times considered there. For true online batch control and optimization, however, control actions are required every few time points, which would result in a very large number of required experiments. In this paper, it is investigated whether all possible control action times must be included in the training data. Based on two case studies, it is concluded that accurate online estimates of the final batch quality are obtained even when the manipulated variables change at times not included in the training set, provided these changes occur between the change times of the training batches. This is a valuable result for industrial acceptance because it implies that fewer experiments are required for accurate model identification.
IFAC Proceedings Volumes | 2009
Jef Vanlaer; Geert Gins; J.F. Van Impe
Close monitoring of batch processes in the chemical and biochemical industry is required to achieve a satisfactory product quality at the end of a batch run. To deal with the inherent auto- and cross-correlation within batch process data, specific techniques based on Principal Component Analysis (PCA) were developed for monitoring and fault detection purposes. In this paper, the fault detection performance of two such techniques, Auto-Regressive PCA [ARPCA; Choi et al., 2008. Dynamic model-based batch process monitoring. Chemical Engineering Science 63:622–636] and Batch Dynamic PCA [BDPCA; Chen and Liu, 2002. On-line batch process monitoring using dynamic PCA and dynamic PLS models. Chemical Engineering Science 57:63–75], and a standard Multi-way PCA technique with variable-wise unfolding (MPCAV) is compared. A biochemical fed-batch process for penicillin fermentation is selected as a case study. From the extensive simulation results, it is clear that both ARPCA and BDPCA outperform the MPCAV procedure, with a significant reduction in fault detection times and, as a consequence, higher fault detection rates. The difference in performance between ARPCA and BDPCA, however, is very small. While ARPCA shows a smaller mean detection time over all test batches, the fault detection time for BDPCA is still smaller in many individual batches. Therefore, it is concluded that, for the test cases presented in this paper, the ARPCA and BDPCA fault detection techniques show similar performance.
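Both BDPCA and ARPCA handle the auto-correlation mentioned above by working on time-lagged data. The helper below shows the underlying arrangement, augmenting each time point with its previous measurements; it is a generic dynamic-PCA-style construction on a toy trajectory, not the exact preprocessing of either cited method.

```python
import numpy as np

def lagged_matrix(X, lags):
    """Augment each time point of a (time x variables) trajectory with its
    `lags` preceding rows, the data arrangement used by dynamic-PCA-style
    monitoring to capture auto-correlation."""
    T, m = X.shape
    rows = [np.concatenate([X[t - l] for l in range(lags + 1)])
            for t in range(lags, T)]
    return np.array(rows)

X = np.arange(12.0).reshape(6, 2)   # toy trajectory: 6 time points, 2 sensors
Xl = lagged_matrix(X, lags=2)       # 4 rows of [x_t, x_{t-1}, x_{t-2}]
```

A PCA model fitted on such a lagged matrix sees each sample together with its recent past, so serially correlated disturbances show up in the scores instead of being left in the residuals.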
IFAC Proceedings Volumes | 2012
Jef Vanlaer; P. Van den Kerkhof; Geert Gins; J.F. Van Impe
In this paper, an extensive Monte Carlo simulation is performed to investigate the influence of output measurement noise on Multiway Partial Least Squares (MPLS) batch-end quality predictions. MPLS models are well suited for monitoring (bio)chemical batch processes, but the lack of insight into the influence of noise leaves companies reluctant to adopt the technique. Simplified relations between prediction variance and measurement noise exist for spectroscopy calibration problems, but they are based on assumptions that do not necessarily hold for batch process modelling. The non-linear properties of the PLS predictor and the lack of knowledge about its statistical distribution make the derivation of an analytical relation extremely difficult. Based on an extensive case study of a penicillin production process, MPLS predictions of final batch quality are shown to outperform offline quality measurements. Even at very high noise levels, the models capture the important information in the measurements and discard most of the noise. Prediction bias and variance are studied and found to behave inversely with respect to the model order. This inverse behaviour has important consequences for model order selection, which becomes a trade-off between bias and variance. In this light, several cross-validation-based techniques for selecting the optimal number of principal components are compared. An adjusted Wold's R criterion proves slightly preferable to the minimum-MSE and general Wold's R criteria.
IFAC Proceedings Volumes | 2012
P. Van den Kerkhof; Jef Vanlaer; Geert Gins; J.F. Van Impe
A new fault identification method for batch processes based on Least Squares Support Vector Machines (LS-SVMs; Suykens et al. [2002]) is proposed. Fault detection and diagnosis of batch processes are difficult due to the processes' dynamic nature. Principal Component Analysis (PCA)-based techniques have become popular for data-driven fault detection. While improvements have been made in handling dynamics and non-linearities, correct diagnosis of the process disturbance remains a difficult issue. In this work, a new data-driven diagnosis technique is developed using an LS-SVM-based statistical classifier. When a fault is detected, a small window of pretreated data is sent to the classifier to identify the fault. The proposed approach is validated on data generated with an expanded version of the Pensim simulator [Birol et al., 2002]. The simulated data contain faults from six different classes. The obtained results provide a proof of concept of the proposed technique and demonstrate the importance of appropriate data pretreatment.
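An LS-SVM classifier is attractive here because training reduces to solving a single linear system rather than a quadratic program. The minimal binary version below, on invented fault-window features for two fault classes, illustrates the mechanics; it is not the paper's multi-class implementation or its pretreatment.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian RBF kernel matrix between the rows of A and B."""
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

class LSSVM:
    """Minimal least-squares SVM classifier (binary, labels +/-1): training
    amounts to one linear solve, in the spirit of Suykens et al. [2002]."""
    def __init__(self, gamma=0.5, C=10.0):
        self.gamma, self.C = gamma, C

    def fit(self, X, y):
        n = len(y)
        K = rbf_kernel(X, X, self.gamma)
        A = np.zeros((n + 1, n + 1))      # bordered system for bias + alphas
        A[0, 1:], A[1:, 0] = 1.0, 1.0
        A[1:, 1:] = K + np.eye(n) / self.C
        sol = np.linalg.solve(A, np.r_[0.0, y])
        self.b, self.alpha, self.X = sol[0], sol[1:], X
        return self

    def predict(self, Xnew):
        return np.sign(rbf_kernel(Xnew, self.X, self.gamma) @ self.alpha + self.b)

rng = np.random.default_rng(4)
# Hypothetical fault-window features: two fault classes as shifted clusters.
X = np.vstack([rng.normal(size=(30, 4)), rng.normal(size=(30, 4)) + 2.5])
y = np.r_[-np.ones(30), np.ones(30)]
clf = LSSVM().fit(X, y)
```

A multi-class diagnosis scheme, as needed for the six fault classes, would train one such classifier per class pair (or one-versus-rest) and combine their votes.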
IFAC Proceedings Volumes | 2011
Jef Vanlaer; P. Van den Kerkhof; Geert Gins; J.F. Van Impe
The development of automated process monitoring systems to assist human operators in their decisions is an important challenge for today's chemical and biochemical companies. Especially for batch processes, close monitoring is required to achieve a satisfactory product quality at the end of the batch operation. Techniques based on Partial Least Squares (PLS) were developed to obtain online predictions of the batch-end quality (e.g., product purity or concentration). However, many (bio)chemical companies are still reluctant to implement these monitoring techniques since, among other things, not much is known about the influence of measurement noise on the prediction performance. In this paper, the influence of measurement noise on (i) input selection, (ii) model order selection, and (iii) online and offline prediction performance for PLS-based prediction of batch-end quality is investigated. A (simulated) process for penicillin production is selected as a case study. The noise level influences the selected model inputs, since more variables become uninformative when more noise is present in the data. The model order is influenced as well, as more important underlying phenomena are masked by the noise. As expected, higher noise levels result in lower offline prediction performance. However, online predictions can improve when more noise is present in the data, due to the selection of different inputs. Even at very large noise levels, accurate and stable predictions of the final penicillin concentration are obtained.