Arnold J. den Dekker
University of Antwerp
Publications
Featured research published by Arnold J. den Dekker.
Physics in Medicine and Biology | 2007
Jan Sijbers; Dirk H. J. Poot; Arnold J. den Dekker; Wouter Pintjens
Estimation of the noise variance of a magnetic resonance (MR) image is important for various post-processing tasks. Various methods for noise variance estimation from MR images are available in the literature, most of which, however, require user interaction and/or multiple (perfectly aligned) images. In this paper, we focus on automatic histogram-based noise variance estimation techniques. Previously described methods are reviewed, and a new method based on the maximum likelihood (ML) principle is presented. Using Monte Carlo simulation experiments as well as experimental MR data sets, the noise variance estimation methods are compared in terms of the root mean squared error (RMSE). The results show that the newly proposed method is superior in terms of the RMSE.
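As a minimal sketch of one background-based idea (not the paper's full histogram method): in a signal-free background region, magnitude MR data are Rayleigh distributed with scale sigma, and the ML estimator of the noise variance is half the mean squared intensity. The signal-free assumption and the numbers below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# In a signal-free background region of a magnitude MR image, the pixel
# intensities m_i follow a Rayleigh distribution with scale sigma.
# The ML estimator of the noise variance from N such pixels is:
#     sigma_hat^2 = (1 / (2N)) * sum(m_i^2)
sigma_true = 5.0
background = rng.rayleigh(scale=sigma_true, size=100_000)  # simulated background pixels

sigma2_hat = np.mean(background ** 2) / 2.0                # ML estimate of sigma^2
print(round(np.sqrt(sigma2_hat), 2))                       # close to sigma_true
```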
IEEE Transactions on Systems, Man, and Cybernetics | 2013
Paweł Stano; Zsófia Lendek; Jelmer Braaksma; Robert Babuska; Cees de Keizer; Arnold J. den Dekker
Nonlinear stochastic dynamical systems are commonly used to model physical processes. For linear and Gaussian systems, the Kalman filter is optimal in the minimum mean squared error sense. However, for nonlinear or non-Gaussian systems, the estimation of states or parameters is a challenging problem. Furthermore, it is often required to process data online. Therefore, apart from being accurate, a feasible estimation algorithm also needs to be fast. In this paper, we review Bayesian filters that possess the aforementioned properties. Each filter is presented in an easy-to-implement algorithmic form. We focus on parametric methods, among which we distinguish three types of filters: filters based on analytical approximations (extended Kalman filter, iterated extended Kalman filter), filters based on statistical approximations (unscented Kalman filter, central difference filter, Gauss-Hermite filter), and filters based on the Gaussian sum approximation (Gaussian sum filter). We discuss each of these filters, and compare them with illustrative examples.
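A minimal sketch of the linear-Gaussian baseline case, in which the Kalman filter is optimal: a scalar state observed in noise. The EKF/UKF-type filters reviewed above reduce to this predict-update recursion when the model is linear; all numerical values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Scalar Kalman filter: random-walk state model with a direct noisy observation.
x_true = 2.0                 # constant state to be tracked
q, r = 1e-4, 0.5 ** 2        # process / measurement noise variances (assumed)

x_hat, p = 0.0, 1.0          # initial state estimate and its variance
for _ in range(200):
    y = x_true + rng.normal(scale=0.5)   # noisy measurement
    p = p + q                            # predict: variance grows by q
    k = p / (p + r)                      # Kalman gain
    x_hat = x_hat + k * (y - x_hat)      # update: correct towards measurement
    p = (1.0 - k) * p                    # posterior variance shrinks

print(round(x_hat, 2))                   # converges near x_true
```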
Signal Processing | 2014
Jeny Rajan; Arnold J. den Dekker; Jan Sijbers
Denoising algorithms play an important role in the enhancement of magnetic resonance (MR) images. Effective denoising is vital for proper analysis and accurate quantitative measurements from MR images. Maximum likelihood (ML) estimation methods have proved to be very effective in denoising MR images. Among the ML based methods, the recently proposed non-local maximum likelihood (NLML) approach gained much attention. In the NLML method, the samples for the ML estimation of the true underlying intensity are selected in a non-local way based on the intensity similarity of the pixel neighborhoods. This similarity is generally measured using the Euclidean distance. A drawback of this approach is the use of a fixed sample size for the ML estimation, resulting in over- or under-smoothing. In this work, we propose an NLML estimation method for denoising MR images in which the samples are selected in an adaptive and statistically supported way using the Kolmogorov-Smirnov (KS) similarity test. The method has been tested on both simulated and real data, showing its effectiveness.

Highlights:
- An NLML method for denoising MRI based on the Kolmogorov-Smirnov (KS) similarity test is proposed.
- The proposed method is statistically well founded and performs better than the conventional NLML method.
- With the proposed approach, the samples for ML estimation can be selected in an adaptive way.
- Quantitative analysis at various noise levels, based on various similarity measures, shows that the proposed method is more effective than conventional NLML.
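An illustrative sketch of the selection idea (not the paper's exact algorithm): a candidate patch is accepted as a non-local sample source only when a two-sample KS test cannot reject that it shares the reference patch's intensity distribution. Patch sizes, intensity values, and the significance level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def ks_statistic(a, b):
    """Two-sample KS statistic: max distance between the empirical CDFs."""
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return np.max(np.abs(cdf_a - cdf_b))

def accept(candidate, reference, c_alpha=1.358):
    """Accept if D is below the large-sample critical value (alpha = 0.05)."""
    n, m = len(candidate), len(reference)
    return ks_statistic(candidate, reference) < c_alpha * np.sqrt((n + m) / (n * m))

reference = rng.normal(100.0, 5.0, 49)   # 7x7 patch in a homogeneous region
similar   = rng.normal(100.0, 5.0, 49)   # same tissue class
different = rng.normal(140.0, 5.0, 49)   # patch from across an edge

print(accept(similar, reference), accept(different, reference))
```

Because acceptance is decided per candidate rather than by a fixed quota, the effective sample size adapts to the local image content, which is the point of the KS-based selection.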
Magnetic Resonance in Medicine | 2015
Quinten Collier; Jelle Veraart; Ben Jeurissen; Arnold J. den Dekker; Jan Sijbers
Diffusion‐weighted magnetic resonance imaging suffers from physiological noise, such as artifacts caused by motion or system instabilities. Therefore, there is a need for robust diffusion parameter estimation techniques. In the past, several techniques have been proposed, including RESTORE and iRESTORE (Chang et al. Magn Reson Med 2005; 53:1088–1095; Chang et al. Magn Reson Med 2012; 68:1654–1663). However, these techniques are based on nonlinear estimators and are consequently computationally intensive.
Proceedings of SPIE Medical Imaging 1998: Image Processing (Kenneth M. Hanson, ed.) | 1998
Jan Sijbers; Arnold J. den Dekker; Marleen Verhoye; E. Raman; Dirk Van Dyck
A maximum likelihood (ML) estimation technique is proposed for optimal estimation of magnetic resonance (MR) T2 maps from a set of magnitude MR images. Full use is thereby made of the actual probability density function of the magnitude data, which is the Rician distribution. While equal in terms of precision, the proposed method is demonstrated to be superior in terms of accuracy to conventional relaxation parameter estimation techniques.
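A sketch of why modeling the Rician distribution improves accuracy, under simplifying assumptions (single signal level rather than a T2 decay curve, sigma known, grid search instead of a dedicated optimizer): at low SNR the plain sample mean of magnitude data overestimates the true signal A, while the Rician ML estimate does not. All values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

A_true, sigma, n = 10.0, 5.0, 5000
noise = rng.normal(0.0, sigma, n) + 1j * rng.normal(0.0, sigma, n)
m = np.abs(A_true + noise)                   # Rician-distributed magnitudes

def rician_loglik(A, m, sigma):
    """Rician log-likelihood summed over samples (A-independent terms dropped)."""
    return np.sum(np.log(np.i0(m * A / sigma**2))) - n * A**2 / (2.0 * sigma**2)

grid = np.linspace(0.0, 20.0, 401)
A_ml = grid[np.argmax([rician_loglik(A, m, sigma) for A in grid])]

# The sample mean is biased upwards at this SNR; the ML estimate is not.
print(round(m.mean(), 2), round(A_ml, 2))
```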
Magnetic Resonance in Medicine | 2016
Gwendolyn Van Steenkiste; Ben Jeurissen; Jelle Veraart; Arnold J. den Dekker; Paul M. Parizel; Dirk H. J. Poot; Jan Sijbers
Diffusion MRI is hampered by long acquisition times, low spatial resolution, and a low signal‐to‐noise ratio. Recently, methods have been proposed to improve the trade‐off between spatial resolution, signal‐to‐noise ratio, and acquisition time of diffusion‐weighted images via super‐resolution reconstruction (SRR) techniques. However, during the reconstruction, these SRR methods neglect the q‐space relation between the different diffusion‐weighted images.
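A toy 1-D example of the generic SRR principle (not the paper's q-space method): several shifted, decimated low-resolution acquisitions of the same signal are stacked into one linear system y = A x and solved jointly for the high-resolution x. The shift pattern and noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

n = 8
x_true = np.array([1., 4., 2., 8., 5., 7., 3., 6.])   # high-resolution signal

def acquisition_matrix(shift, n, factor=2):
    """Shift the signal, then keep every `factor`-th sample (decimation)."""
    rows = np.arange(shift, n, factor)
    A = np.zeros((len(rows), n))
    A[np.arange(len(rows)), rows] = 1.0
    return A

# Two low-resolution scans with sub-pixel shifts jointly determine x.
A = np.vstack([acquisition_matrix(s, n) for s in (0, 1)])
y = A @ x_true + rng.normal(0.0, 0.01, A.shape[0])    # noisy low-res data

x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(x_hat, 1))
```

Each scan alone is underdetermined; only the joint system is well posed, which is the core trade-off SRR exploits.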
conference on decision and control | 2009
Arturo Tejada; Wouter Van den Broek; Saartje W. van der Hoeven; Arnold J. den Dekker
Scanning transmission electron microscopes are indispensable tools for material science research, since they can reveal the internal structure of a wide range of specimens. Thus, it is of scientific and industrial interest to transform these microscopes into flexible, high-throughput, unsupervised, nanomeasuring tools. To do so, processes that are currently executed manually based on visual feedback (e.g., alignment or particle measurement) should be automated, taking into consideration their time dependencies. That is, these microscopes should be studied from the systems and control perspective. To the best of our knowledge, such a perspective is lacking in the literature. Thus, it is provided here through a new modeling framework that facilitates the future development of control strategies based on image analysis. The progress made towards developing an image-based sensor for defocus control is also reported. Finally, the paper introduces scanning transmission electron microscopy as an important and untapped application area for control engineers.
Ultramicroscopy | 2011
Arturo Tejada; Arnold J. den Dekker; Wouter Van den Broek
Transmission electron microscopes (TEMs) are the tools of choice for academic and industrial research at the nano-scale. Due to their increasing use for routine, repetitive measurement tasks (e.g., quality control in production lines), there is a clear need for a new generation of high-throughput microscopes designed to autonomously extract information from specimens (e.g., particle size distribution, chemical composition, structural information, etc.). To aid in their development, a new engineering perspective on TEM design, based on principles from systems and control theory, is proposed here: measure-by-wire (not to be confused with remote microscopy). Under this perspective, the TEM operator yields the direct control of the microscope's internal processes to a hierarchy of feedback controllers and high-level supervisors. These make use of dynamical models of the main TEM components together with currently available measurement techniques to automate processes such as defocus correction or specimen displacement. Measure-by-wire is discussed in depth, and its methodology is illustrated through a detailed example: the design of a defocus regulator, a type of feedback controller that is akin to existing autofocus procedures.
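A hypothetical sketch of the feedback idea behind a defocus regulator: a proportional controller repeatedly corrects the defocus using a noisy image-based defocus estimate. The gain, sensor noise level, and instantaneous actuation are stand-in assumptions, not the paper's TEM model.

```python
import numpy as np

rng = np.random.default_rng(5)

defocus = 500.0        # initial defocus error (arbitrary units, assumed)
gain = 0.3             # proportional controller gain (assumed)
for _ in range(50):
    measured = defocus + rng.normal(scale=1.0)   # noisy defocus sensor reading
    defocus -= gain * measured                   # corrective lens adjustment

print(round(defocus, 1))                         # driven close to zero
```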
IUCrJ | 2016
Sandra Van Aert; Annick De Backer; Gerardo T. Martinez; Arnold J. den Dekker; Dirk Van Dyck; Sara Bals; Gustaaf Van Tendeloo
An overview of statistical parameter estimation methods is presented and applied to analyse transmission electron microscopy images in a quantitative manner.
Ultramicroscopy | 2011
Arturo Tejada; Arnold J. den Dekker
Frank's observation that a TEM bright-field image acquired under non-stationary conditions can be modeled by the time integral of the standard TEM image model [J. Frank, Nachweis von Objektbewegungen im lichtoptischen Diffraktogramm von elektronenmikroskopischen Aufnahmen, Optik 30 (2) (1969) 171-180] is re-derived here using counting statistics based on the Poisson binomial distribution. The approach yields a statistical image model that is suitable for image analysis and simulation.
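An illustrative sketch of the counting-statistics idea: each of N incident electrons is detected at a given pixel with a time-dependent probability p(t_i), so the pixel count follows a Poisson binomial distribution whose mean is the sum (time integral) of p(t). The sinusoidal drift below is an assumed stand-in for non-stationary imaging conditions.

```python
import numpy as np

rng = np.random.default_rng(4)

N = 100_000
t = np.linspace(0.0, 1.0, N)
p_t = 0.2 + 0.1 * np.sin(2.0 * np.pi * t)   # hypothetical drifting detection probability
detected = rng.random(N) < p_t              # one Bernoulli trial per electron

# The empirical detection rate approximates the time average of p(t), i.e. 0.2.
print(round(detected.mean(), 3))
```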