Publications

Featured research published by L.J. van Vliet.


Journal of Microscopy | 1997

A quantitative comparison of image restoration methods for confocal microscopy

G.M.P. van Kempen; L.J. van Vliet; Peter J. Verveer; H.T.M. van der Voort

In this paper, we compare the performance of three iterative methods for image restoration: the Richardson–Lucy algorithm, the iterative constrained Tikhonov–Miller algorithm (ICTM) and the Carrington algorithm. Both the ICTM and the Carrington algorithm are based on an additive Gaussian noise model, but differ in the way they incorporate the non‐negativity constraint. Under low light‐level conditions, this additive (Gaussian) noise model is a poor description of the actual photon‐limited image recording, compared with that of a Poisson process. The Richardson–Lucy algorithm is a maximum likelihood estimator for the intensity of a Poisson process. We have studied various methods for determining the regularization parameter of the ICTM and the Carrington algorithm and propose a (Gaussian) prefiltering to reduce the noise sensitivity of the Richardson–Lucy algorithm. The results of these algorithms are compared on spheres convolved with a point spread function and distorted by Poisson noise. Our simulations show that the Richardson–Lucy algorithm, with Gaussian prefiltering, produces the best result in most of the tests. Finally, we show an example of how restoration methods can improve quantitative analysis: the total amount of fluorescence inside a closed object is measured in the vicinity of another object before and after restoration.
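
The winning combination can be made concrete in a few lines. Below is a minimal sketch of Richardson–Lucy deconvolution with the Gaussian prefiltering step the paper advocates, using NumPy/SciPy; the iteration count, prefilter sigma, and flat initialization are illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, n_iter=50, prefilter_sigma=1.0):
    """Richardson-Lucy deconvolution of a 2-D image with optional
    Gaussian prefiltering to reduce the noise sensitivity of the
    iteration, as proposed in the paper."""
    image = np.asarray(image, dtype=float)
    if prefilter_sigma > 0:
        image = gaussian_filter(image, prefilter_sigma)
    psf = np.asarray(psf, dtype=float)
    psf /= psf.sum()                        # PSF must integrate to one
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full_like(image, image.mean())  # flat initial guess
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode='same')
        ratio = image / (blurred + 1e-12)   # guard against division by zero
        estimate *= fftconvolve(ratio, psf_mirror, mode='same')
    return estimate
```

Note that the multiplicative update keeps the estimate non-negative automatically, which is why Richardson–Lucy needs no explicit non-negativity constraint of the kind the ICTM and Carrington algorithms must incorporate.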


International Conference on Pattern Recognition | 1998

Recursive Gaussian derivative filters

L.J. van Vliet; Ian T. Young; P.W. Verbeek

We propose a strategy to design recursive implementations of the Gaussian filter and Gaussian regularized derivative filters. Each recursive filter consists of a cascade of two stable Nth-order subsystems (causal and anti-causal). The computational complexity is 2N multiplications per pixel per dimension, independent of the size (σ) of the Gaussian kernel. The filter coefficients have a closed-form solution as a function of scale (σ) and recursion order N (N = 3, 4, 5). The recursive filters yield a high accuracy and excellent isotropy in n-D space.
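
To make the causal/anti-causal cascade concrete, here is a sketch of third-order recursive Gaussian smoothing using the coefficient fit from Young and van Vliet's earlier recursive Gaussian (Signal Processing, 1995); the closed-form coefficients and the derivative filters of this paper itself are not reproduced, and the zero-initialized boundaries are a simplification.

```python
import numpy as np

def recursive_gaussian_1d(x, sigma):
    """Gaussian smoothing as a cascade of a causal and an anti-causal
    third-order recursion, so the cost per sample is independent of
    sigma. Coefficient fit valid for sigma >= 0.5; zero initial
    conditions stand in for proper boundary handling."""
    if sigma >= 2.5:
        q = 0.98711 * sigma - 0.96330
    else:
        q = 3.97156 - 4.14554 * np.sqrt(1.0 - 0.26891 * sigma)
    b0 = 1.57825 + 2.44413 * q + 1.42810 * q**2 + 0.422205 * q**3
    b1 = 2.44413 * q + 2.85619 * q**2 + 1.26661 * q**3
    b2 = -(1.42810 * q**2 + 1.26661 * q**3)
    b3 = 0.422205 * q**3
    B = 1.0 - (b1 + b2 + b3) / b0

    x = np.asarray(x, dtype=float)
    n = len(x)
    w = np.zeros(n + 3)                     # forward (causal) pass
    for i in range(n):
        w[i + 3] = B * x[i] + (b1 * w[i + 2] + b2 * w[i + 1] + b3 * w[i]) / b0
    y = np.zeros(n + 3)                     # backward (anti-causal) pass
    for i in range(n - 1, -1, -1):
        y[i] = B * w[i + 3] + (b1 * y[i + 1] + b2 * y[i + 2] + b3 * y[i + 3]) / b0
    return y[:n]
```

For an n-D image the recursion is applied separately along each axis, which is where the fixed per-pixel, per-dimension cost quoted in the abstract comes from, in contrast to FIR convolution whose kernel length grows with σ.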


Computer Vision, Graphics, and Image Processing | 1989

A nonlinear Laplace operator as edge detector in noisy images

L.J. van Vliet; Ian T. Young; G.L. Beckers

An edge detection scheme is developed that is robust enough to perform well over a wide range of signal-to-noise ratios. It is based upon the detection of zero crossings in the output image of a nonlinear Laplace filter. Specific characterizations of the nonlinear Laplacian are its adaptive orientation to the direction of the gradient and its inherent masks, which permit the development of approximately circular (isotropic) filters. We have investigated the relation between the locally optimal filter parameters (smoothing size and filter size) and the SNR of the image to be processed. A quantitative evaluation shows that our edge detector performs at least as well as, and in most cases much better than, other edge detectors. At very low signal-to-noise ratios, our edge detector is superior to all others tested.
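
A minimal sketch of the idea, assuming the common dilation/erosion form of the nonlinear Laplacian, max + min − 2f over a local window; the square window and the smoothing sigma below are illustrative stand-ins for the paper's approximately circular masks and SNR-tuned parameters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter, minimum_filter

def nonlinear_laplace_edges(image, smooth_sigma=1.0, size=5):
    """Edges as zero crossings of a nonlinear Laplace filter, taken
    here in the dilation/erosion form max + min - 2*f."""
    f = gaussian_filter(np.asarray(image, dtype=float), smooth_sigma)
    nl = maximum_filter(f, size) + minimum_filter(f, size) - 2.0 * f
    s = np.sign(nl)
    edges = np.zeros(f.shape, dtype=bool)   # sign changes between neighbours
    edges[:, 1:] |= (s[:, 1:] * s[:, :-1]) < 0
    edges[1:, :] |= (s[1:, :] * s[:-1, :]) < 0
    return edges
```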


Journal of Physics: Conference Series | 2008

Robust super-resolution without regularization

T.Q. Pham; L.J. van Vliet; Klamer Schutte

Super-resolution restoration is the problem of restoring a high-resolution scene from multiple degraded low-resolution images under motion. Due to imaging blur and noise, this problem is ill-posed. Additional constraints, such as smoothness of the solution (i.e. regularization), are often required to obtain a stable solution. While regularizing the cost function is a standard practice in image restoration, we propose a restoration algorithm that does not require this extra regularization term. The robustness of the algorithm is achieved by a robust error norm that does not respond to intensity outliers. With the outliers suppressed, our solution behaves similarly to a maximum-likelihood solution in the presence of Gaussian noise. The effectiveness of our algorithm is demonstrated with super-resolution restoration of real infrared image sequences under severe aliasing and intensity outliers.
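
The key mechanism, a redescending robust error norm, can be sketched independently of the full super-resolution pipeline. The Gaussian error norm below is one plausible choice (the paper's exact norm and scale selection may differ), and simulate_low_res / backproject are hypothetical helpers standing in for the imaging model.

```python
import numpy as np

def robust_weights(residuals, scale):
    """Per-pixel weights proportional to psi(r)/r for the Gaussian error
    norm rho(r) = 1 - exp(-r^2 / (2 * scale^2)). The influence function
    psi = rho' redescends to zero for large residuals, so intensity
    outliers simply drop out of the update."""
    r = np.asarray(residuals, dtype=float) / scale
    return np.exp(-0.5 * r**2)

# One iteratively-reweighted gradient step of a hypothetical SR loop
# (simulate_low_res and backproject stand in for the imaging model):
#   residuals = simulate_low_res(hr_estimate) - observed_low_res
#   hr_estimate -= step * backproject(robust_weights(residuals, s) * residuals)
```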


Journal of Microscopy | 1998

Theory of confocal fluorescence imaging in the Programmable Array Microscope (PAM)

Peter J. Verveer; Quentin S. Hanley; P.W. Verbeek; L.J. van Vliet; Thomas M. Jovin

The programmable array microscope (PAM) uses a spatial light modulator (SLM) to generate an arbitrary pattern of conjugate illumination and detection elements. The SLM dissects the fluorescent light imaged by the objective into a focal conjugate image, Ic, formed by the ‘in‐focus’ light, and a nonconjugate image, Inc, formed by the ‘out‐of‐focus’ light. We discuss two different schemes for confocal imaging using the PAM. In the first, a grid of points is shifted to scan the complete image. The second, faster approach, uses a short tiled pseudorandom sequence of two‐dimensional patterns. In the first case, Ic is analogous to a confocal image and Inc to a conventional image minus Ic. In the second case Ic and Inc are the sum and the difference, respectively, of a conventional and a confocal image. The pseudorandom sequence approach requires post‐processing to retrieve the confocal part, but generates significantly higher signal levels for an equivalent integration time.
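
In the pseudorandom-sequence mode, the post-processing described above reduces to a pixelwise half-difference; a sketch, assuming ideal, gain-matched images and ignoring the calibration a real instrument would need.

```python
import numpy as np

# Per the abstract, in the pseudorandom-sequence mode:
#   I_c  = I_conventional + I_confocal
#   I_nc = I_conventional - I_confocal
def retrieve_confocal(I_c, I_nc):
    """Recover the confocal component from the conjugate (I_c) and
    nonconjugate (I_nc) images by a pixelwise half-difference."""
    return 0.5 * (np.asarray(I_c, dtype=float) - np.asarray(I_nc, dtype=float))
```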


Journal of Microscopy | 1997

Reconstruction of optical pathlength distributions from images obtained by a wide-field differential interference contrast microscope

E.B. van Munster; L.J. van Vliet; Jacob A. Aten

An image processing algorithm is presented to reconstruct optical pathlength distributions from images of nonabsorbing weak phase objects, obtained by a differential interference contrast (DIC) microscope, equipped with a charge‐coupled device camera. The method is demonstrated on DIC images of transparent latex spheres and unstained bovine spermatozoa. The images were obtained with a wide‐field DIC microscope, using monochromatic light. After image acquisition, the measured intensities were converted to pathlength differences. Filtering in the Fourier domain was applied to correct for the typical shadow‐cast effect of DIC images. The filter was constructed using the lateral shift introduced in the microscope, and parameters describing the spectral distribution of the signal‐to‐noise ratio. By varying these parameters and looking at the resulting images, an appropriate setting for the filter parameters was found. In the reconstructed image each grey value represents the optical pathlength at that particular location, enabling quantitative analysis of object parameters using standard image processing techniques. The advantage of using interferometric techniques is that measurements can be done on transparent objects, without staining, enabling observations on living cells. Quantitative use of images obtained by a wide‐field DIC microscope becomes possible with this technique, using relatively simple means.
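
A sketch of the Fourier-domain correction, assuming a pure lateral-difference model of the shadow-cast effect and a scalar noise-to-signal ratio in place of the paper's spectral SNR parameters; both simplifications are assumptions, not the paper's exact filter.

```python
import numpy as np

def reconstruct_pathlength(dic, shear_px, nsr=1e-2):
    """Wiener-style inversion of the DIC shadow-cast effect.

    Models the converted DIC signal as the difference of two copies of
    the pathlength image sheared by `shear_px` pixels along x, so the
    transfer function is H(u) = 2i sin(pi * u * shear_px). The DC term
    (H = 0) is unrecoverable and is suppressed by the regularizer."""
    dic = np.asarray(dic, dtype=float)
    D = np.fft.fft2(dic)
    u = np.fft.fftfreq(dic.shape[1])        # cycles/pixel along x
    H = 2j * np.sin(np.pi * u * shear_px)[None, :]
    Phi = D * np.conj(H) / (np.abs(H)**2 + nsr)
    return np.fft.ifft2(Phi).real
```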


Cytometry | 2000

Mean and variance of ratio estimators used in fluorescence ratio imaging

G.M.P. van Kempen; L.J. van Vliet

Background: The ratio of two measured fluorescence signals (called x and y) is used in different applications in fluorescence microscopy. Multiple instances of both signals can be combined in different ways to construct different ratio estimators. Methods: The mean and variance of three estimators for the ratio between two random variables, x and y, are discussed. Given n samples of x and y, we can intuitively construct two different estimators: the mean of the individual ratios x/y, and the ratio of the mean of x to the mean of y. The former is biased and the latter is only asymptotically unbiased. Using the statistical characteristics of the latter estimator, a third, unbiased estimator can be constructed. Results: We tested the three estimators on simulated data, real-world fluorescence test images, and comparative genome hybridization (CGH) data. The results on the simulated and real-world test images confirm the presented theory. The CGH experiments show that our new estimator performs better than the existing estimators. Conclusions: We have derived an unbiased ratio estimator that outperforms the intuitive ratio estimators.
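
The two intuitive estimators, and a first-order bias correction of the second one, can be compared on simulated Poisson data. The correction below is the standard delta-method form for independent x and y; it is an assumption for illustration, not necessarily the exact unbiased estimator derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x = rng.poisson(50.0, n).astype(float)   # two simulated fluorescence signals
y = rng.poisson(100.0, n).astype(float)

r1 = np.mean(x / y)                      # mean of ratios: biased
r2 = np.mean(x) / np.mean(y)             # ratio of means: asymptotically unbiased

# Delta-method expansion for independent x, y:
#   E[r2] ~ R * (1 + var(y) / (n * mean(y)^2)),
# so dividing out the bracket gives a first-order bias-corrected estimate.
r3 = r2 / (1.0 + np.var(y, ddof=1) / (n * np.mean(y)**2))
```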


Computer Vision and Pattern Recognition | 1999

Edge preserving orientation adaptive filtering

P. Bakker; L.J. van Vliet; P.W. Verbeek

In this paper we describe a new strategy for combining orientation adaptive filtering and edge preserving filtering. The filter adapts to the local orientation and avoids filtering across borders. The local orientation for steering the filter is estimated in a fixed-size window that never contains two differently oriented structures. This can be achieved using generalized Kuwahara filtering, which selects, from a set of fixed-size windows containing the current pixel, the orientation of the window with the highest anisotropy. We compare our filter strategy with a multi-scale approach and find that our strategy has a lower complexity and yields a consistent improvement of the SNR.
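
A simplified sketch of the selection step, assuming structure-tensor orientation and anisotropy per fixed-size window; the wrap-around border handling via np.roll is a shortcut, and the details of the paper's generalized Kuwahara filter may differ.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def kuwahara_orientation(image, window=9):
    """Per pixel, take the orientation of the most anisotropic window
    among all fixed-size windows that contain the pixel."""
    f = np.asarray(image, dtype=float)
    gy, gx = np.gradient(f)
    # Structure tensor components, averaged over fixed-size windows.
    jxx = uniform_filter(gx * gx, window)
    jxy = uniform_filter(gx * gy, window)
    jyy = uniform_filter(gy * gy, window)
    diff = np.sqrt((jxx - jyy)**2 + 4.0 * jxy**2)
    aniso = diff / (jxx + jyy + 1e-12)       # (l1 - l2) / (l1 + l2)
    orient = 0.5 * np.arctan2(2.0 * jxy, jxx - jyy)
    # Scan all window centers within the window radius of each pixel.
    best_aniso = np.full(f.shape, -1.0)
    best_orient = np.zeros(f.shape)
    r = window // 2
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            a = np.roll(aniso, (dy, dx), axis=(0, 1))
            o = np.roll(orient, (dy, dx), axis=(0, 1))
            sel = a > best_aniso
            best_aniso[sel] = a[sel]
            best_orient[sel] = o[sel]
    return best_orient, best_aniso
```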


Signal Processing | 1988

Low-level image processing by max–min filters

P.W. Verbeek; H.A. Vrooman; L.J. van Vliet

A systematic framework is given that accommodates existing max-min filter methods and suggests new ones. Putting the upper and lower envelopes UPP = MIN(MAX) and LOW = MAX(MIN) in the roles that MAX, MIN, or the original image play in existing filters, we can distinguish between ramp edges and texture (or noise) edges; all methods presented come in three versions: for edges, for ramp edges, and for non-ramp (“texture”) edges. The ramp versions of Philips dynamic thresholding and Lee edge detection are considerably less noise sensitive. For images with little noise, the texture version of dynamic thresholding brings out fine textures while ignoring ramps. Lee edge detection can in all versions be extended to a sharp “Laplacian” and an edge enhancer. Starting out from the square-full filter, several shapes of the maximum filter are tried out. The round-full filter gives the fewest artefacts; when crescent updating is used, it takes size-linear rather than size-quadratic time. The suboptimal round-sparse filter takes size-independent time.
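
The envelopes at the heart of the framework are compositions of running max and min filters (a closing and an opening, in morphological terms). A minimal sketch with one of the filter roles, dynamic thresholding, using an illustrative square window:

```python
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

def envelopes(f, size=5):
    """UPP = MIN(MAX) and LOW = MAX(MIN) with a square window
    (the paper also studies round and sparse filter shapes)."""
    f = np.asarray(f, dtype=float)
    upp = minimum_filter(maximum_filter(f, size), size)  # closing
    low = maximum_filter(minimum_filter(f, size), size)  # opening
    return upp, low

def dynamic_threshold(f, size=5):
    """Threshold against the local mid-range of the envelopes, one of
    the filter roles accommodated by the framework."""
    upp, low = envelopes(f, size)
    return np.asarray(f, dtype=float) > 0.5 * (upp + low)
```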


Medical Image Analysis | 2013

Standardized evaluation framework for evaluating coronary artery stenosis detection, stenosis quantification and lumen segmentation algorithms in computed tomography angiography

Hortense A. Kirisli; Michiel Schaap; Coert Metz; Anoeshka S. Dharampal; W. B. Meijboom; S. L. Papadopoulou; Admir Dedic; Koen Nieman; M. A. de Graaf; M. F. L. Meijs; M. J. Cramer; Alexander Broersen; Suheyla Cetin; Abouzar Eslami; Leonardo Flórez-Valencia; Kuo-Lung Lor; Bogdan J. Matuszewski; I. Melki; B. Mohr; Ilkay Oksuz; Rahil Shahzad; Chunliang Wang; Pieter H. Kitslaar; Gözde B. Ünal; Amin Katouzian; Maciej Orkisz; Chung-Ming Chen; Frédéric Precioso; Laurent Najman; S. Masood

Though conventional coronary angiography (CCA) has been the standard of reference for diagnosing coronary artery disease in the past decades, computed tomography angiography (CTA) has rapidly emerged and is nowadays widely used in clinical practice. Here, we introduce a standardized evaluation framework to reliably evaluate and compare the performance of algorithms devised to detect and quantify coronary artery stenoses, and to segment the coronary artery lumen in CTA data. The objective of this evaluation framework is to demonstrate the feasibility of dedicated algorithms to: (1) (semi-)automatically detect and quantify stenosis on CTA, in comparison with quantitative coronary angiography (QCA) and CTA consensus reading, and (2) (semi-)automatically segment the coronary lumen on CTA, in comparison with experts' manual annotations. A database consisting of 48 multicenter multivendor cardiac CTA datasets with corresponding reference standards is described and made available. The algorithms from 11 research groups were quantitatively evaluated and compared. The results show that (1) some of the current stenosis detection/quantification algorithms may be used for triage or as a second reader in clinical practice, and that (2) automatic lumen segmentation is possible with a precision similar to that obtained by experts. The framework is open for new submissions through the website, at http://coronary.bigr.nl/stenoses/.

Collaboration

Top co-authors of L.J. van Vliet, all at Delft University of Technology:

P.W. Verbeek
Frans M. Vos
Ian T. Young
L.R. van den Doel
Bernd Rieger
G.M.P. van Kempen
M. van Ginkel
C.L. Luengo Hendriks
C. van Wijk
Koenraad A. Vermeer