Markku Mäkitalo
Tampere University of Technology
Publications
Featured research published by Markku Mäkitalo.
IEEE Transactions on Image Processing | 2011
Markku Mäkitalo; Alessandro Foi
The removal of Poisson noise is often performed through the following three-step procedure. First, the noise variance is stabilized by applying the Anscombe root transformation to the data, producing a signal in which the noise can be treated as additive Gaussian noise with unit variance. Second, the noise is removed using a conventional denoising algorithm for additive white Gaussian noise. Third, an inverse transformation is applied to the denoised signal, obtaining the estimate of the signal of interest. The choice of the proper inverse transformation is crucial in order to minimize the bias error which arises when the nonlinear forward transformation is applied. We introduce optimal inverses for the Anscombe transformation, in particular the exact unbiased inverse, a maximum likelihood (ML) inverse, and a more sophisticated minimum mean square error (MMSE) inverse. We then present an experimental analysis using a few state-of-the-art denoising algorithms and show that the estimation can be consistently improved by applying the exact unbiased inverse, particularly in the low-count regime. This results in a very efficient filtering solution that is competitive with some of the best existing methods for Poisson image denoising.
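A minimal sketch of this three-step pipeline, assuming NumPy/SciPy; a Gaussian blur stands in for the AWGN denoiser (the paper pairs the transformation with state-of-the-art filters such as BM3D), and the inverse shown is the conventional asymptotically unbiased one, since the exact unbiased inverse introduced in the paper has no simple analytical form:

    import numpy as np
    from scipy.ndimage import gaussian_filter  # stand-in for a real AWGN denoiser

    def anscombe(z):
        # Forward Anscombe root transformation: Poisson data -> approximately
        # additive Gaussian noise with unit variance.
        return 2.0 * np.sqrt(z + 3.0 / 8.0)

    def inverse_anscombe_asymptotic(d):
        # Conventional asymptotically unbiased inverse, (D/2)^2 - 1/8;
        # noticeably biased at low counts, which is what the paper's
        # exact unbiased inverse corrects.
        return (d / 2.0) ** 2 - 1.0 / 8.0

    def denoise_poisson(z):
        d = anscombe(z)                        # 1) stabilize the variance
        d_hat = gaussian_filter(d, sigma=1.0)  # 2) any AWGN denoiser (BM3D in the paper)
        return inverse_anscombe_asymptotic(d_hat)  # 3) invert the stabilization

Replacing the last step with the exact unbiased inverse, i.e. the numerically evaluated mapping E[2*sqrt(z + 3/8) | y] = D solved for y, is what yields the low-count gains reported above.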
IEEE Transactions on Image Processing | 2013
Markku Mäkitalo; Alessandro Foi
Many digital imaging devices operate by successive photon-to-electron, electron-to-voltage, and voltage-to-digit conversions. These processes are subject to various signal-dependent errors, which are typically modeled as Poisson-Gaussian noise. The removal of such noise can be effected indirectly by applying a variance-stabilizing transformation (VST) to the noisy data, denoising the stabilized data with a Gaussian denoising algorithm, and finally applying an inverse VST to the denoised data. The generalized Anscombe transformation (GAT) is often used for variance stabilization, but its unbiased inverse transformation has not been rigorously studied in the past. We introduce the exact unbiased inverse of the GAT and show that it plays an integral part in ensuring accurate denoising results. We demonstrate that this exact inverse leads to state-of-the-art results without any notable increase in the computational complexity compared to the other inverses. We also show that this inverse is optimal in the sense that it can be interpreted as a maximum likelihood inverse. Moreover, we thoroughly analyze the behavior of the proposed inverse, which also enables us to derive a closed-form approximation for it. This paper generalizes our work on the exact unbiased inverse of the Anscombe transformation, which we have presented earlier for the removal of pure Poisson noise.
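As an illustration of the forward transformation (the function name and the clipping of the square-root argument are illustrative choices, not taken from the paper), the GAT for the Poisson-Gaussian model z = alpha*p + n, with p ~ Poisson(y) and n ~ N(mu, sigma^2), is commonly written as follows:

    import numpy as np

    def generalized_anscombe(z, alpha, sigma, mu=0.0):
        # Generalized Anscombe transformation (GAT) for Poisson-Gaussian data:
        # after the transform, the noise is approximately additive Gaussian
        # with unit variance.
        arg = alpha * z + (3.0 / 8.0) * alpha ** 2 + sigma ** 2 - alpha * mu
        return (2.0 / alpha) * np.sqrt(np.maximum(arg, 0.0))

The exact unbiased inverse introduced in the paper again has no elementary closed form; it is obtained by numerically evaluating the expectation of the forward transform conditional on the true signal value.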
2009 International Workshop on Local and Non-Local Approximation in Image Processing | 2009
Markku Mäkitalo; Alessandro Foi
The removal of Poisson noise is often performed through the following three-step procedure. First, the noise variance is stabilized by applying the Anscombe root transformation to the data, producing a signal in which the noise can be treated as additive Gaussian noise with unit variance. Second, the noise is removed using a conventional denoising algorithm for additive white Gaussian noise. Third, an inverse transformation is applied to the denoised signal, obtaining the estimate of the signal of interest. The choice of the proper inverse transformation is crucial in order to minimize the bias error which arises when the nonlinear forward transformation is applied. We present an experimental analysis using a few state-of-the-art denoising algorithms and show that the estimation can be consistently improved by applying the exact unbiased inverse, particularly in the low-count regime.
IEEE Signal Processing Letters | 2011
Markku Mäkitalo; Alessandro Foi
We presented an exact unbiased inverse of the Anscombe variance-stabilizing transformation in [M. Mäkitalo and A. Foi, “Optimal inversion of the Anscombe transformation in low-count Poisson image denoising,” IEEE Trans. Image Process., vol. 20, no. 1, pp. 99-109, Jan. 2011.] and showed that, when applied to Poisson image denoising, the combination of variance stabilization and state-of-the-art Gaussian denoising algorithms is competitive with some of the best Poisson denoising algorithms. We also provided a MATLAB implementation of our method, where the exact unbiased inverse transformation appears in nonanalytical form. Here, we propose a closed-form approximation of the exact unbiased inverse in order to facilitate the use of this inverse. The proposed approximation produces results equivalent to those obtained with the accurate (nonanalytical) exact unbiased inverse, and thus notably better than those obtained with the asymptotically unbiased inverse transformation that is commonly used in applications.
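A sketch of this closed-form approximation, with the coefficients as given in the paper; it is valid for D > 0, and the handling of small or nonpositive arguments (e.g. clipping) is an assumption here rather than the paper's prescription:

    import numpy as np

    def exact_unbiased_inverse_approx(d):
        # Closed-form approximation of the exact unbiased inverse of the
        # Anscombe transformation:
        # D^2/4 + sqrt(3/2)/4 * D^-1 - 11/8 * D^-2 + 5/8 * sqrt(3/2) * D^-3 - 1/8
        return (0.25 * d ** 2
                + 0.25 * np.sqrt(1.5) / d
                - 1.375 / d ** 2
                + 0.625 * np.sqrt(1.5) / d ** 3
                - 0.125)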
IEEE Transactions on Image Processing | 2014
Markku Mäkitalo; Alessandro Foi
In digital imaging, there is often a need to produce estimates of the parameters that define the chosen noise model. We investigate how the mismatch between the estimated and true parameter values affects the variance stabilization of signal-dependent noise. As a practical application of the general theoretical considerations, we devise a novel approach for estimating Poisson-Gaussian noise parameters from a single image, combining variance stabilization and noise estimation for additive Gaussian noise. Furthermore, we construct a simple algorithm implementing the devised approach. We observe that when combined with optimized rational variance-stabilizing transformations, the algorithm produces results that are competitive with those of a state-of-the-art Poisson-Gaussian estimator.
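The paper's approach goes through variance stabilization, but the identifiability of the parameters can be illustrated with a much simpler baseline (everything below is an illustrative sketch, not the paper's algorithm): for z = alpha*p + n with p ~ Poisson(y) and n ~ N(0, sigma^2), one has var(z | y) = alpha * E[z | y] + sigma^2, so a straight-line fit of local variances against local means recovers alpha as the slope and sigma^2 as the intercept:

    import numpy as np

    def estimate_poisson_gaussian_params(z, block=8):
        # Naive baseline: regress local variance on local mean over small
        # blocks, assuming the underlying signal is locally constant.
        h, w = z.shape
        means, variances = [], []
        for i in range(0, h - block + 1, block):
            for j in range(0, w - block + 1, block):
                patch = z[i:i + block, j:j + block]
                means.append(patch.mean())
                variances.append(patch.var(ddof=1))
        alpha, sigma2 = np.polyfit(means, variances, 1)  # slope, intercept
        return alpha, max(sigma2, 0.0)

Textured blocks inflate the local variances and bias such a fit, which is one reason more careful estimators, like the one devised in the paper, are needed in practice.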
international conference on mathematical methods in electromagnetic theory | 2010
Markku Mäkitalo; Alessandro Foi; Dmitriy V. Fevralev; Vladimir V. Lukin
Synthetic-aperture radar (SAR) imaging has become an efficient tool for retrieving useful information about the surfaces of Earth and other planets. However, the formed images suffer from speckle noise, especially when the single-look observation mode is used. Filtering is therefore often applied to improve image quality and provide better estimation of the radar cross-section and other parameters of the sensed scenes. Recently, a novel class of image filters has proved to be very successful in the removal of additive white Gaussian noise from natural images; these filters are based on nonlocal image modeling, i.e. they exploit the mutual self-similarity of image patches at different locations in the image. These filters have been shown in several benchmarks to significantly outperform all previous techniques. In this paper, we evaluate the performance of nonlocal filters applied to the denoising of single-look SAR images corrupted by speckle with a Rayleigh distribution, taking advantage of exact forward and inverse variance-stabilizing transformations. Numerical simulations demonstrate the success of this approach against several known despeckling methods.
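The exact Rayleigh variance-stabilizing transformations used in the paper are its specific contribution; as a simpler illustration of the same stabilize-denoise-invert idea, the classical homomorphic approach is sketched below (a Gaussian blur again stands in for the nonlocal filter, and the naive exponential inverse is deliberately shown to highlight the bias that exact inverses remove):

    import numpy as np
    from scipy.ndimage import gaussian_filter  # stand-in for a nonlocal filter

    def despeckle_homomorphic(z, eps=1e-6):
        # For multiplicative speckle z = y * s with Rayleigh-distributed s,
        # log z = log y + log s, and var(log s) = pi^2 / 24 regardless of the
        # Rayleigh scale, so an AWGN denoiser can operate in the log domain.
        d = np.log(z + eps)
        d_hat = gaussian_filter(d, sigma=1.0)
        # Naive inverse: biased, because E[exp(d_hat)] != exp(E[d_hat]).
        return np.exp(d_hat)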
international conference on acoustics, speech, and signal processing | 2012
Markku Mäkitalo; Alessandro Foi
The characteristic errors of many digital imaging devices can be modelled as Poisson-Gaussian noise, the removal of which can be approached indirectly through variance stabilization. The generalized Anscombe transformation (GAT) is commonly used for stabilization, but its unbiased inverse transformation has not been rigorously studied in the past. We introduce the exact unbiased inverse of the GAT, show that it is of essential importance for ensuring accurate denoising, and demonstrate that our approach leads to state-of-the-art results. This paper generalizes our earlier work, in which we presented an exact unbiased inverse of the Anscombe transformation for the case of pure Poisson noise removal.
Proceedings of SPIE | 2011
Markku Mäkitalo; Alessandro Foi
The block-matching and 3-D filtering (BM3D) algorithm is currently one of the most powerful and effective image denoising procedures. It exploits a specific nonlocal image modelling through grouping and collaborative filtering. Grouping finds mutually similar 2-D image blocks and stacks them together in 3-D arrays. Collaborative filtering produces individual estimates of all grouped blocks by filtering them jointly, through transform-domain shrinkage of the 3-D arrays (groups). BM3D can be combined with transform-domain alpha-rooting in order to simultaneously sharpen and denoise the image. Specifically, the thresholded 3-D transform-domain coefficients are modified by taking the alpha-root of their magnitude for some alpha > 1, thus amplifying the differences both within and between the grouped blocks. While one can use a constant (global) alpha throughout the entire image, further gains can be achieved by allowing different degrees of sharpening in different parts of the image, based on content-dependent information. We propose to vary the value of alpha used for sharpening a group through weighted estimates of the low-frequency, edge, and high-frequency content of the average block in the group. This is shown to be a viable approach for image sharpening, and in particular it can provide an improvement (both visually and in terms of PSNR) over its global non-adaptive counterpart.
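A sketch of the alpha-rooting step on the transform coefficients of one group; the normalization by the largest magnitude, which leaves the strongest coefficient unchanged, is an illustrative choice rather than the paper's exact scaling:

    import numpy as np

    def alpha_root(coeffs, alpha=1.5):
        # Map coefficient magnitudes to their alpha-th root, |c| -> |c|**(1/alpha),
        # preserving signs; with alpha > 1 this boosts small coefficients relative
        # to large ones, amplifying differences within and between blocks.
        mag = np.abs(coeffs)
        ref = mag.max()
        if ref == 0.0:
            return coeffs
        safe = np.where(mag > 0.0, mag, ref)  # avoid 0 ** negative exponent
        scale = (safe / ref) ** (1.0 / alpha - 1.0)
        return coeffs * scale

In the adaptive variant proposed here, alpha is chosen per group from the weighted low-frequency, edge, and high-frequency content of the group's average block; a single global alpha recovers the non-adaptive baseline.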
international joint conference on computer vision imaging and computer graphics theory and applications | 2018
Timo Viitanen; Matias Koskela; Kalle Immonen; Markku Mäkitalo; Pekka Jääskeläinen; Jarmo Takala
Ray tracing is an interesting rendering technique, but remains too slow for real-time applications. There are various algorithmic methods to speed up ray tracing through uneven screen-space sampling, e.g., foveated rendering, where sampling is directed by eye tracking. Uneven sampling methods tend to require at least one sample per pixel, limiting their use in real-time rendering. We review recent work on image reconstruction from arbitrarily distributed samples, and argue that such methods will play a major role in the future of real-time ray tracing, allowing a larger fraction of samples to be focused on regions of interest. Potential implementation approaches and challenges are discussed.
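As a toy illustration of gaze-driven uneven sampling (the Gaussian falloff and all names below are illustrative assumptions, not a method from the paper), a per-frame ray budget can be concentrated near the tracked gaze point, leaving the reconstruction of the remaining pixels to the scattered-data methods the review surveys:

    import numpy as np

    def foveated_samples(width, height, gaze, n_samples, spread=0.25, rng=None):
        # Draw sample positions with density decaying as a Gaussian of the
        # distance to the gaze point, so most rays land near the fovea.
        rng = np.random.default_rng() if rng is None else rng
        xs = rng.normal(gaze[0], spread * width, n_samples)
        ys = rng.normal(gaze[1], spread * height, n_samples)
        xs = np.clip(np.round(xs), 0, width - 1).astype(int)
        ys = np.clip(np.round(ys), 0, height - 1).astype(int)
        return xs, ys  # trace one ray per (x, y); reconstruct the other pixels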
Archive | 2013
Markku Mäkitalo