Alex Sawatzky
University of Münster
Publications
Featured research published by Alex Sawatzky.
IEEE Nuclear Science Symposium | 2008
Alex Sawatzky; Christoph Brune; Frank Wübbeling; Thomas Kösters; Klaus P. Schäfers; Martin Burger
PET measurements of tracers with a low dose rate or a short radioactive half-life suffer from extremely low SNRs. In these cases, standard reconstruction methods (OSEM, EM, filtered backprojection) deliver unsatisfactory and noisy results. Here, we propose to introduce nonlinear variational methods into the reconstruction process to make efficient use of a priori information and to attain improved imaging results. We illustrate our technique by evaluating cardiac H₂¹⁵O measurements. The general approach can also be used for other specific goals, allowing a priori information about the solution to be incorporated with Poisson-distributed data.
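A minimal sketch of the variational model behind this approach, in the notation commonly used for it (K the PET forward operator, f the measured data, α > 0 a regularization parameter):

\min_{u \geq 0} \; \int_\Omega \left( Ku - f \log Ku \right) d\mu \; + \; \alpha \, |u|_{BV(\Omega)}

The first term is the negative Poisson log-likelihood minimized by classical ML-EM; the total variation seminorm encodes the a priori preference for piecewise constant, sharply edged solutions.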
International Journal of Computer Vision | 2011
Christoph Brune; Alex Sawatzky; Martin Burger
Measurements in nanoscopic imaging suffer from blurring effects modeled with different point spread functions (PSFs). Some apparatuses even have PSFs that are locally dependent on phase shifts. Additionally, raw data are affected by Poisson noise resulting from laser sampling and “photon counts” in fluorescence microscopy. In these applications, standard reconstruction methods (EM, filtered backprojection) deliver unsatisfactory and noisy results. Starting from a statistical modeling in terms of maximum a posteriori (MAP) estimation, we combine the iterative EM algorithm with total variation (TV) regularization techniques to make efficient use of a priori information. Typically, TV-based methods deliver reconstructed cartoon images suffering from contrast reduction. We propose extensions to EM-TV, based on Bregman iterations and primal and dual inverse scale space methods, in order to obtain improved imaging results by simultaneous contrast enhancement. Besides further generalizations of the primal and dual scale space methods in terms of general convex variational regularization methods, we provide error estimates and convergence rates for exact and noisy data. We illustrate the performance of our techniques on synthetic and experimental biological data.
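As a sketch of the contrast-enhancing Bregman extension (assuming J(u) = α|u|_{BV}, a Kullback-Leibler data term D_{KL}, and p^k ∈ ∂J(u^k) a subgradient from the previous step), each iterate solves

u^{k+1} \in \arg\min_{u \geq 0} \; D_{KL}(Ku, f) + J(u) - \langle p^k, u - u^k \rangle

i.e., the regularizer is replaced by its Bregman distance to the last iterate. Successive steps reintroduce the fine-scale contrast that a single TV-regularized solve suppresses, which is the inverse scale space behavior referred to above.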
Computer Analysis of Images and Patterns | 2009
Alex Sawatzky; Christoph Brune; Jahn Müller; Martin Burger
This paper deals with denoising of density images with bad Poisson statistics (low count rates), where reconstructing the major structures seems the only reasonable task. Obtaining the structures with sharp edges can also be a prerequisite for further processing, e.g., segmentation of objects. A variety of approaches exists in the case of Gaussian noise, but only a few in the Poisson case. We propose some total variation (TV) based regularization techniques adapted to the case of Poisson data, which we derive from approximations of logarithmic a posteriori probabilities. In order to guarantee sharp edges, we avoid smoothing the total variation and use a dual approach for the numerical solution. We illustrate and test the feasibility of our approaches on data from positron emission tomography, namely reconstructions of cardiac structures with ¹⁸F-FDG and H₂¹⁵O tracers, respectively.
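A sketch of the denoising model derived this way (f the noisy density image, α > 0): the approximated log-a-posteriori probability leads to

\min_{u > 0} \; \int_\Omega \left( u - f \log u \right) dx \; + \; \alpha \, |u|_{BV(\Omega)}

which, via its optimality condition, reduces to weighted ROF-type subproblems solvable with a Chambolle-style projected dual iteration; no smoothed surrogate of the TV seminorm is needed.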
Physics in Medicine and Biology | 2014
Qiaofeng Xu; Alex Sawatzky; Mark A. Anastasio; Carsten Oliver Schirra
The development of spectral computed tomography (CT) using binned photon-counting detectors has garnered great interest in recent years and has enabled selective imaging of K-edge materials. A practical challenge in CT image reconstruction of K-edge materials is the mitigation of image artifacts that arise from reduced-view and/or noisy decomposed sinogram data. In this note, we describe and investigate sparsity-regularized penalized weighted least squares-based image reconstruction algorithms for reconstructing K-edge images from few-view decomposed K-edge sinogram data. To exploit the inherent sparseness of typical K-edge images, we investigate the use of a total variation (TV) penalty and a weighted sum of a TV penalty and an ℓ1-norm with a wavelet sparsifying transform. Computer-simulation and experimental phantom studies are conducted to quantitatively demonstrate the effectiveness of the proposed reconstruction algorithms.
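Sketched as a formula (with assumed notation: x the K-edge image, H the system matrix, y the decomposed sinogram data, W a statistical weighting matrix, Ψ the wavelet sparsifying transform), the investigated penalties give

\hat{x} = \arg\min_{x \geq 0} \; \tfrac{1}{2} \| y - Hx \|_W^2 + \lambda_1 \, TV(x) + \lambda_2 \| \Psi x \|_1

where λ₂ = 0 recovers the pure TV penalty and λ₁, λ₂ > 0 the weighted sum described above.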
IEEE Transactions on Medical Imaging | 2014
Alex Sawatzky; Qiaofeng Xu; Carsten Oliver Schirra; Mark A. Anastasio
The development of spectral X-ray computed tomography (CT) using binned photon-counting detectors has received great attention in recent years and has enabled selective imaging of contrast agents loaded with K-edge materials. A practical issue in implementing this technique is the mitigation of the high noise levels often present in material-decomposed sinogram data. In this work, the spectral X-ray CT reconstruction problem is formulated within a multi-channel (MC) framework in which statistical correlations between the decomposed material sinograms can be exploited to improve image quality. Specifically, a MC penalized weighted least squares (PWLS) estimator is formulated in which the data fidelity term is weighted by the MC covariance matrix and sparsity-promoting penalties are employed. This allows the use of any number of basis materials and is therefore applicable to photon-counting systems and K-edge imaging. To overcome numerical challenges associated with use of the full covariance matrix as a data fidelity weight, a proximal variant of the alternating direction method of multipliers is employed to minimize the MC PWLS objective function. Computer-simulation and experimental phantom studies are conducted to quantitatively evaluate the proposed reconstruction method.
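As a sketch in formulas (assumed notation: x the stacked basis-material images, H the block forward operator, y the stacked decomposed sinograms, Σ the MC covariance matrix, R_m the sparsity-promoting penalties):

\hat{x} = \arg\min_{x} \; \tfrac{1}{2} (y - Hx)^{T} \Sigma^{-1} (y - Hx) + \sum_m \lambda_m R_m(x_m)

The off-diagonal blocks of Σ carry the inter-sinogram correlations the method exploits; the proximal ADMM variant is what makes the full, non-diagonal weight numerically tractable.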
arXiv: Optimization and Control | 2016
Martin Burger; Alex Sawatzky; Gabriele Steidl
The success of non-smooth variational models in image processing is heavily based on efficient algorithms. Taking into account the specific structure of the models as a sum of different convex terms, splitting algorithms are an appropriate choice. Their strength lies in splitting the original problem into a sequence of smaller proximal problems which are easy and fast to compute.
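The basic building block, sketched (g convex, step size τ > 0): the proximal map

\mathrm{prox}_{\tau g}(v) = \arg\min_{u} \; g(u) + \tfrac{1}{2\tau} \| u - v \|^2

For a model \min_u f(u) + g(u) with f smooth, the forward-backward iteration u^{k+1} = \mathrm{prox}_{\tau g}\big(u^k - \tau \nabla f(u^k)\big) is the simplest instance; ADMM and primal-dual methods arise from other splittings of the same sum structure.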
Archive | 2013
Alex Sawatzky; Christoph Brune; Thomas Kösters; Frank Wübbeling; Martin Burger
We address the task of reconstructing images corrupted by Poisson noise, which is important in various applications such as fluorescence microscopy (Dey et al., 3D microscopy deconvolution using Richardson-Lucy algorithm with total variation regularization, 2004), positron emission tomography (PET; Vardi et al., J Am Stat Assoc 80:8–20, 1985), and astronomical imaging (Lanteri and Theys, EURASIP J Appl Signal Processing 15:2500–2513, 2005). Here we focus on reconstruction strategies combining the expectation-maximization (EM) algorithm with total variation (TV) based regularization, and present a detailed analysis as well as numerical results. Recently, extensions of the well-known EM/Richardson-Lucy algorithm have received increasing attention for inverse problems with Poisson data (Dey et al., 2004; Jonsson et al., Total variation regularization in positron emission tomography, 1998; Panin et al., IEEE Trans Nucl Sci 46(6):2202–2210, 1999). However, most of these algorithms for regularizers like TV lead to convergence problems for large regularization parameters, cannot guarantee positivity, and rely on additional approximations (like smoothed TV). The goal of this lecture is to provide accurate, robust and fast EM-TV based methods for computing cartoon reconstructions, facilitating post-segmentation and providing a basis for quantification techniques. We also illustrate the performance of the proposed algorithms and confirm the analytical concepts with 2D and 3D synthetic and real-world results in optical nanoscopy and PET.
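A sketch of the alternating scheme analyzed in this lecture (same notation as in the 2008 entry above; 1 denotes the constant one function): an EM half-step followed by a weighted TV denoising step,

u^{k+1/2} = \frac{u^k}{K^* 1} \, K^* \!\left( \frac{f}{K u^k} \right), \qquad u^{k+1} = \arg\min_{u} \; \int_\Omega \frac{(u - u^{k+1/2})^2}{2\, u^k} \, dx + \alpha \, |u|_{BV(\Omega)}

The weighting by the previous iterate u^k preserves positivity and keeps the TV subproblem solvable by duality, without any smoothed approximation of the TV term.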
IEEE Nuclear Science Symposium | 2011
Jahn Müller; Christoph Brune; Alex Sawatzky; Thomas Kösters; Klaus P. Schäfers; Martin Burger
We propose a method for reconstructing data from short-time positron emission tomography (PET) scans, i.e., data acquired over a short time period. In this case, standard reconstruction methods deliver only unsatisfactory and noisy results. We incorporate a priori information directly in the reconstruction process via nonlinear variational methods. A promising approach is the so-called EM-TV algorithm, in which the negative log-likelihood functional minimized by the expectation-maximization (ML-EM) algorithm is modified by adding a total variation (TV) term. To improve the results and to overcome the loss of contrast, we extend the algorithm by an inverse scale space method using Bregman distances, to which we refer as the Bregman EM-TV algorithm. The methods are tested on short-time (5 and 30 second) FDG measurements of the thorax. We show that the EM-TV approach can effectively reduce the noise but still introduces oversmoothing, which is eliminated by the Bregman EM-TV method, obtaining a reconstruction of quality comparable to the corresponding long-time (20 and 7 minute) scans. This correction for the loss of contrast is necessary to obtain quantitative PET images.
Medical Physics | 2016
Qiaofeng Xu; Deshan Yang; Jun Tan; Alex Sawatzky; Mark A. Anastasio
VCBM | 2012
Daniel Tenbrinck; Alex Sawatzky; Xiaoyi Jiang; Martin Burger; Wladimir Haffner; Patrick Willems; Matthias Paul; Jörg Stypmann