Riccardo Zanella
University of Ferrara
Publications
Featured research published by Riccardo Zanella.
Inverse Problems | 2009
Silvia Bonettini; Riccardo Zanella; Luca Zanni
A class of scaled gradient projection methods for optimization problems with simple constraints is considered. These iterative algorithms can be useful in variational approaches to image deblurring that lead to the minimization of convex nonlinear functions subject to non-negativity constraints and, in some cases, to an additional flux conservation constraint. A special gradient projection method is introduced that exploits effective scaling strategies and steplength updating rules, appropriately designed for improving the convergence rate. We give convergence results for this scheme and we evaluate its effectiveness by means of an extensive computational study on the minimization problems arising from the maximum likelihood approach to image deblurring. Comparisons with the standard expectation maximization algorithm and with other iterative regularization schemes are also reported to show the computational gain provided by the proposed method.
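A single iteration of such a scaled gradient projection scheme can be sketched as follows. This is a minimal illustration on a toy quadratic: the fixed steplength and identity scaling are placeholders for the Barzilai–Borwein-type steplength rules and problem-dependent scalings discussed in the paper, and `sgp_step` is a hypothetical helper name.

```python
import numpy as np

def sgp_step(x, grad_f, alpha, scaling, lower=0.0):
    """One scaled gradient projection step with a non-negativity
    constraint. A line search along the projected direction d
    (omitted here) would precede the update in the full method."""
    d = np.maximum(x - alpha * scaling * grad_f(x), lower) - x
    return x + d

# toy problem: f(x) = 0.5 * ||x - c||^2, so grad f(x) = x - c;
# the constrained minimizer is the projection of c onto x >= 0
c = np.array([1.0, -2.0, 3.0])
grad = lambda x: x - c
x = np.zeros(3)
for _ in range(50):
    x = sgp_step(x, grad, alpha=0.5, scaling=np.ones(3))
# x converges to [1, 0, 3]
```

The projection keeps every iterate feasible, so the scheme can be stopped at any iteration and still return a valid (non-negative) image.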
Inverse Problems | 2009
Riccardo Zanella; Patrizia Boccacci; Luca Zanni; M. Bertero
Several methods based on different image models have been proposed and developed for image denoising. Some of them, such as total variation (TV) and wavelet thresholding, are based on the assumption of additive Gaussian noise. Recently the TV approach has been extended to the case of Poisson noise, a model describing the effect of photon counting in applications such as emission tomography, microscopy and astronomy. For the removal of this kind of noise we consider an approach based on a constrained optimization problem, with an objective function describing TV and other edge-preserving regularizations of the Kullback–Leibler divergence. We introduce a new discrepancy principle for the choice of the regularization parameter, which is justified by the statistical properties of the Poisson noise. For solving the optimization problem we propose a particular form of a general scaled gradient projection (SGP) method, recently introduced for image deblurring. We derive the form of the scaling from a decomposition of the gradient of the regularization functional into a positive and a negative part. The beneficial effect of the scaling is proved by means of numerical simulations, showing that the performance of the proposed form of SGP is superior to that of the most efficient gradient projection methods. An extended numerical analysis of the dependence of the solution on the regularization parameter is also performed to test the effectiveness of the proposed discrepancy principle.
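The positive/negative decomposition that generates the scaling can be illustrated on a simple 1-D smoothness regularizer. This is a sketch with hypothetical boundary handling (the paper works with TV-type functionals on 2-D images): the gradient of R(x) = 0.5 * sum((x_i - x_{i-1})^2) is written as V(x) - U(x) with both parts non-negative whenever x >= 0.

```python
import numpy as np

def grad_split(x):
    """Split grad R into a positive part V and a subtracted part U,
    so that grad R(x) = V(x) - U(x), with V, U >= 0 for x >= 0."""
    left = np.roll(x, 1);   left[0] = x[0]     # replicate boundary
    right = np.roll(x, -1); right[-1] = x[-1]
    V = 2.0 * x          # positive part of the gradient
    U = left + right     # part subtracted from it
    return V, U

x = np.array([1.0, 3.0, 2.0, 5.0])
V, U = grad_split(x)
grad = V - U  # reproduces the true gradient of R at interior points
# the split-gradient scaling is then built from x / V (thresholded)
scaling = x / np.maximum(V, 1e-12)
```

Such a decomposition is not unique; the choice of V and U determines the scaling matrix and hence the practical behavior of the resulting SGP variant.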
Inverse Problems | 2010
M. Bertero; Patrizia Boccacci; G. Talenti; Riccardo Zanella; Luca Zanni
In applications of imaging science, such as emission tomography, fluorescence microscopy and optical/infrared astronomy, image intensity is measured via the counting of incident particles (photons, γ-rays, etc). Fluctuations in the emission-counting process can be described by modeling the data as realizations of Poisson random variables (Poisson data). A maximum-likelihood approach for image reconstruction from Poisson data was proposed in the mid-1980s. Since the consequent maximization problem is, in general, ill-conditioned, various kinds of regularizations were introduced in the framework of the so-called Bayesian paradigm. A modification of the well-known Tikhonov regularization strategy results in the data-fidelity function being a generalized Kullback–Leibler divergence. Then a relevant issue is to find rules for selecting a proper value of the regularization parameter. In this paper we propose a criterion, nicknamed discrepancy principle for Poisson data, that applies to both denoising and deblurring problems and fits quite naturally the statistical properties of the data. The main purpose of the paper is to establish conditions, on the data and the imaging matrix, ensuring that the proposed criterion does actually provide a unique value of the regularization parameter for various classes of regularization functions. A few numerical experiments are performed to demonstrate its effectiveness. More extensive numerical analysis and comparison with other proposed criteria will be the object of future work.
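In concrete terms, the criterion selects the regularization parameter so that the generalized KL discrepancy between the data and the model, normalized by the number of pixels, is close to 1. A minimal numerical sketch (the function and variable names are hypothetical, and the convention 0·log 0 = 0 is assumed):

```python
import numpy as np

def poisson_discrepancy(y, m):
    """Normalized generalized KL divergence 2*D_KL(y, m)/n between
    Poisson data y and model intensities m > 0."""
    n = y.size
    t = np.where(y > 0, y * np.log(np.where(y > 0, y / m, 1.0)), 0.0)
    return 2.0 * np.sum(t + m - y) / n

rng = np.random.default_rng(0)
lam = np.full(10000, 50.0)           # true intensities
y = rng.poisson(lam).astype(float)   # simulated Poisson data
# for the exact model the normalized discrepancy is close to 1,
# which is what the discrepancy principle exploits
val = poisson_discrepancy(y, lam)
```

In a deblurring problem one would evaluate this quantity with m = A x_beta + b for reconstructions at several values of beta and pick the beta whose discrepancy is closest to 1.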
Scientific Reports | 2013
Riccardo Zanella; Gaetano Zanghirati; Roberto Cavicchioli; Luca Zanni; Patrizia Boccacci; M. Bertero; Giuseppe Vicidomini
Although deconvolution can improve the quality of any type of microscope, its high computational cost has so far limited its widespread adoption. Here we demonstrate the ability of the scaled-gradient-projection (SGP) method to provide accelerated versions of the most used algorithms in microscopy. To achieve further increases in efficiency, we also consider implementations on graphics processing units (GPUs). We test the proposed algorithms on both synthetic and real data from confocal and STED microscopy. Combining the SGP method with the GPU implementation, we achieve speed-up factors ranging from about 25 to 690 with respect to the conventional algorithms. The excellent results obtained on STED microscopy images demonstrate the synergy between super-resolution techniques and image deconvolution. Furthermore, the real-time processing preserves one of the most important properties of STED microscopy, i.e. the ability to provide fast sub-diffraction-resolution recordings.
Genome Medicine | 2014
Marco Galasso; Paola Dama; Maurizio Previati; Sukhinder K. Sandhu; Jeff Palatini; Vincenzo Coppola; Sarah Warner; Maria Elena Sana; Riccardo Zanella; Ramzey Abujarour; Caroline Desponts; Michael A. Teitell; Ramiro Garzon; George A. Calin; Carlo M. Croce; Stefano Volinia
Background: There are 481 ultra-conserved regions (UCRs) longer than 200 bases in the genomes of human, mouse and rat. These DNA sequences are absolutely conserved and show 100% identity with no insertions or deletions. About half of these UCRs are reported as transcribed and many correspond to long non-coding RNAs (lncRNAs).
Methods: We used custom microarrays with 962 probes representing sense and antisense sequences for the 481 UCRs to examine their expression across 374 normal samples from 46 different tissues and 510 samples representing 10 different types of cancer. The expression in embryonic stem cells of selected UCRs was validated by real-time PCR.
Results: We identified tissue-selective UCRs and studied UCRs in embryonic and induced pluripotent stem cells. Among the normal tissues, the uc.283 lncRNA was highly specific for pluripotent stem cells. Intriguingly, the uc.283-plus lncRNA was highly expressed in some solid cancers, particularly in one of the most untreatable types, glioma.
Conclusion: Our results suggest that uc.283-plus lncRNA might have a role in pluripotency of stem cells and in the biology of glioma.
Journal of Global Optimization | 2010
Valeria Ruggiero; Thomas Serafini; Riccardo Zanella; Luca Zanni
The ability of the modern graphics processors to operate on large matrices in parallel can be exploited for solving constrained image deblurring problems in a short time. In particular, in this paper we propose the parallel implementation of two iterative regularization methods: the well known expectation maximization algorithm and a recent scaled gradient projection method. The main differences between the considered approaches and their impact on the parallel implementations are discussed. The effectiveness of the parallel schemes and the speedups over standard CPU implementations are evaluated on test problems arising from astronomical images.
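For reference, the expectation maximization algorithm in question is the Richardson–Lucy iteration for Poisson data. A serial 1-D NumPy sketch (the blur kernel, signal sizes, and boundary handling are hypothetical; the paper's implementations are 2-D and run on GPUs):

```python
import numpy as np

def em_step(x, y, kernel):
    """One Richardson-Lucy (EM) update x <- x * H^T(y / (H x)),
    with H a convolution by `kernel` (symmetric here, so the
    correlation with the flipped kernel equals H^T)."""
    Hx = np.convolve(x, kernel, mode="same")
    ratio = y / np.maximum(Hx, 1e-12)
    return x * np.convolve(ratio, kernel[::-1], mode="same")

kernel = np.array([0.25, 0.5, 0.25])          # toy blur
x_true = np.zeros(32); x_true[10] = 4.0; x_true[20] = 2.0
y = np.convolve(x_true, kernel, mode="same")  # noiseless data
x = np.ones(32)                               # flat initial guess
for _ in range(500):
    x = em_step(x, y, kernel)
# iterates stay non-negative and monotonically decrease the
# KL divergence between y and H x
```

The multiplicative form makes each update an elementwise operation plus two convolutions, which is exactly the structure that maps well onto the graphics hardware discussed in the paper.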
New Journal of Physics | 2013
Vincenzo Grillo; Lorenzo Marrucci; Ebrahim Karimi; Riccardo Zanella; Enrico Santamato
A proposal for an electron-beam device that can act as an efficient spin-polarization filter has been recently put forward (Karimi et al 2012 Phys. Rev. Lett. 108 044801). It is based on combining the recently developed diffraction technology for imposing orbital angular momentum to the beam with a multipolar Wien filter inducing a sort of artificial non-relativistic spin-orbit coupling. Here we reconsider the proposed device with a fully quantum-mechanical simulation of the electron-beam propagation, based on the well-established multi-slice method, supplemented with a Pauli term for taking into account the spin degree of freedom. Using this upgraded numerical tool, we study the feasibility and practical limitations of the proposed method for spin polarizing a free electron beam.
Inverse Problems | 2013
Riccardo Zanella; Patrizia Boccacci; Luca Zanni; M. Bertero
This file contains a complete proof of lemma 1 of the paper. The proof of lemma 1, given in appendix A of the paper [1], is incomplete because the upper bound (A.10) for e4(ξ) is correct only for ξ ≥ 0, and therefore it provides an upper bound only on the positive part of the expected value of the residual; no estimate of the negative part is given. Moreover, an attempt at completing the proof has shown that the approach proposed in [1] can be simplified. Since the lemma is the basis of a discrepancy principle for the reconstruction of Poisson data, introduced by the authors in that and subsequent papers [2, 3], we believe that it may be useful to provide a complete proof. Lemma 1. Let Yλ be a Poisson r.v. with expected value λ and consider the function of Yλ defined by Fλ(Yλ) = 2 ( Yλ ln(Yλ/λ) + λ − Yλ ).
Applied Mathematics and Computation | 2018
Riccardo Zanella; Federica Porta; Valeria Ruggiero; Massimo Zanetti
Because of its attractive features, second-order segmentation has been shown to be a promising tool in remote sensing. A known drawback of its implementation is the computational complexity, especially for large data sets. Recently, in Zanetti et al. [1], an efficient version of the block-coordinate descent algorithm (BCDA) has been proposed for the minimization of a second-order elliptic approximation of the Blake–Zisserman functional. Although the parallelization of linear algebra operations is expected to increase the performance of BCDA when addressing the segmentation of large-size gridded data (e.g., full-scene images or Digital Surface Models (DSMs)), numerical evidence shows that this is not sufficient to obtain a significant reduction of the computational time. Therefore, a novel approach is proposed which exploits a decomposition technique of the image domain into tiles. The solution can be computed by applying BCDA to each tile in parallel and combining the partial results corresponding to the different blocks of variables through a proper interconnection rule. We prove that this parallel method (OPARBCDA) generates a sequence of iterates which converges to a critical point of the functional on the level set determined by the starting point. Furthermore, we show that the parallel method can be efficiently implemented on a commodity multicore CPU. Numerical results are provided to evaluate the efficiency of the parallel scheme on large images in terms of computational cost and its effectiveness with respect to the behavior on tile junctions.
international geoscience and remote sensing symposium | 2016
Massimo Zanetti; Riccardo Zanella; Lorenzo Bruzzone
Typical tiling approaches to the segmentation of large images perform separate runs of a specific segmentation algorithm on the tiles and then merge the results. However, specific post-processing is often required to remove possible artifacts at tile junctions. In this paper, we aim to show that a simple tiling strategy with partially overlapping tiles can be applied to a second-order variational segmentation method based on the minimization of the Blake-Zisserman functional, in such a way that tile boundaries are coherent without any need for specific post-processing. Moreover, the energy minimization is performed on each tile with Dirichlet boundary conditions; thus, the tiles are independent and the whole procedure is parallelizable.
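The tiling itself can be sketched in a few lines. The tile size and overlap below are hypothetical, and the per-tile Blake-Zisserman minimization is replaced by an identity placeholder; the point is only the index bookkeeping that makes every pixel belong to at least one tile.

```python
import numpy as np

def make_tiles(h, w, tile, overlap):
    """Split an h x w domain into tiles of side `tile` whose
    interiors advance by (tile - overlap), so adjacent tiles
    share `overlap` rows/columns."""
    tiles, step = [], tile - overlap
    for i in range(0, h, step):
        for j in range(0, w, step):
            tiles.append((i, min(i + tile, h), j, min(j + tile, w)))
    return tiles

img = np.arange(100 * 80, dtype=float).reshape(100, 80)
out = np.zeros_like(img)
for (r0, r1, c0, c1) in make_tiles(100, 80, tile=32, overlap=8):
    # each tile could be processed by an independent worker;
    # the per-tile segmentation solver would go here
    out[r0:r1, c0:c1] = img[r0:r1, c0:c1]
# every pixel is covered by at least one tile
```

Because each tile is solved with its own boundary data, the loop body carries no dependence between iterations, which is what makes the procedure embarrassingly parallel.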