Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Ion Grama is active.

Publication


Featured research published by Ion Grama.


Stochastic Processes and their Applications | 2013

Cramér large deviation expansions for martingales under Bernstein's condition

Xiequan Fan; Ion Grama; Quansheng Liu

An expansion of large deviation probabilities for martingales is given, which extends the classical result due to Cramér to the case of martingale differences satisfying the conditional Bernstein condition. The upper bound on the range of validity and the remainder term of our expansion are the same as in Cramér's result and are therefore optimal. Our result implies a moderate deviation principle for martingales.
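As a purely illustrative numerical companion (not from the paper): for a martingale with bounded increments, here i.i.d. Rademacher steps, the tail probability P(S_n / sqrt(n) > x) should be close to the standard normal tail for moderate x, which is the regime the Cramér-type expansion describes. All names and parameter values below are illustrative.

```python
import math
import random

def normal_tail(x: float) -> float:
    """Standard normal tail probability 1 - Phi(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def empirical_tail(n: int, x: float, trials: int, seed: int = 0) -> float:
    """Monte Carlo estimate of P(S_n / sqrt(n) > x) for Rademacher steps."""
    rng = random.Random(seed)
    threshold = x * math.sqrt(n)
    hits = 0
    for _ in range(trials):
        # Martingale with bounded differences: a simple +/-1 random walk.
        s = sum(1 if rng.random() < 0.5 else -1 for _ in range(n))
        if s > threshold:
            hits += 1
    return hits / trials

emp = empirical_tail(n=400, x=1.0, trials=5000)
gauss = normal_tail(1.0)  # roughly 0.159
```

With a fixed seed the two quantities agree to within a couple of percentage points, consistent with the Gaussian approximation being accurate in the moderate deviation range.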


Journal of Scientific Computing | 2014

A New Poisson Noise Filter Based on Weights Optimization

Qiyu Jin; Ion Grama; Quansheng Liu

We propose a new image denoising algorithm when the data is contaminated by a Poisson noise. As in the Non-Local Means filter, the proposed algorithm is based on a weighted linear combination of the observed image. But in contrast to the latter where the weights are defined by a Gaussian kernel, we propose to choose them in an optimal way. First some “oracle” weights are defined by minimizing a very tight upper bound of the Mean Square Error. For a practical application the weights are estimated from the observed image. We prove that the proposed filter converges at the usual optimal rate to the true image. Simulation results are presented to compare the performance of the presented filter with conventional filtering methods.
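A minimal, hypothetical sketch of the idea behind this line of work: denoise by a weighted linear combination of the noisy observations, as in Non-Local Means. The paper chooses the weights by minimizing a tight upper bound of the Mean Square Error; here, as a stand-in, simple similarity-based weights are used on a 1D piecewise-constant signal. Function names and parameter values are illustrative, not from the paper.

```python
import math
import random

def sample_poisson(rng: random.Random, lam: float) -> int:
    """Knuth's method for sampling a Poisson variate with small mean."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def weighted_denoise(noisy, radius=5, h=10.0):
    """Weighted average over a window; weights decay with dissimilarity."""
    out = []
    for i in range(len(noisy)):
        num = den = 0.0
        for j in range(max(0, i - radius), min(len(noisy), i + radius + 1)):
            w = math.exp(-((noisy[i] - noisy[j]) ** 2) / (h * h))
            num += w * noisy[j]
            den += w
        out.append(num / den)
    return out

rng = random.Random(1)
clean = [20.0] * 50 + [60.0] * 50          # piecewise-constant "image row"
noisy = [float(sample_poisson(rng, v)) for v in clean]
denoised = weighted_denoise(noisy)

mse_noisy = sum((a - b) ** 2 for a, b in zip(noisy, clean)) / len(clean)
mse_denoised = sum((a - b) ** 2 for a, b in zip(denoised, clean)) / len(clean)
```

Because the similarity weights nearly vanish across the jump at the segment boundary, the averaging reduces the Poisson noise while largely preserving the edge, so the MSE drops.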


SIAM Journal on Imaging Sciences | 2017

Nonlocal Means and Optimal Weights for Noise Removal

Qiyu Jin; Ion Grama; Charles Kervrann; Quansheng Liu

In this paper, a new denoising algorithm to deal with the additive white Gaussian noise model is described. Following the nonlocal (NL) means approach, we propose an adaptive estimator based on the weighted average of observations taken in a neighborhood, with weights depending on the similarity of local patches. The idea is to compute adaptive weights that best minimize an upper bound of the pointwise L_2 risk. In the framework of adaptive estimation, we show that the "oracle" weights are optimal if we consider triangular kernels instead of the commonly used Gaussian kernel. Furthermore, we propose a way to automatically choose the spatially varying smoothing parameter for adaptive denoising. Under conventional minimal regularity conditions, the obtained estimator converges at the usual optimal rate. The implementation of the proposed algorithm is also straightforward, and the simulations show that our algorithm significantly improves the classical NL means and is competitive when compared to the more so…
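This optimal-weights line of work contrasts the commonly used Gaussian kernel with a triangular kernel. A minimal sketch of the two (function names and the bandwidth value are illustrative): the Gaussian kernel assigns every neighbor a strictly positive weight, while the triangular kernel cuts off exactly at the bandwidth.

```python
import math

def gaussian_weight(d: float, h: float) -> float:
    """Gaussian kernel: positive for every distance d."""
    return math.exp(-(d * d) / (2.0 * h * h))

def triangular_weight(d: float, h: float) -> float:
    """Triangular kernel: linear decay, exactly zero beyond bandwidth h."""
    return max(0.0, 1.0 - abs(d) / h)

h = 2.0
gauss_far = gaussian_weight(5.0, h)   # tiny but strictly positive
tri_far = triangular_weight(5.0, h)   # exactly zero beyond the bandwidth
tri_near = triangular_weight(1.0, h)  # 0.5
```

The hard cutoff is what makes the triangular kernel attractive for oracle-weight arguments: distant, dissimilar observations contribute exactly nothing rather than a small residual amount.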


Annales de l'Institut Henri Poincaré, Probabilités et Statistiques | 2002

A Functional Hungarian Construction for Sums of Independent Random Variables

Ion Grama; Michael Nussbaum

We develop a Hungarian construction for the partial sum process of independent non-identically distributed random variables. The process is indexed by functions f from a class H, but the supremum over f ∈ H is taken outside the probability. This form is a prerequisite for the Komlós–Major–Tusnády inequality in the space of bounded functionals l∞(H), but contrary to the latter it essentially preserves the classical n^{-1/2} log n approximation rate over large functional classes H such as the Hölder ball of smoothness 1/2. This specific form of strong approximation is useful for proving asymptotic equivalence of statistical experiments.


PLOS ONE | 2017

Optimal Weights Mixed Filter for Removing Mixture of Gaussian and Impulse Noises

Qiyu Jin; Ion Grama; Quansheng Liu

In this paper we consider the problem of restoration of an image contaminated by a mixture of Gaussian and impulse noises. We propose a new statistic called ROADGI, which improves the well-known Rank-Ordered Absolute Differences (ROAD) statistic for detecting points contaminated with impulse noise in this context. Combining the ROADGI statistic with the method of weights optimization, we obtain a new algorithm called the Optimal Weights Mixed Filter (OWMF) to deal with the mixed noise. Our simulation results show that the proposed filter is effective for mixed noises, as well as for single impulse noise and for single Gaussian noise.


International Conference on Neural Information Processing | 2014

A New Method for Removing Random-Valued Impulse Noise

Qiyu Jin; Li Bai; Jie Yang; Ion Grama; Quansheng Liu

A new algorithm for removing random-valued impulse noise is proposed. We use a standardized version of the Rank-Ordered Absolute Differences statistic of Garnett et al. [1] to attribute weights to noisy pixels. These weights are then incorporated into the Optimal Weights Filter approach from [2,3] to construct a new filter. Simulation results show that our method performs significantly better than a number of existing techniques.
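A small, hypothetical implementation of the Rank-Ordered Absolute Differences (ROAD) statistic of Garnett et al., which underlies the impulse-noise detection in "A New Method for Removing Random-Valued Impulse Noise": for each pixel, take the absolute differences to its 8 neighbors, sort them, and sum the m smallest (m = 4 is the usual choice). An impulse-corrupted pixel differs from most of its neighbors, so its ROAD value is large.

```python
def road(img, i, j, m=4):
    """ROAD statistic at pixel (i, j): sum of the m smallest absolute
    differences to the 8 neighbors. Assumes (i, j) is an interior pixel."""
    diffs = []
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == 0 and dj == 0:
                continue
            diffs.append(abs(img[i][j] - img[i + di][j + dj]))
    return sum(sorted(diffs)[:m])

flat = [[10, 10, 10],
        [10, 10, 10],
        [10, 10, 10]]
impulse = [[10, 10, 10],
           [10, 255, 10],
           [10, 10, 10]]

print(road(flat, 1, 1))     # 0: pixel agrees with all neighbors
print(road(impulse, 1, 1))  # 980: impulse pixel far from every neighbor
```

Thresholding (or, as in the paper, standardizing and weighting by) this statistic separates impulse-corrupted pixels from clean ones even in textured regions, because only the smallest m differences count.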


Statistics | 2017

Nonuniform Berry-Esseen bounds for martingales with applications to statistical estimation

Xiequan Fan; Ion Grama; Quansheng Liu

We establish non-uniform Berry–Esseen bounds for martingales under the conditional Bernstein condition. These bounds imply Cramér-type large deviations for moderate x and achieve the same exponential decay rate as de la Peña's inequality. Statistical applications associated with linear regressions and self-normalized large deviations are also provided.
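A hypothetical numerical illustration of the classical (uniform) Berry–Esseen phenomenon that "Nonuniform Berry-Esseen bounds for martingales" refines: for the standardized sum of n Rademacher variables, the Kolmogorov distance to the standard normal CDF decays on the order of 1/sqrt(n). Exact binomial probabilities are used to avoid Monte Carlo error; the sample size is illustrative.

```python
import math

def normal_cdf(x: float) -> float:
    """Standard normal CDF via the complementary error function."""
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

def kolmogorov_distance(n: int) -> float:
    """Max CDF gap (over the atoms) between the standardized sum of n
    Rademacher variables, S_n = 2B - n with B ~ Binomial(n, 1/2), and N(0,1)."""
    total = 2.0 ** n
    cum = 0.0
    worst = 0.0
    for k in range(n + 1):
        cum += math.comb(n, k) / total
        x = (2 * k - n) / math.sqrt(n)
        worst = max(worst, abs(cum - normal_cdf(x)))
    return worst

d = kolmogorov_distance(400)  # on the order of 1/sqrt(2 * pi * 400), i.e. ~0.02
```

The non-uniform bounds of the paper sharpen this picture by letting the error bound itself shrink in the tails, rather than holding one constant over all x.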


Statistics | 2017

Martingale inequalities of type Dzhaparidze and van Zanten

Xiequan Fan; Ion Grama; Quansheng Liu

Freedman's inequality is a supermartingale counterpart to Bennett's inequality: the tail probabilities of a supermartingale are controlled by the quadratic characteristic and a uniform upper bound on the supermartingale differences. Replacing the quadratic characteristic by a suitable substitute, Dzhaparidze and van Zanten [On Bernstein-type inequalities for martingales. Stoch Process Appl. 2001;93:109–117] extended Freedman's inequality to martingales with unbounded differences. In this paper we refine their bound and establish two further inequalities of Dzhaparidze and van Zanten type. These results extend Sason's inequality [Tightened exponential bounds for discrete-time conditionally symmetric martingales with bounded jumps. Statist Probab Lett. 2013;83:1928–1936] to martingales with possibly unbounded differences and establish the connection between Sason's inequality and de la Peña's inequality [A general class of exponential inequalities for martingales and ratios. Ann Probab. 1999;27(1):537–564]. An application to self-normalized deviations is given.
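A hypothetical numerical check of Freedman's inequality, the starting point of "Martingale inequalities of type Dzhaparidze and van Zanten": for a martingale with differences bounded by c and quadratic characteristic at most v, P(S_n >= x) <= exp(-x^2 / (2(v + cx))). With +/-1 steps the quadratic characteristic is exactly n, so v = n. All parameter values are illustrative.

```python
import math
import random

def freedman_bound(x: float, v: float, c: float) -> float:
    """Bernstein-form tail bound from Freedman's inequality."""
    return math.exp(-x * x / (2.0 * (v + c * x)))

def empirical_prob(n: int, x: float, trials: int, seed: int = 2) -> float:
    """Monte Carlo estimate of P(S_n >= x) for a +/-1 random walk."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(1 if rng.random() < 0.5 else -1 for _ in range(n))
        if s >= x:
            hits += 1
    return hits / trials

n = 200
x = 30.0                                  # about 2.1 standard deviations
emp = empirical_prob(n, x, trials=4000)
bound = freedman_bound(x, v=float(n), c=1.0)  # roughly 0.14
```

The empirical tail probability sits comfortably below the bound; the Dzhaparidze–van Zanten refinement tightens exactly this kind of estimate when the differences are not uniformly bounded.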


The Imaging Science Journal | 2016

Controlled total variation regularization for image deconvolution

Qiyu Jin; Ion Grama; Quansheng Liu



International Conference on Modern Problems of Stochastic Analysis and Statistics | 2016

Bounds in the Local Limit Theorem for a Random Walk Conditioned to Stay Positive

Ion Grama; Emile Le Page


Collaboration


Dive into Ion Grama's collaborations.

Top Co-Authors

Quansheng Liu, Centre national de la recherche scientifique
Qiyu Jin, Shanghai Jiao Tong University
Emile Le Page, Centre national de la recherche scientifique
Eric Miqueu, Centre national de la recherche scientifique
Jie Yang, Shanghai Jiao Tong University
Li Bai, University of Nottingham
Marc Peigné, François Rabelais University