Yves-Caoimhin Swan
University of Liège
Publication
Featured research published by Yves-Caoimhin Swan.
IEEE Transactions on Information Theory | 2013
Christophe Ley; Yves-Caoimhin Swan
Pinsker's inequality states that the relative entropy between two random variables X and Y dominates the square of the total variation distance between X and Y. In this paper, we introduce generalized Fisher information distances and prove that these also dominate the square of the total variation distance. To this end, we introduce a general discrete Stein operator for which we prove a useful covariance identity. We illustrate our approach with several examples. Whenever competitor inequalities are available in the literature, the constants in ours are at least as good, and, in several cases, better.
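For context (these classical statements are not taken from the paper), the baseline form of Pinsker's inequality and a standard discrete Stein operator look as follows. With total variation distance d_{TV}(P,Q) = \sup_A |P(A) - Q(A)| and relative entropy D_{KL}, Pinsker's inequality reads
\[
d_{TV}(P,Q)^2 \;\le\; \tfrac{1}{2}\, D_{KL}(P \,\|\, Q).
\]
A textbook example of a discrete Stein operator is the Chen-Stein operator for the Poisson(\lambda) law,
\[
(\mathcal{A}f)(x) = \lambda f(x+1) - x f(x), \qquad \mathbb{E}\big[(\mathcal{A}f)(X)\big] = 0 \ \text{for all bounded } f \iff X \sim \mathrm{Poisson}(\lambda),
\]
which illustrates the kind of operator the abstract's general construction refers to; the generalized Fisher information distances themselves are not reproduced here.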
Brazilian Journal of Probability and Statistics | 2016
Christophe Ley; Yves-Caoimhin Swan
Stein operators are (differential/difference) operators which arise within Stein's method for stochastic approximation. We propose a new mechanism for constructing such operators for arbitrary (continuous or discrete) parametric distributions with continuous dependence on the parameter. We provide explicit general expressions for location, scale and skewness families. We also provide a general expression for discrete distributions. We use properties of our operators to provide upper and lower variance bounds (only lower bounds in the discrete case) on functionals h(X) of random variables X following parametric distributions. These bounds are expressed in terms of the first two moments of the derivatives (or differences) of h. We provide general variance bounds for location, scale and skewness families and apply our bounds to specific examples (namely the Gaussian, exponential, gamma and Poisson distributions). The results obtained via our techniques are systematically competitive with, and sometimes improve on, the best bounds available in the literature.
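As a point of comparison (these classical bounds are not the paper's general result), the Gaussian case illustrates the shape of such variance bounds. If X ~ N(\mu, \sigma^2) and h is absolutely continuous with \mathbb{E}[h'(X)^2] < \infty, then
\[
\sigma^2\,\big(\mathbb{E}[h'(X)]\big)^2 \;\le\; \operatorname{Var}[h(X)] \;\le\; \sigma^2\, \mathbb{E}\big[h'(X)^2\big],
\]
the lower bound being the Cacoullos-type inequality and the upper bound Chernoff's inequality; both are expressed, as in the abstract, through the first two moments of h'. A commonly cited discrete counterpart is the Poisson(\lambda) lower bound \operatorname{Var}[h(X)] \ge \lambda\,(\mathbb{E}[\Delta h(X)])^2 with \Delta h(x) = h(x+1) - h(x).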
Bernoulli | 2014
Mitia Duerinckx; Christophe Ley; Yves-Caoimhin Swan
A famous characterization theorem due to C. F. Gauss states that the maximum likelihood estimator (MLE) of the parameter in a location family is the sample mean for all samples of all sample sizes if and only if the family is Gaussian. There exist many extensions of this result in diverse directions, most of them focusing on location and scale families. In this paper we propose a unified treatment of this literature by providing general MLE characterization theorems for one-parameter group families (with particular attention on location and scale parameters). In doing so we provide tools for determining whether or not a given such family is MLE-characterizable, and, in case it is, we define the fundamental concept of minimal necessary sample size at which a given characterization holds. Many of the cornerstone references on this topic are retrieved and discussed in the light of our findings, and several new characterization theorems are provided. Of particular interest is that one part of our work, namely the introduction of so-called equivalence classes for MLE characterizations, is a modernized version of Daniel Bernoulli's viewpoint on maximum likelihood estimation.
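The classical route to Gauss's characterization can be sketched as follows (this is the textbook argument, not necessarily the group-family machinery of the paper). For a location family with density f(x - \theta) and score \psi = -(\log f)', the likelihood equation for a sample x_1, \dots, x_n is
\[
\sum_{i=1}^{n} \psi(x_i - \hat\theta) = 0.
\]
If \hat\theta = \bar{x} for every sample of every size, then \sum_i \psi(t_i) = 0 whenever \sum_i t_i = 0; taking all t_i = 0 gives \psi(0) = 0, samples of size two give \psi(-t) = -\psi(t), and samples of size three give \psi(s + t) = \psi(s) + \psi(t). Under mild regularity this Cauchy equation forces \psi(x) = c x, hence (\log f)'(x) = -c x and f(x) \propto e^{-c x^2 / 2}, i.e. the family is Gaussian.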
Periodica Mathematica Hungarica | 2013
Siegfried Hörmann; Yves-Caoimhin Swan
Let X = {X_n}_{n \ge 1} and Y = {Y_n}_{n \ge 1} be two independent random sequences. We obtain rates of convergence to the normal law of randomly weighted self-normalized sums
\[
\psi_n(X, Y) = \sum_{i=1}^{n} X_i Y_i / V_n, \qquad V_n = \sqrt{Y_1^2 + \cdots + Y_n^2}.
\]
These rates are seen to hold for the convergence of a number of important statistics, such as for instance Student's t-statistic or the empirical correlation coefficient.
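For a single sequence, the link to Student's t-statistic is the standard identity (stated here for context, not quoted from the paper): with S_n = Z_1 + \cdots + Z_n and V_n^2 = Z_1^2 + \cdots + Z_n^2, the t-statistic computed from Z_1, \dots, Z_n satisfies
\[
T_n = \frac{S_n / V_n}{\sqrt{\dfrac{n - (S_n / V_n)^2}{n - 1}}},
\]
a monotone function of the self-normalized sum S_n / V_n, which is why normal approximation rates for self-normalized sums transfer to T_n.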
Electronic Communications in Probability | 2013
Christophe Ley; Yves-Caoimhin Swan
arXiv: Probability | 2014
Christophe Ley; Gesine Reinert; Yves-Caoimhin Swan
Statistica Sinica | 2013
Christophe Ley; Yves-Caoimhin Swan; Baba Thiam; Thomas Verdebout
Journal of Applied Probability | 2009
F. Thomas Bruss; Yves-Caoimhin Swan
Annals of the Institute of Statistical Mathematics | 2017
Christophe Ley; Yves-Caoimhin Swan; Thomas Verdebout
arXiv: Probability | 2011
Christophe Ley; Yves-Caoimhin Swan