Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Christophe Ley is active.

Publication


Featured research published by Christophe Ley.


Archive | 2017

Modern Directional Statistics

Christophe Ley; Thomas Verdebout

Table of contents:
Introduction: Overview; Directional data sets; Basics and notations; Plan of the book.
Advances in flexible parametric distribution theory: Introduction; Flexible circular distributions; Flexible spherical distributions; Flexible toroidal and cylindrical distributions; Further reading.
Advances in kernel density estimation on directional supports: Introduction; Definitions and main properties; A delicate yet crucial issue: bandwidth choice; Bandwidth selection in the cylindrical setting; Inferential procedures; Further reading.
Computational and graphical methods: Ordering data on the sphere: quantiles and depth functions; Statistical inference under order restrictions on the circle; Computationally fast estimation for high-dimensional FvML distributions; New (high-dimensional) approximations for the concentration parameter; Further reading.
Local asymptotic normality for directional data: Introduction; Local asymptotic normality and optimal testing; LAN for directional data; Further reading.
Recent results for tests of uniformity and symmetry: Introduction; Recent advances concerning the Rayleigh test of uniformity; Sobolev tests of uniformity; Uniformity tests based on random projections; Testing for uniformity with noisy data; Tests of reflective symmetry on the circle; Tests of rotational symmetry on hyperspheres; Testing for spherical location in the vicinity of the uniform distribution; Further reading.
High-dimensional directional statistics: Introduction; Distributions on high-dimensional spheres; Testing uniformity in the high-dimensional case; Location tests in the high-dimensional case; Concentration tests in the high-dimensional case; Principal nested spheres; Further reading.
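
For orientation, the FvML (Fisher-von Mises-Langevin) distribution appearing in the table of contents above has, in one standard parametrization (a reference formula, not quoted from the book), the density on the unit sphere $\mathcal{S}^{p-1}$

\[
f(\mathbf{x}; \boldsymbol{\mu}, \kappa) \;=\; \frac{\kappa^{p/2-1}}{(2\pi)^{p/2}\, I_{p/2-1}(\kappa)} \exp\!\big(\kappa\, \boldsymbol{\mu}^{\top}\mathbf{x}\big), \qquad \mathbf{x} \in \mathcal{S}^{p-1},
\]

where $\boldsymbol{\mu}$ is the modal direction, $\kappa \ge 0$ the concentration parameter, and $I_{\nu}$ the modified Bessel function of the first kind.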


IEEE Transactions on Information Theory | 2013

Local Pinsker Inequalities via Stein's Discrete Density Approach

Christophe Ley; Yves-Caoimhin Swan

Pinsker's inequality states that the relative entropy between two random variables X and Y dominates the square of the total variation distance between X and Y. In this paper, we introduce generalized Fisher information distances and prove that these also dominate the square of the total variation distance. To this end, we introduce a general discrete Stein operator for which we prove a useful covariance identity. We illustrate our approach with several examples. Whenever competitor inequalities are available in the literature, the constants in ours are at least as good, and, in several cases, better.
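
For reference, the classical Pinsker inequality that the paper's Fisher-type distances generalize is usually stated as follows (a reminder rather than a statement from the paper; the constant depends on how total variation is normalized):

\[
d_{\mathrm{TV}}(P, Q)^2 \;\le\; \tfrac{1}{2}\, \mathrm{KL}(P \,\|\, Q),
\]

where $d_{\mathrm{TV}}(P,Q) = \sup_{A} |P(A) - Q(A)|$ and $\mathrm{KL}$ denotes the relative entropy (Kullback-Leibler divergence).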


Bernoulli | 2012

Skew-symmetric distributions and Fisher information: a tale of two densities

Marc Hallin; Christophe Ley

Skew-symmetric densities have recently received much attention in the literature, giving rise to increasingly general families of univariate and multivariate skewed densities. Most of those families, however, suffer from the inferential drawback of a potentially singular Fisher information in the vicinity of symmetry. All existing results indicate that Gaussian densities (possibly after restriction to some linear subspace) play a special and somewhat intriguing role in that context. We dispel that widespread opinion by providing a full characterization, in a general multivariate context, of the information singularity phenomenon, highlighting its relation to a possible link between symmetric kernels and skewing functions, a link that can be interpreted as the mismatch of two densities.
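
The "two densities" of the title are the symmetric kernel and the skewing function; in the univariate case, a skew-symmetric density is commonly written (notation illustrative, not taken from the paper) as

\[
f(x) \;=\; 2\, f_0(x - \theta)\, \Pi(x - \theta), \qquad \Pi(z) + \Pi(-z) = 1,
\]

where $f_0$ is a density symmetric about zero and $\Pi$ is a skewing function; symmetry corresponds to $\Pi \equiv 1/2$, which is where the potential information singularity arises.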


Journal of Multivariate Analysis | 2010

On the singularity of multivariate skew-symmetric models

Christophe Ley; Davy Paindaveine

In recent years, the skew-normal models introduced by Azzalini (1985), and their multivariate generalizations from Azzalini and Dalla Valle (1996), have enjoyed an amazing success, although an important literature has reported that they exhibit, in the vicinity of symmetry, singular Fisher information matrices and stationary points in the profile log-likelihood function for skewness, with the usual unpleasant consequences for inference. It has been shown (DiCiccio and Monti 2004, 2009) that these singularities, in some specific parametric extensions of skew-normal models (such as the classes of skew-exponential or skew-t distributions), appear at skew-normal distributions only. Yet, an important question remains open: in broader semiparametric models of skewed distributions (such as the general skew-symmetric and skew-elliptical ones), which symmetric kernels lead to such singularities? In this paper, we provide an answer to this question. In very general (possibly multivariate) skew-symmetric models (see Ma and Genton 2004), we characterize, for each possible value of the rank of Fisher information matrices, the class of symmetric kernels achieving the corresponding rank. Our results show that, for strictly multivariate skew-symmetric models, not only Gaussian kernels yield singular Fisher information matrices. In contrast, we prove that systematic stationary points in the profile log-likelihood functions are obtained for (multi)normal kernels only. Finally, we also discuss the implications of such singularities on inference. References: A. Azzalini, A class of distributions which includes the normal ones, Scand. J. Statist. 12 (1985) 171-178. A. Azzalini, A. Dalla Valle, The multivariate skew-normal distribution, Biometrika 83 (1996) 715-726. T.J. DiCiccio, A.C. Monti, Inferential aspects of the skew exponential power distribution, J. Amer. Statist. Assoc. 99 (2004) 439-450. T.J. DiCiccio, A.C. Monti, Inferential aspects of the skew t-distribution (2009), manuscript in preparation. Y. Ma, M.G. Genton, Flexible class of skew-symmetric distributions, Scand. J. Statist. 31 (2004) 459-468.
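
For concreteness, the multivariate skew-normal density of Azzalini and Dalla Valle (1996) mentioned above reads, in one common parametrization,

\[
f(\mathbf{x}) \;=\; 2\, \phi_d(\mathbf{x}; \boldsymbol{\Omega})\, \Phi(\boldsymbol{\alpha}^{\top}\mathbf{x}), \qquad \mathbf{x} \in \mathbb{R}^d,
\]

where $\phi_d(\cdot\,; \boldsymbol{\Omega})$ is the centred $d$-variate normal density with scale matrix $\boldsymbol{\Omega}$, $\Phi$ the standard normal cdf, and $\boldsymbol{\alpha}$ the skewness parameter; symmetry corresponds to $\boldsymbol{\alpha} = \mathbf{0}$.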


Probability Surveys | 2017

Stein’s method for comparison of univariate distributions

Christophe Ley; Gesine Reinert; Yvik Swan

We propose a new general version of Stein's method for univariate distributions. In particular we propose a canonical definition of the Stein operator of a probability distribution, which is based on a linear difference or differential-type operator. The resulting Stein identity highlights the unifying theme behind the literature on Stein's method (both for continuous and discrete distributions). Viewing the Stein operator as an operator acting on pairs of functions, we provide an extensive toolkit for distributional comparisons. Several abstract approximation theorems are provided. Our approach is illustrated for comparison of several pairs of distributions: normal vs normal, sums of independent Rademacher vs normal, normal vs Student, and maximum of random variables vs exponential, Fréchet and Gumbel.
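
As an illustration of the kind of operator being canonized, the textbook Stein operator and Stein identity for the standard normal distribution (a classical example, not a construction specific to this paper) are

\[
\mathcal{A}f(x) \;=\; f'(x) - x f(x), \qquad \mathbb{E}\big[\mathcal{A}f(X)\big] = 0 \ \text{for all suitable test functions } f \ \Longleftrightarrow\ X \sim \mathcal{N}(0,1),
\]

and distances between distributions are then bounded by studying solutions of the associated Stein equation.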


Journal of Nonparametric Statistics | 2009

Le Cam optimal tests for symmetry against Ferreira and Steel's general skewed distributions

Christophe Ley; Davy Paindaveine

When testing symmetry of a univariate density, (parametric classes of) densities skewed by means of the general probability transform introduced in Ferreira and Steel [A constructive representation of univariate skewed distributions, J. Amer. Statist. Assoc. 101 (2006), pp. 823–829] are appealing alternatives. This paper first proposes parametric tests of symmetry (about a specified centre) that are locally and asymptotically optimal (in the Le Cam sense) against such alternatives. To improve on these parametric tests, which are valid under well-specified density types only, we turn them into semiparametric tests, either by using a standard studentisation approach or by resorting to the invariance principle. The second approach leads to robust yet efficient signed-rank tests, which include the celebrated sign and Wilcoxon tests as special cases, and turn out to be Le Cam optimal irrespective of the underlying original symmetric density. Optimality, however, is only achieved under well-specified ‘skewing mechanisms’, and we therefore evaluate the overall performances of our tests by deriving their asymptotic relative efficiencies with respect to the classical test of skewness. A Monte-Carlo study confirms the asymptotic results.
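
Roughly, the Ferreira and Steel (2006) construction referred to above skews a symmetric density $f$ with cdf $F$ by means of a density $p$ on $(0,1)$; a sketch of their representation (not quoted from the paper) is

\[
s(y \mid f, p) \;=\; f(y)\, p\big(F(y)\big),
\]

with symmetry about the centre recovered when $p$ is the uniform density on $(0,1)$.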


Bernoulli | 2014

Skew-Symmetric Distributions and Fisher Information: The Double Sin of the Skew-Normal

Marc Hallin; Christophe Ley

Hallin and Ley [Bernoulli 18 (2012) 747–763] investigate and fully characterize the Fisher singularity phenomenon in univariate and multivariate families of skew-symmetric distributions. This paper proposes a refined analysis of the (univariate) problem, showing that singularity can be more or less severe, inducing n^{1/4} ("simple singularity"), n^{1/6} ("double singularity"), or n^{1/8} ("triple singularity") consistency rates for the skewness parameter. We show, however, that simple singularity (yielding n^{1/4} consistency rates), if any singularity at all, is the rule, in the sense that double and triple singularities are possible for generalized skew-normal families only. We also show that higher-order singularities, leading to worse-than-n^{1/8} rates, cannot occur. Depending on the degree of the singularity, our analysis also suggests a simple reparametrization that offers an alternative to the so-called centred parametrization proposed, in the particular case of skew-normal and skew-t families, by Azzalini [Scand. J. Stat. 12 (1985) 171–178], Arellano-Valle and Azzalini [J. Multivariate Anal. 113 (2013) 73–90], and DiCiccio and Monti [Quaderni di Statistica 13 (2011) 1–21], respectively.
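
A quick way to see the simple singularity in the univariate skew-normal family $f(x; \theta, \alpha) = 2\,\phi(x-\theta)\,\Phi(\alpha(x-\theta))$ (a standard computation, shown here only for orientation): at $\alpha = 0$ the location and skewness scores are collinear,

\[
\partial_\theta \log f\big|_{\alpha = 0} = x - \theta, \qquad \partial_\alpha \log f\big|_{\alpha = 0} = \sqrt{2/\pi}\,(x - \theta),
\]

so the Fisher information matrix for $(\theta, \alpha)$ is singular at symmetry, which is what slows the consistency rate below the usual root-$n$.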


Metron-International Journal of Statistics | 2010

On Fisher information matrices and profile log-likelihood functions in generalized skew-elliptical models

Christophe Ley; Davy Paindaveine

In recent years, the skew-normal models introduced in Azzalini (1985) have enjoyed an amazing success, although an important literature has reported that they exhibit, in the vicinity of symmetry, singular Fisher information matrices and stationary points in the profile log-likelihood function for skewness, with the usual unpleasant consequences for inference. For general multivariate skew-symmetric and skew-elliptical models, the open problem of determining which symmetric kernels lead to each such singularity has been solved in Ley and Paindaveine (2010). In the present paper, we provide a simple proof that, in generalized skew-elliptical models involving the same skewing scheme as in the skew-normal distributions, Fisher information matrices, in the vicinity of symmetry, are singular for Gaussian kernels only. Then we show that if the profile log-likelihood function for skewness always has a point of inflection in the vicinity of symmetry, the generalized skew-elliptical distribution considered is actually skew-(multi)normal. In addition, we show that the class of multivariate skew-t distributions (as defined in Azzalini and Capitanio 2003), which was not covered by Ley and Paindaveine (2010), does not suffer from singular Fisher information matrices in the vicinity of symmetry. Finally, we briefly discuss the implications of our results on inference.


Probability Surveys | 2014

Characterizations of GIG laws: A survey

Angelo Efoévi Koudou; Christophe Ley

Several characterizations of the Generalized Inverse Gaussian (GIG) distribution on the positive real line have been proposed in the literature, especially over the past two decades. These characterization theorems are surveyed, and two new characterizations are established, one based on maximum likelihood estimation and the other on a Stein characterization.
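
For reference, the GIG density on the positive real line, in one common parametrization with parameters $p \in \mathbb{R}$ and $a, b > 0$ (parametrizations vary across the literature), is

\[
f(x) \;=\; \frac{(a/b)^{p/2}}{2\, K_p(\sqrt{ab})}\; x^{p-1} \exp\!\Big(-\tfrac{1}{2}\big(a x + b/x\big)\Big), \qquad x > 0,
\]

where $K_p$ denotes the modified Bessel function of the second kind.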


Brazilian Journal of Probability and Statistics | 2016

Parametric Stein operators and variance bounds

Christophe Ley; Yves-Caoimhin Swan

Stein operators are (differential/difference) operators which arise within the so-called Stein's method for stochastic approximation. We propose a new mechanism for constructing such operators for arbitrary (continuous or discrete) parametric distributions with continuous dependence on the parameter. We provide explicit general expressions for location, scale and skewness families. We also provide a general expression for discrete distributions. We use properties of our operators to provide upper and lower variance bounds (only lower bounds in the discrete case) on functionals h(X) of random variables X following parametric distributions. These bounds are expressed in terms of the first two moments of the derivatives (or differences) of h. We provide general variance bounds for location, scale and skewness families and apply our bounds to specific examples (namely the Gaussian, exponential, gamma and Poisson distributions). The results obtained via our techniques are systematically competitive with, and sometimes improve on, the best bounds available in the literature.
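
Bounds of this flavour are classical in the Gaussian case: for $X \sim \mathcal{N}(0,1)$ and a sufficiently smooth $h$, the Chernoff upper bound and the Cacoullos lower bound read

\[
\big(\mathbb{E}[h'(X)]\big)^2 \;\le\; \mathrm{Var}\big[h(X)\big] \;\le\; \mathbb{E}\big[h'(X)^2\big];
\]

this display is included only as a familiar reference point, the paper's operators producing analogues for general parametric families.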

Collaboration


Dive into Christophe Ley's collaborations.

Top Co-Authors

Davy Paindaveine

Université libre de Bruxelles

Yvik Swan

University of Luxembourg

Yves Dominicy

Université libre de Bruxelles

Marc Hallin

Université libre de Bruxelles

Anouk Neven

University of Luxembourg
