
Publication


Featured research published by Leandro Pardo.


Archive | 2005

Statistical inference based on divergence measures

Leandro Pardo

Contents:

1. Divergence Measures: Definition and Properties (Introduction; Phi-divergence Measures between Two Probability Distributions: Definition and Properties; Other Divergence Measures between Two Probability Distributions; Divergence among k Populations; Phi-disparities; Exercises; Answers to Exercises)
2. Entropy as a Measure of Diversity: Sampling Distributions (Introduction; Phi-entropies: Asymptotic Distribution; Testing and Confidence Intervals for Phi-entropies; Multinomial Populations: Asymptotic Distributions; Maximum Entropy Principle and Statistical Inference on Condensed Ordered Data; Exercises; Answers to Exercises)
3. Goodness-of-Fit: Simple Null Hypothesis (Introduction; Phi-divergences and Goodness-of-fit with Fixed Number of Classes; Phi-divergence Test Statistics under Sparseness Assumptions; Nonstandard Problems: Test Statistics Based on Phi-divergences; Exercises; Answers to Exercises)
4. Optimality of Phi-divergence Test Statistics in Goodness-of-Fit (Introduction; Asymptotic Efficiency; Exact and Asymptotic Moments: Comparison; A Second Order Approximation to the Exact Distribution; Exact Powers Based on Exact Critical Regions; Small Sample Comparisons for the Phi-divergence Test Statistics; Exercises; Answers to Exercises)
5. Minimum Phi-divergence Estimators (Introduction; Maximum Likelihood and Minimum Phi-divergence Estimators; Properties of the Minimum Phi-divergence Estimator; Normal Mixtures: Minimum Phi-divergence Estimator; Minimum Phi-divergence Estimator with Constraints: Properties; Exercises; Answers to Exercises)
6. Goodness-of-Fit: Composite Null Hypothesis (Introduction; Asymptotic Distribution with Fixed Number of Classes; Nonstandard Problems: Test Statistics Based on Phi-divergences; Exercises; Answers to Exercises)
7. Testing Loglinear Models Using Phi-divergence Test Statistics (Introduction; Loglinear Models: Definition; Asymptotic Results for Minimum Phi-divergence Estimators in Loglinear Models; Testing in Loglinear Models; Simulation Study; Exercises; Answers to Exercises)
8. Phi-divergence Measures in Contingency Tables (Introduction; Independence; Symmetry; Marginal Homogeneity; Quasi-symmetry; Homogeneity; Exercises; Answers to Exercises)
9. Testing in General Populations (Introduction; Simple Null Hypotheses: Wald, Rao, Wilks and Phi-divergence Test Statistics; Composite Null Hypothesis; Multi-sample Problem; Some Topics in Multivariate Analysis; Exercises; Answers to Exercises)

References; Index
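The φ-divergence that organizes the book is straightforward to compute for discrete distributions. A minimal Python sketch (the helper names are illustrative, not from the book), showing how the Kullback-Leibler divergence and the squared Hellinger distance arise from particular choices of φ:

```python
import math

def phi_divergence(p, q, phi):
    """Csiszar phi-divergence D_phi(P, Q) = sum_i q_i * phi(p_i / q_i)
    for discrete distributions with q_i > 0; phi convex with phi(1) = 0."""
    return sum(qi * phi(pi / qi) for pi, qi in zip(p, q))

# phi(t) = t*log(t) gives the Kullback-Leibler divergence.
kl = lambda t: t * math.log(t) if t > 0 else 0.0

# phi(t) = (sqrt(t) - 1)^2 gives twice the squared Hellinger distance.
hellinger = lambda t: (math.sqrt(t) - 1.0) ** 2

p = [0.2, 0.3, 0.5]
q = [0.25, 0.25, 0.5]

print(phi_divergence(p, q, kl))   # positive: P differs from Q
print(phi_divergence(p, p, kl))   # 0.0: the divergence of P from itself
```

Any convex φ with φ(1) = 0 yields a nonnegative divergence that vanishes only when P = Q, which is what makes the family usable as a pool of test statistics.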


Journal of Statistical Planning and Inference | 1995

Asymptotic divergence of estimates of discrete distributions

D. Morales; Leandro Pardo; I. Vajda

The φ-divergence D_φ(P, Q) of two estimates P and Q of a discrete distribution P_θ is shown to play an important role in mathematical statistics and information theory. If both P and Q are based on the same sample, special attention is paid to nonparametric estimates P which are √n-consistent and parametric estimates Q = P_θ̂ defined by means of minimum φ*-divergence point estimates θ̂, where φ* need not be the same as φ. Under standard regularity conditions these point estimates are shown to be efficient, and the corresponding distribution estimates Q to be √n-consistent. But the asymptotics of D_φ(P, Q) is evaluated for arbitrary c_n-consistent distribution estimates P and Q with c_n → ∞. If the two distribution estimates are based on different samples, then the asymptotics of D_φ(P, Q) is evaluated only for the above-mentioned special P and Q.


Communications in Statistics-theory and Methods | 1993

Asymptotic distribution of (h, φ)-entropies

Miquel Salicrú; M.L. Menéndez; D. Morales; Leandro Pardo

In this paper, (h, φ)-entropies are presented as a generalization of the φ-entropies, the Havrda-Charvát entropies and the Rényi entropy, among others. For this functional, the asymptotic distribution is obtained under simple random sampling and under stratified sampling with proportional allocation.
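The (h, φ)-entropy functional has the form h(Σ_i φ(p_i)). A minimal sketch (hypothetical helper names) showing how the Shannon and Rényi entropies arise from particular choices of h and φ:

```python
import math

def h_phi_entropy(p, h, phi):
    """(h, phi)-entropy: h applied to the sum of phi over the probabilities."""
    return h(sum(phi(pi) for pi in p))

# Shannon entropy: phi(t) = -t*log(t), h the identity.
shannon = lambda p: h_phi_entropy(p, lambda x: x,
                                  lambda t: -t * math.log(t) if t > 0 else 0.0)

# Renyi entropy of order r != 1: phi(t) = t**r, h(x) = log(x) / (1 - r).
def renyi(p, r):
    return h_phi_entropy(p, lambda x: math.log(x) / (1.0 - r), lambda t: t ** r)

uniform = [0.25] * 4
print(shannon(uniform))   # log(4): maximal for a 4-point distribution
print(renyi(uniform, 2))  # also log(4): Renyi entropies agree on the uniform
```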


IEEE Transactions on Systems, Man, and Cybernetics | 1996

Uncertainty of discrete stochastic systems: general theory and statistical inference

D. Morales; Leandro Pardo; Igor Vajda

Uncertainty is defined in a new manner, as a function of discrete probability distributions satisfying a simple and intuitively appealing weak monotonicity condition. It is shown that every uncertainty is Schur-concave and conversely, every Schur-concave function of distributions is an uncertainty. General properties of uncertainties are systematically studied. Many characteristics of distributions introduced previously in statistical physics, mathematical statistics, econometrics and information theory are shown to be particular examples of uncertainties. New examples are introduced, and traditional as well as some new methods for obtaining uncertainties are discussed. The information defined by decrease of uncertainty resulting from an observation is investigated and related to previous concepts of information. Further, statistical inference about uncertainties is investigated, based on independent observations of system states. In particular, asymptotic distributions of maximum likelihood estimates of uncertainties and uncertainty-related functions are derived, and asymptotically α-level Neyman-Pearson tests of hypotheses about these system characteristics are presented.
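The Schur-concavity property can be checked numerically for Shannon entropy, one classical example of an uncertainty in this sense (a small illustrative sketch, not the paper's notation):

```python
import math

def shannon_entropy(p):
    """Shannon entropy, a classical example of an uncertainty measure."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# p majorizes q (its sorted partial sums dominate q's), i.e. p is the
# more concentrated distribution, so a Schur-concave uncertainty must
# assign p no more uncertainty than q.
p = [0.7, 0.2, 0.1]   # partial sums 0.7, 0.9, 1.0
q = [0.5, 0.3, 0.2]   # partial sums 0.5, 0.8, 1.0

print(shannon_entropy(p) <= shannon_entropy(q))  # True
```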


Journal of Statistical Planning and Inference | 1994

Asymptotic properties of divergence statistics in a stratified random sampling and its applications to test statistical hypotheses

D. Morales; Leandro Pardo; Miquel Salicrú; M.L. Menéndez

In order to study the asymptotic properties of divergence statistics, we propose a unified expression, called the (h, φ)-divergence, which includes most divergences as particular cases. Under different assumptions, it is shown that the asymptotic distributions of the (h, φ)-divergence statistics, in a stratified random sampling setup, are either normal or a linear combination of chi-square variables, depending on whether or not suitable conditions are satisfied. The chi-square and likelihood ratio test statistics are particular cases of the (h, φ)-divergence test statistics considered. From the previous results, asymptotic distributions of entropy statistics are also derived. The problem of optimum allocation is studied and the relative precision of stratified and simple random sampling is analyzed. Applications to testing statistical hypotheses in multinomial populations are given. Finally, appendices with the asymptotic variances of many well-known divergence statistics are presented.
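One concrete subfamily of such divergence statistics is the Cressie-Read power-divergence family, which contains both the Pearson chi-square statistic (λ = 1) and the likelihood ratio statistic G² (λ → 0). A sketch for the simple-random-sampling multinomial case (illustrative code, not from the paper):

```python
import math

def power_divergence_stat(observed, expected, lam):
    """Cressie-Read power-divergence statistic for multinomial counts.
    lam = 1 gives Pearson's chi-square; lam -> 0 gives the likelihood
    ratio statistic G^2 (taken here as the limiting case)."""
    if abs(lam) < 1e-12:
        return 2.0 * sum(o * math.log(o / e)
                         for o, e in zip(observed, expected) if o > 0)
    return (2.0 / (lam * (lam + 1.0))) * sum(
        o * ((o / e) ** lam - 1.0) for o, e in zip(observed, expected))

observed = [18, 55, 27]          # multinomial counts, n = 100
expected = [25.0, 50.0, 25.0]    # n * p_i under the null hypothesis

pearson = power_divergence_stat(observed, expected, 1.0)
g2 = power_divergence_stat(observed, expected, 0.0)
print(pearson)  # identical to sum of (O - E)^2 / E
print(g2)
```

Under the null hypothesis both statistics are asymptotically chi-square with (number of cells − 1) degrees of freedom, which is the fixed-classes special case of the asymptotics studied in the paper.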


Kybernetes | 1997

Asymptotic approximations for the distributions of the (h, φ)-divergence goodness-of-fit statistics: application to Rényi's statistic

M.L. Menéndez; J.A. Pardo; Leandro Pardo; M. C. Pardo

Read (1984) presented an asymptotic expansion for the distribution function of the power-divergence statistics whose speed of convergence depends on the parameter of the family. This paper generalizes that result by considering the family of (h, φ)-divergence measures, considers two other closer approximations to the exact distribution, and compares the three approximations for Rényi's statistic in small samples.


Statistical Papers | 1995

Asymptotic behaviour and statistical applications of divergence measures in multinomial populations: a unified study

M.L. Menéndez; D. Morales; Leandro Pardo; Miquel Salicrú

Divergence measures play an important role in statistical theory, especially in large sample theories of estimation and testing. The underlying reason is that they are indices of statistical distance between probability distributions P and Q; the smaller these indices are, the harder it is to discriminate between P and Q. Many divergence measures have been proposed since the publication of the paper of Kullback and Leibler (1951). Rényi (1961) gave the first generalization of the Kullback-Leibler divergence, Jeffreys (1946) defined the J-divergences, Burbea and Rao (1982) introduced the R-divergences, Sharma and Mittal (1977) the (r,s)-divergences, Csiszár (1967) the ϕ-divergences, Taneja (1989) the generalized J-divergences and the generalized R-divergences, and so on. In order to do a unified study of their statistical properties, here we propose a generalized divergence, called the (h,ϕ)-divergence, which includes as particular cases the above-mentioned divergence measures. Under different assumptions, it is shown that the asymptotic distributions of the (h,ϕ)-divergence statistics are either normal or chi square. The chi square and the likelihood ratio test statistics are particular cases of the (h,ϕ)-divergence test statistics considered. From the previous results, asymptotic distributions of entropy statistics are derived too. Applications to testing statistical hypotheses in multinomial populations are given. The Pitman and Bahadur efficiencies of tests of goodness of fit and independence based on these statistics are obtained. Finally, appendices with the asymptotic variances of many well-known divergence and entropy statistics are presented.
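The (h, ϕ)-divergence has the form h(D_ϕ(P, Q)), an increasing function h applied to a ϕ-divergence. A brief sketch (hypothetical helper names) of how the Kullback-Leibler and Rényi divergences fit this pattern:

```python
import math

def h_phi_divergence(p, q, h, phi):
    """(h, phi)-divergence: h applied to sum_i q_i * phi(p_i / q_i)."""
    return h(sum(qi * phi(pi / qi) for pi, qi in zip(p, q)))

# Kullback-Leibler divergence: phi(t) = t*log(t), h the identity.
def kl_div(p, q):
    return h_phi_divergence(p, q, lambda x: x,
                            lambda t: t * math.log(t) if t > 0 else 0.0)

# Renyi divergence of order r != 1: phi(t) = t**r, h(x) = log(x) / (r - 1).
def renyi_div(p, q, r):
    return h_phi_divergence(p, q, lambda x: math.log(x) / (r - 1.0),
                            lambda t: t ** r)

p, q = [0.4, 0.6], [0.5, 0.5]
print(kl_div(p, q))          # positive: P differs from Q
print(renyi_div(p, q, 2.0))  # positive as well
print(renyi_div(p, p, 2.0))  # 0.0: divergence of P from itself
```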


Communications in Statistics-theory and Methods | 1998

Two approaches to grouping of data and related disparity statistics

M.L. Menéndez; D. Morales; Leandro Pardo; I. Vajda

Csiszár's φ-divergences of discrete distributions are extended to a more general class of disparity measures by restricting the convexity of the functions φ(t), t > 0, to local convexity at t = 1 and monotonicity on the intervals (0, 1) and (1, ∞). Goodness-of-fit estimation and testing procedures based on the φ-disparity statistics are introduced. Robustness of the estimation procedure is discussed, and the asymptotic distributions for the testing procedure are established in statistical models with data grouped according to their values or orders.


Statistics | 2004

Rényi statistics for testing composite hypotheses in general exponential models

Domingo Morales; Leandro Pardo; M. C. Pardo; Igor Vajda

We introduce a family of Rényi statistics of orders r ∈ R for testing composite hypotheses in general exponential models, as alternatives to the previously considered generalized likelihood ratio (GLR) statistic and generalized Wald statistic. If appropriately normalized exponential models converge in a specific sense when the sample size (observation window) tends to infinity, and if the hypothesis is regular, then these statistics are shown to be χ2-distributed under the hypothesis. The corresponding Rényi tests are shown to be consistent. The exact sizes and powers of asymptotically α-size Rényi, GLR and generalized Wald tests are evaluated for a concrete hypothesis about a bivariate Lévy process and moderate observation windows. In this concrete situation the exact sizes of the Rényi test of the order r = 2 practically coincide with those of the GLR and generalized Wald tests but the exact powers of the Rényi test are on average somewhat better.


Statistics | 2000

Rényi Statistics in Directed Families of Exponential Experiments

Domingo Morales; Leandro Pardo; I. Vajda

Rényi statistics are considered in a directed family of general exponential models. These statistics are defined as Rényi distances between the estimated and the hypothetical model. An asymptotically quadratic approximation to the Rényi statistics is established, leading to asymptotic distribution results similar to those established in the literature for likelihood ratio statistics. Some arguments in favour of the Rényi statistics are discussed, and a numerical comparison of the Rényi goodness-of-fit tests with the likelihood ratio test is presented.

Collaboration


Dive into Leandro Pardo's collaborations.

Top Co-Authors

M.L. Menéndez (Technical University of Madrid)
Nirian Martín (Complutense University of Madrid)
D. Morales (Complutense University of Madrid)
Domingo Morales (Universidad Miguel Hernández de Elche)
J.A. Pardo (Complutense University of Madrid)
M. C. Pardo (Complutense University of Madrid)
Ayanendranath Basu (Indian Statistical Institute)
K. Zografos (University of Ioannina)
Abhijit Mandal (Indian Statistical Institute)