
Publications


Featured research published by Igor Vajda.


IEEE Transactions on Information Theory | 1999

Estimation of the information by an adaptive partitioning of the observation space

Georges A. Darbellay; Igor Vajda

We demonstrate that it is possible to approximate the mutual information arbitrarily closely in probability by calculating the relative frequencies on appropriate partitions and achieving conditional independence on the rectangles of which the partitions are made. Empirical results, including a comparison with maximum-likelihood estimators, are presented.
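
The plug-in idea is easy to sketch in Python. The following is a minimal, hypothetical illustration (the helper name mi_partition_estimate and the bins parameter are mine, and a fixed equal-frequency grid stands in for the paper's adaptive refinement, which keeps splitting rectangles until the data look conditionally independent on each one); it compares the partition estimate with the bivariate-Gaussian closed form.

```python
import numpy as np

def mi_partition_estimate(x, y, bins=16):
    """Plug-in mutual information estimate (in nats) from relative
    frequencies on an equal-frequency product partition.

    NOTE: simplified sketch; the Darbellay-Vajda estimator refines
    cells adaptively instead of using a fixed grid.
    """
    # Equal-frequency (quantile) bin edges for each marginal.
    qx = np.quantile(x, np.linspace(0, 1, bins + 1))
    qy = np.quantile(y, np.linspace(0, 1, bins + 1))
    counts, _, _ = np.histogram2d(x, y, bins=[qx, qy])
    pxy = counts / counts.sum()            # joint relative frequencies
    px = pxy.sum(axis=1, keepdims=True)    # marginal of x
    py = pxy.sum(axis=0, keepdims=True)    # marginal of y
    mask = pxy > 0
    return float((pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])).sum())

rng = np.random.default_rng(0)
rho = 0.6
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=100_000)
print("estimate:", mi_partition_estimate(z[:, 0], z[:, 1]))
print("exact   :", -0.5 * np.log(1 - rho**2))   # Gaussian closed form
```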


IEEE Transactions on Information Theory | 2006

On Divergences and Informations in Statistics and Information Theory

Friedrich Liese; Igor Vajda

The paper deals with the f-divergences of Csiszár, generalizing the discrimination information of Kullback, the total variation distance, the Hellinger divergence, and the Pearson divergence. All basic properties of f-divergences, including relations to the decision errors, are proved in a new manner, replacing the classical Jensen inequality by a new generalized Taylor expansion of convex functions. Some new properties are proved too, e.g., relations to statistical sufficiency and deficiency. The generalized Taylor expansion also shows very easily that all f-divergences are average statistical informations (differences between prior and posterior Bayes errors) mutually differing only in the weights imposed on various prior distributions. The statistical information introduced by De Groot and the classical information of Shannon are shown to be extremal cases corresponding to α = 0 and α = 1 in the class of the so-called Arimoto α-informations introduced in this paper for 0 < α < 1 by means of the Arimoto α-entropies. Some new examples of f-divergences are introduced as well, namely, the Shannon divergences and the Arimoto α-divergences leading for α ↑ 1 to the Shannon divergences. Square roots of all these divergences are shown to be metrics satisfying the triangle inequality. The last section introduces statistical tests and estimators based on the minimal f-divergence with the empirical distribution achieved in the families of hypothetic distributions. For the Kullback divergence this leads to the classical likelihood ratio test and estimator.
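
All the named examples follow one textbook recipe, D_f(P||Q) = Σ_x q(x) f(p(x)/q(x)) for a convex generator f. The sketch below (a minimal illustration of that formula, not code from the paper; the helper name f_divergence and the sample distributions are mine) evaluates a few standard generators on two strictly positive discrete distributions.

```python
import numpy as np

# f-divergence D_f(P||Q) = sum_x q(x) * f(p(x)/q(x)), for strictly
# positive discrete distributions p, q (avoids the 0/0 conventions).
def f_divergence(p, q, f):
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(q * f(p / q)))

generators = {
    "Kullback-Leibler":  lambda t: t * np.log(t),        # f(t) = t ln t
    "total variation":   lambda t: 0.5 * np.abs(t - 1),  # f(t) = |t-1|/2
    "squared Hellinger": lambda t: (np.sqrt(t) - 1) ** 2,
    "Pearson chi^2":     lambda t: (t - 1) ** 2,
}

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
for name, f in generators.items():
    print(f"{name:18s} {f_divergence(p, q, f):.6f}")
```

Note that conventions differ by constant factors; f(t) = |t - 1|/2 is used here so that the result equals half the L1 distance.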


IEEE Transactions on Information Theory | 2000

Entropy expressions for multivariate continuous distributions

Georges A. Darbellay; Igor Vajda

Analytical formulas for the entropy and the mutual information of multivariate continuous probability distributions are presented.
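
Two standard closed forms of the kind catalogued there (stated here for reference, not quoted from the paper): the differential entropy of a d-variate Gaussian and the mutual information of a correlated Gaussian pair, both in nats.

```latex
% X ~ N(\mu, \Sigma) on R^d:
h(X) = \tfrac{1}{2} \ln\!\bigl( (2\pi e)^d \det \Sigma \bigr),
\qquad
% bivariate Gaussian with correlation coefficient \rho:
I(X;Y) = -\tfrac{1}{2} \ln\!\bigl( 1 - \rho^2 \bigr).
```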


IEEE Transactions on Information Theory | 1993

Convergence of best φ-entropy estimates

Marc Teboulle; Igor Vajda

Minimization problems involving φ-entropy functionals (a generalization of the Boltzmann-Shannon entropy) are studied over a given set A and a sequence of sets A_n, together with the properties of their optimal solutions x_φ and x_n. Under certain conditions on the objective functional and the sets A and A_n, it is proven that, as n increases to infinity, the optimal solution x_n converges in the L_1 norm to the best φ-entropy estimate x_φ.
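
In symbols (a schematic restatement with notation assumed here, not taken from the paper): for a convex function φ, the φ-entropy functional, the two minimizers, and the convergence statement read

```latex
% phi-entropy functional (Boltzmann-Shannon case: \varphi(s) = s \ln s)
H_\varphi(x) = \int \varphi\bigl(x(t)\bigr)\, d\mu(t),
\qquad
x_\varphi = \arg\min_{x \in A} H_\varphi(x),
\quad
x_n = \arg\min_{x \in A_n} H_\varphi(x),
\qquad
\|x_n - x_\varphi\|_{L_1} \xrightarrow[n \to \infty]{} 0 .
```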


Journal of Statistical Planning and Inference | 2001

Large deviations of divergence measures on partitions

Jan Beirlant; Luc Devroye; László Györfi; Igor Vajda

We discuss Chernoff-type large deviation results for the total variation, the I-divergence errors, and the χ²-divergence errors on partitions. In contrast to the total variation and the I-divergence, the χ²-divergence has an unconventional large deviation rate. Applications to Bahadur efficiencies of goodness-of-fit tests based on these divergence measures for multivariate observations are given.
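
For orientation, the conventional benchmark is the Sanov-type rate (a textbook statement under assumed regularity, not a result quoted from the paper): on a fixed finite partition with cell distribution p and empirical distribution p̂_n from an n-sample,

```latex
\lim_{n \to \infty} \frac{1}{n} \ln \Pr\!\bigl( D(\hat{p}_n \,\|\, p) \ge \varepsilon \bigr) = -\,\varepsilon
\qquad \text{for sufficiently small } \varepsilon > 0 ,
```

and the paper's point is that the χ²-divergence error deviates from this conventional pattern.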


IEEE Transactions on Information Theory | 1998

About the asymptotic accuracy of Barron density estimates

A. Berlinet; Igor Vajda; E.C. van der Meulen

By extending the information-theoretic arguments of previous papers dealing with the Barron-type density estimates, and their consistency in information divergence and chi-square divergence, the problem of consistency in Csiszár's φ-divergence is motivated for general convex functions φ. The problem of consistency in φ-divergence is solved for all φ with φ(0) < ∞ and φ(t) = O(t ln t) as t → ∞. The problem of consistency in the expected φ-divergence is solved for all φ with tφ(1/t) + φ(t) = O(t²) as t → ∞. Various stronger versions of these asymptotic restrictions are considered too. Assumptions about the model needed for the consistency are shown to depend on how strong these restrictions are.
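
For concreteness, two standard generators illustrating these growth conditions (textbook examples, not taken from the paper): the squared Hellinger generator meets both requirements of the φ-divergence result, while the Pearson generator meets the expected-divergence requirement.

```latex
\varphi(t) = (\sqrt{t} - 1)^2 :\quad
\varphi(0) = 1 < \infty, \qquad \varphi(t) = O(t) \subset O(t \ln t);
\\[4pt]
\varphi(t) = (t - 1)^2 :\quad
t\,\varphi(1/t) + \varphi(t) = \frac{(1 - t)^2}{t} + (t - 1)^2 = O(t^2)
\quad (t \to \infty).
```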


IEEE Transactions on Systems, Man, and Cybernetics | 1996

Uncertainty of discrete stochastic systems: general theory and statistical inference

D. Morales; Leandro Pardo; Igor Vajda

Uncertainty is defined in a new manner, as a function of discrete probability distributions satisfying a simple and intuitively appealing weak monotonicity condition. It is shown that every uncertainty is Schur-concave and, conversely, every Schur-concave function of distributions is an uncertainty. General properties of uncertainties are systematically studied. Many characteristics of distributions introduced previously in statistical physics, mathematical statistics, econometrics, and information theory are shown to be particular examples of uncertainties. New examples are introduced, and traditional as well as some new methods for obtaining uncertainties are discussed. The information defined by decrease of uncertainty resulting from an observation is investigated and related to previous concepts of information. Further, statistical inference about uncertainties is investigated, based on independent observations of system states. In particular, asymptotic distributions of maximum likelihood estimates of uncertainties and uncertainty-related functions are derived, and asymptotically α-level Neyman-Pearson tests of hypotheses about these system characteristics are presented.
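
The Schur-concavity characterization is easy to probe numerically. The sketch below (an illustration of the definition only, not code from the paper; helper names are mine) checks on random pairs of distributions that whenever p majorizes q, the Shannon entropy satisfies H(p) ≤ H(q).

```python
import numpy as np

def majorizes(p, q):
    """True if p majorizes q: the sorted-descending partial sums of p
    dominate those of q (both vectors sum to 1)."""
    cp = np.cumsum(np.sort(p)[::-1])
    cq = np.cumsum(np.sort(q)[::-1])
    return bool(np.all(cp >= cq - 1e-12))

def shannon_entropy(p):
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(1)
checked = 0
for _ in range(100_000):
    p, q = rng.dirichlet(np.ones(5)), rng.dirichlet(np.ones(5))
    if majorizes(p, q):
        checked += 1
        # Schur-concavity: the more concentrated distribution (p)
        # never has higher entropy than the more spread-out one (q).
        assert shannon_entropy(p) <= shannon_entropy(q) + 1e-9
print(f"Schur-concavity held on all {checked} majorizing pairs")
```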


IEEE Transactions on Information Theory | 2003

On asymptotic properties of information-theoretic divergences

M. del C. Pardo; Igor Vajda

Mutual asymptotic equivalence is established within three classes of information-theoretic divergences of discrete probability distributions, namely, f-divergences of Csiszár, f-divergences of Bregman, and f-divergences of Burbea-Rao. These equivalences are used to find asymptotic distributions of the corresponding divergence statistics for testing the goodness of fit when the hypothetic distribution is uniform. All results are based on standard expansion techniques and on a new relation between the Bregman and Burbea-Rao divergences formulated in Lemma 2 of the paper.
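
A quick way to see such an asymptotic statement at work (a generic simulation sketch under parameters chosen here, not the paper's lemma): under a uniform null on k cells, the scaled I-divergence statistic 2n·D(p̂‖u) is asymptotically χ² with k − 1 degrees of freedom, just like the Pearson statistic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
k, n, reps = 10, 2_000, 20_000
u = np.full(k, 1.0 / k)                 # uniform null distribution

g2 = np.empty(reps)
for i in range(reps):
    counts = rng.multinomial(n, u)
    p_hat = counts / n
    nz = p_hat > 0
    # likelihood-ratio / I-divergence statistic: 2n * D(p_hat || u)
    g2[i] = 2.0 * n * np.sum(p_hat[nz] * np.log(p_hat[nz] / u[nz]))

# Compare simulated upper quantiles with the chi-square(k-1) limit.
for q in (0.90, 0.95, 0.99):
    print(f"{q:.2f}: simulated {np.quantile(g2, q):7.3f}   "
          f"chi2({k-1}) {stats.chi2.ppf(q, k - 1):7.3f}")
```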


IEEE Transactions on Information Theory | 1993

Statistical information and discrimination

Ferdinand Österreicher; Igor Vajda

In analogy with the definition of Shannon information, M.H. De Groot (1962) defined statistical information as the difference between the prior and posterior risk of a statistical decision problem. Relations are studied between the statistical information and the discrimination functions of information theory known as f-divergences. Using previous results, it is shown that every f-divergence I_f(P,Q) is an average statistical information of a decision problem with dichotomic parameter, 0-1 loss function, and corresponding observation distributions P and Q. The average is taken over a distribution on the parameter's prior probability, and this distribution is uniquely determined by the function f. The main result is that every f-divergence is statistical information in some properly chosen statistical decision problem and, conversely, that every such statistical information is an f-divergence. This provides a new representation of discrimination functions figuring in signal detection, data compression, coding, pattern classification, cluster analysis, etc.
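
Schematically (notation assumed here for illustration; p and q are densities of P and Q with respect to a dominating measure μ): for a dichotomy with prior π on {P, Q} and 0-1 loss, the prior Bayes risk is min(π, 1 − π), the posterior risk is the expected pointwise minimum, and De Groot's statistical information is their difference; the representation then averages this over the prior.

```latex
\Delta_\pi(P,Q) = \min(\pi, 1-\pi)
 - \int \min\bigl(\pi\, p(x),\, (1-\pi)\, q(x)\bigr)\, d\mu(x),
\qquad
I_f(P,Q) = \int_0^1 \Delta_\pi(P,Q)\, d\Gamma_f(\pi),
```

where the measure Γ_f on the priors is the one uniquely determined by f.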


IEEE Transactions on Information Theory | 2012

On Bregman Distances and Divergences of Probability Measures

Wolfgang Stummer; Igor Vajda

This paper introduces scaled Bregman distances of probability distributions which admit nonuniform contributions of observed events. They are introduced in a general form covering not only the distances of discrete and continuous stochastic observations, but also the distances of random processes and signals. It is shown that the scaled Bregman distances extend not only the classical ones studied in the previous literature, but also the information divergence and the related wider class of convex divergences of probability measures. An information-processing theorem is established too, but only in the sense of invariance w.r.t. statistically sufficient transformations and not in the sense of universal monotonicity. Pathological situations where coding can increase the classical Bregman distance are illustrated by a concrete example. In addition to the classical areas of application of the Bregman distances and convex divergences such as recognition, classification, learning, and evaluation of proximity of various features and signals, the paper mentions a new application in 3-D exploratory data analysis. Explicit expressions for the scaled Bregman distances are obtained in general exponential families, with concrete applications in the binomial, Poisson, and Rayleigh families, and in the families of exponential processes such as the Poisson and diffusion processes including the classical examples of the Wiener process and geometric Brownian motion.
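
For contrast with the scaled version, the classical (unscaled) Bregman distance between discrete distributions, B_f(p,q) = Σ_x [f(p_x) − f(q_x) − f′(q_x)(p_x − q_x)], is a one-liner; with f(t) = t ln t it reduces to the information divergence. A minimal sketch from the standard definitions (helper names are mine, not the paper's code):

```python
import numpy as np

def bregman_distance(p, q, f, f_prime):
    """Classical Bregman distance B_f(p, q) for strictly positive
    discrete distributions, with convex generator f and derivative f'."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(f(p) - f(q) - f_prime(q) * (p - q)))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

# f(t) = t ln t  =>  B_f(p, q) = sum_x p_x ln(p_x / q_x) = KL divergence
# (the linear terms cancel because both p and q sum to one).
kl_via_bregman = bregman_distance(p, q, lambda t: t * np.log(t),
                                  lambda t: np.log(t) + 1.0)
print(kl_via_bregman, np.sum(p * np.log(p / q)))  # the two should match
```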

Collaboration


Dive into Igor Vajda's collaborations.

Top Co-Authors

Domingo Morales
Universidad Miguel Hernández de Elche

Leandro Pardo
Complutense University of Madrid

E.C. van der Meulen
Katholieke Universiteit Leuven

László Györfi
Budapest University of Technology and Economics

Tomáš Hobza
Czech Technical University in Prague

M. C. Pardo
Complutense University of Madrid

Wolfgang Stummer
University of Erlangen-Nuremberg