Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Marian Grendar is active.

Publication


Featured research published by Marian Grendar.


arXiv: Mathematical Physics | 2001

What is the question that MaxEnt answers? A probabilistic interpretation

Marian Grendar

The Boltzmann-Wallis-Jaynes' multiplicity argument is taken up and elaborated. MaxEnt is proved, and demonstrated, to be just an asymptotic case of searching a feasible set for the vector of absolute frequencies that has the maximal probability of being generated by a uniform prior generator/pmf.
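A worked sketch of the limit behind this claim, using only standard Stirling asymptotics and assumed notation (n trials over k outcomes, absolute frequencies n_i, types p_i = n_i/n, uniform prior q):

```latex
% Probability of a frequency vector under i.i.d. sampling from the prior q,
% and its exponential rate (standard asymptotics; notation assumed).
\[
  P(n_1,\dots,n_k) = \frac{n!}{n_1!\cdots n_k!}\prod_{i=1}^{k} q_i^{\,n_i},
  \qquad
  \frac{1}{n}\log P(n_1,\dots,n_k) \;\longrightarrow\;
  -\sum_{i=1}^{k} p_i \log\frac{p_i}{q_i} = -D(p\,\|\,q)
  \quad (n\to\infty).
\]
% Maximizing this probability over a feasible set is therefore, asymptotically,
% minimizing D(p||q); with a uniform q this is exactly entropy maximization.
```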


Electronic Journal of Statistics | 2009

Empty Set Problem of Maximum Empirical Likelihood Methods

Marian Grendar; George G. Judge

In an influential work, Qin and Lawless (1994) proposed a general estimating equations (GEE) formulation for maximum empirical likelihood (MEL) estimation and inference. The formulation replaces a model specified by GEE with a set of data-supported probability mass functions that satisfy empirical estimating equations (E3). In this paper we use several examples from the literature to demonstrate that this set may be empty for some E3 models and finite data samples. As a result, MEL does not exist for such models and data sets. If MEL and other E3-based methods are to be used, then models will have to be checked on a case-by-case basis for the absence or presence of the empty set problem.
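A minimal numerical illustration of the empty set problem, with a hypothetical estimating function and data (not the paper's examples): for g(x, theta) = x - theta on Theta = [0, 1], a sample whose values all exceed 1 admits no probability mass function satisfying the empirical estimating equation.

```python
import numpy as np

# Hypothetical illustration of the empty set problem (not an example from the paper).
# E3 model: find p_i >= 0, sum p_i = 1 with sum_i p_i * g(x_i, theta) = 0 for some
# theta in Theta. A weighted average of g(x_i, theta) can be zero only if 0 lies
# between min_i g(x_i, theta) and max_i g(x_i, theta).

def feasible(x, g, thetas):
    """Return the thetas for which the empirical estimating equation is solvable."""
    ok = []
    for theta in thetas:
        vals = g(x, theta)
        if vals.min() <= 0.0 <= vals.max():   # 0 in the convex hull of {g(x_i, theta)}
            ok.append(theta)
    return ok

g = lambda x, theta: x - theta                # estimating function E[X - theta] = 0
thetas = np.linspace(0.0, 1.0, 101)           # assumed parameter space Theta = [0, 1]

x_empty = np.array([1.3, 2.1, 1.7])           # every observation exceeds sup(Theta)
x_ok    = np.array([0.2, 0.9, 1.4])

print(feasible(x_empty, g, thetas))           # [] -> MEL does not exist for this sample
print(len(feasible(x_ok, g, thetas)) > 0)     # True -> the feasible set is non-empty
```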


Physica A-statistical Mechanics and Its Applications | 2007

An Empirical Non-Parametric Likelihood Family of Data-Based Benford-Like Distributions

Marian Grendar; George G. Judge; Laura Schechter

A mathematical expression known as Benford's law provides an example of an unexpected relationship among randomly selected first significant digits (FSD). Newcomb (1881), and later Benford (1938), conjectured that FSDs would exhibit a weakly monotonic distribution and proposed frequencies proportional to the logarithmic rule. Unfortunately, the Benford FSD function does not hold for a wide range of scale-invariant multiplicative data. To confront this problem we use information-theoretic methods to develop a data-based family of Benford-like exponential distributions that provide null hypotheses for testing purposes. Two data sets are used to illustrate the performance of the generalized Benford-like distributions.
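A minimal sketch of the FSD comparison described above, on synthetic data (the data sets and the fitted Benford-like family from the paper are not reproduced):

```python
import numpy as np

# Compare empirical first-significant-digit (FSD) frequencies with Benford's
# logarithmic rule P(d) = log10(1 + 1/d). The data are synthetic (log-normal),
# used only as a stand-in for the paper's data sets.

def first_significant_digit(x):
    x = np.abs(np.asarray(x, dtype=float))
    x = x[x > 0]
    return (x / 10.0 ** np.floor(np.log10(x))).astype(int)

rng = np.random.default_rng(0)
data = rng.lognormal(mean=0.0, sigma=2.5, size=100_000)

digits = first_significant_digit(data)
empirical = np.array([(digits == d).mean() for d in range(1, 10)])
benford = np.log10(1.0 + 1.0 / np.arange(1, 10))

for d, (e, b) in enumerate(zip(empirical, benford), start=1):
    print(f"digit {d}: empirical {e:.3f}   Benford {b:.3f}")
```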


Annals of Statistics | 2009

Asymptotic Equivalence of Empirical Likelihood and Bayesian MAP

Marian Grendar; George G. Judge

In this paper we are interested in empirical likelihood (EL) as a method of estimation, and we address the following two problems: (1) selecting among various empirical discrepancies in an EL framework and (2) demonstrating that EL has a well-defined probabilistic interpretation that would justify its use in a Bayesian context. Using the large deviations approach, a Bayesian law of large numbers is developed that implies that EL and the Bayesian maximum a posteriori probability (MAP) estimators are consistent under misspecification and that EL can be viewed as an asymptotic form of MAP. Estimators based on other empirical discrepancies are, in general, inconsistent under misspecification.
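For reference, the standard profile empirical likelihood criterion for an estimating-equations model (notation assumed, not taken from the paper):

```latex
% Profile empirical likelihood for an estimating-equations model (standard form).
\[
  \hat\theta_{\mathrm{EL}}
  = \arg\max_{\theta}\,
  \sup\Big\{ \sum_{i=1}^{n} \log p_i :
     p_i \ge 0,\ \sum_{i=1}^{n} p_i = 1,\
     \sum_{i=1}^{n} p_i\, g(X_i,\theta) = 0 \Big\}.
\]
% The Bayesian law of large numbers developed in the paper relates this criterion,
% asymptotically, to the MAP estimator over the same feasible set.
```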


Econometric Reviews | 2008

Large Deviations Theory and Empirical Estimator Choice

Marian Grendar; George G. Judge

In this article, we consider the problem of criterion choice in information recovery and inference in a large-deviations (LD) context. Kitamura and Stutzer recognize that the Maximum Entropy Empirical Likelihood estimator can be given an LD justification (Kitamura and Stutzer, 2002). We demonstrate that a similar LD justification exists for Owen's Empirical Likelihood estimator (Owen, 2001). We tie the two empirical estimators and the related LD theorems to two basic ill-posed inverse problems, α and β. We note that other estimators in this family lack an LD footing, and we provide an extensive discussion of the implications of these results. The appendix contains formal statements of the relevant LD theorems.
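For context, a standard statement of Sanov's theorem, the basic LD result these justifications build on (the article's own theorems are not reproduced here):

```latex
% Sanov's theorem (standard form): for i.i.d. sampling from q, the empirical
% measure \nu_n of n observations satisfies, for a suitably regular set A,
\[
  \frac{1}{n}\log P\big(\nu_n \in A\big) \;\longrightarrow\;
  -\inf_{p \in A} D(p\,\|\,q),
  \qquad
  D(p\,\|\,q) = \sum_i p_i \log\frac{p_i}{q_i}.
\]
% Conditioning on a rare event thus concentrates on the divergence-minimizing
% distribution, which is the route by which LD theory informs estimator choice.
```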


Information Sciences | 2010

The Pólya information divergence

Marian Grendar; Robert K. Niven

Extensions of Sanov's Theorem and the Conditional Limit Theorem (CoLT) are established for a multicolor Pólya-Eggenberger (PE) urn sampling scheme, giving the Pólya information divergence and a Pólya extension to the Maximum Relative Entropy (MaxEnt) method. Pólya MaxEnt includes the standard MaxEnt, as well as its variants used in Bose-Einstein, Fermi-Dirac and intermediate (Acharya-Swamy) statistics, as special cases. In the PE setting, standard MaxEnt is, in general, asymptotically inconsistent.
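A minimal simulation sketch of the multicolor Pólya-Eggenberger urn sampling scheme referred to above (the initial composition and reinforcement parameter are illustrative assumptions):

```python
import numpy as np

# Multicolor Polya-Eggenberger urn: draw a ball, return it together with c extra
# balls of the same color. c = 0 recovers i.i.d. (multinomial) sampling; c > 0
# gives the reinforced scheme. Initial counts and c are illustrative assumptions.

def polya_eggenberger(initial_counts, c, n_draws, rng):
    counts = np.array(initial_counts, dtype=float)
    draws = np.zeros(len(counts), dtype=int)
    for _ in range(n_draws):
        color = rng.choice(len(counts), p=counts / counts.sum())
        draws[color] += 1
        counts[color] += c
    return draws / n_draws                    # empirical type (relative frequencies)

rng = np.random.default_rng(1)
print(polya_eggenberger([2, 3, 5], c=0, n_draws=10_000, rng=rng))  # ~ [0.2, 0.3, 0.5]
print(polya_eggenberger([2, 3, 5], c=1, n_draws=10_000, rng=rng))  # random (Dirichlet) limit
```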


arXiv: Data Analysis, Statistics and Probability | 2004

Maximum Probability and Maximum Entropy methods: Bayesian interpretation

Marian Grendar

(Jaynes') Method of (Shannon-Kullback's) Relative Entropy Maximization (REM or MaxEnt) can, at least in the discrete case, be viewed, according to the Maximum Probability Theorem (MPT), as an asymptotic instance of the Maximum Probability method (MaxProb). A simple Bayesian interpretation of MaxProb is given here. MPT carries the interpretation over into REM.
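A compact restatement of the relationship described above, with assumed notation (ν a vector of relative frequencies with denominator n, Π a feasible set, q the prior generator):

```latex
% MaxProb and its asymptotic REM/MaxEnt form (notation assumed).
\[
  \hat\nu_n = \arg\max_{\nu \in \Pi} \pi(\nu \mid q, n)
  \quad\text{(MaxProb, with $\pi$ the multinomial probability of $\nu$ under $q$)},
\]
\[
  \hat\nu_n \;\longrightarrow\; \arg\min_{p \in \Pi} D(p\,\|\,q)
  \quad (n\to\infty)
  \qquad\text{(REM/MaxEnt, by the Maximum Probability Theorem)}.
\]
```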


arXiv: Statistics Theory | 2001

MiniMax entropy and maximum likelihood: Complementarity of tasks, identity of solutions

Marian Grendar

The concept of the exponential family is generalized through simple and general exponential forms. Simple and general potentials are introduced. Maximum Entropy and Maximum Likelihood tasks are defined. The ML task on the simple exponential form and the ME task on the simple potentials are proved to be complementary in set-up and identical in solutions. The ML task on the general exponential form and the ME task on the general potentials are weakly complementary, leading to the same necessary conditions. A hypothesis about the complementarity of the ML and MiniMax Entropy tasks and the identity of their solutions, supported by an analytical special case as well as several numerical investigations, is put forward for this case. MiniMax Ent can be viewed as a generalization of MaxEnt for parametric linear inverse problems, and its complementarity with ML as yet another argument in favor of Shannon's entropy criterion.
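A standard special case of the claimed complementarity, shown only for orientation (classical moment constraints; the paper's simple and general forms are not reproduced):

```latex
% Maximizing entropy under moment constraints yields an exponential family, and
% maximum likelihood within that family matches the same moments.
\[
  \max_{p}\Big\{ -\sum_x p(x)\log p(x) : \sum_x p(x)\,T(x) = \bar t \Big\}
  \;\Longrightarrow\;
  p_\lambda(x) = \frac{\exp\!\big(\lambda^\top T(x)\big)}{Z(\lambda)},
\]
\[
  \text{and ML over } \{p_\lambda\} \text{ solves }
  \mathbb{E}_{p_\lambda}[T(X)] = \frac{1}{n}\sum_{i=1}^{n} T(x_i) = \bar t.
\]
```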


International Journal of Bifurcation and Chaos | 2013

Strong Laws for Recurrence Quantification Analysis

Marian Grendar; Jana Majerová; Vladimír Špitalský

The recurrence rate and determinism are two of the basic complexity measures studied in the recurrence quantification analysis. In this paper, the recurrence rate and determinism are expressed in terms of the correlation sum, and strong laws of large numbers are given for them.
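A minimal sketch of the two measures computed from a thresholded recurrence matrix (the tolerance, minimal line length and test series are illustrative choices):

```python
import numpy as np

# Recurrence rate (RR) and determinism (DET) from a thresholded recurrence matrix.
# RR is the fraction of recurrent pairs, which is essentially the correlation sum;
# DET is the fraction of recurrence points lying on diagonal lines of length >= l_min.
# The tolerance eps, l_min and the test series are illustrative choices.

def recurrence_matrix(x, eps):
    x = np.asarray(x, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)

def recurrence_rate(R):
    return R.mean()

def determinism(R, l_min=2):
    n = R.shape[0]
    on_lines = 0
    for k in range(-(n - 1), n):                          # scan every diagonal
        run = 0
        for v in list(np.diagonal(R, offset=k)) + [0]:    # sentinel closes the last run
            if v:
                run += 1
            else:
                if run >= l_min:
                    on_lines += run
                run = 0
    total = R.sum()
    return on_lines / total if total else 0.0

x = np.sin(0.3 * np.arange(500))                          # assumed test series
R = recurrence_matrix(x, eps=0.1)
print(recurrence_rate(R), determinism(R, l_min=2))
```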


Entropy | 2006

Entropy and Effective Support Size

Marian Grendar

The notion of effective size of support (Ess) of a random variable is introduced. A small set of natural requirements that a measure of Ess should satisfy is presented. The measure with the prescribed properties is in a direct (exp-) relationship to the family of Rényi's α-entropies, which also includes Shannon's entropy H. Considerations of the choice of the value of α imply that exp(H) appears to be the most appropriate measure of Ess. Thanks to their log/exp relationship, entropy and Ess can be viewed as two aspects of the same thing. In Probability and Statistics the Ess aspect could appear more basic than the entropic one.
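A minimal numerical sketch of exp(H) as a measure of effective support size (the distributions are illustrative):

```python
import numpy as np

# exp(H) as effective support size (Ess): for a uniform distribution on k outcomes
# it equals k exactly, while a highly concentrated distribution gives a value near 1.
# The distributions below are illustrative.

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi_entropy(p, alpha):
    p = np.asarray(p, dtype=float)
    if np.isclose(alpha, 1.0):
        return shannon_entropy(p)             # Shannon entropy as the alpha -> 1 limit
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

uniform4 = np.full(4, 0.25)
skewed   = np.array([0.97, 0.01, 0.01, 0.01])

print(np.exp(shannon_entropy(uniform4)))          # 4.0 -> Ess equals the support size
print(np.exp(shannon_entropy(skewed)))            # ~1.2 -> effectively one dominant outcome
print(np.exp(renyi_entropy(uniform4, alpha=2)))   # also 4.0 (exp of the collision entropy)
```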

Collaboration


Dive into Marian Grendar's collaborations.

Top Co-Authors

Zora Lasabova (Comenius University in Bratislava)
Jan Danko (Comenius University in Bratislava)
Robert K. Niven (University of New South Wales)
Pavol Zubor (Comenius University in Bratislava)
Katarina Zelinova (Comenius University in Bratislava)
Marianna Jagelkova (Comenius University in Bratislava)
Veronika Holubekova (Comenius University in Bratislava)
Zuzana Danková (Comenius University in Bratislava)
Andrea Kapinová (Comenius University in Bratislava)