Andrej Pázman
Comenius University in Bratislava
Publications
Featured research published by Andrej Pázman.
Statistics & Probability Letters | 2001
Andrej Pázman; Werner G. Müller
In this paper we consider optimal design of experiments in the case of correlated observations, when no replications are possible. This situation is typical when observing a random process or random field with known covariance structure. We present a theorem which demonstrates that the computation of optimum exact designs corresponds to solving minimization problems in terms of design measures.
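For a concrete picture of the setting, the sketch below (a rough illustration, not the paper's method) computes the Fisher information matrix M = F'C⁻¹F of an exact design under a known covariance structure and compares two candidate designs by the D-criterion. The quadratic mean, the exponential covariance kernel and the design points are assumptions made only for this example.

```python
import numpy as np

# Illustrative setting (assumed, not from the paper): observations of a random
# process y(x) = f(x)' theta + eps(x) at distinct design points, no replications,
# with a known covariance kernel for eps.

def regressors(x):
    # assumed quadratic mean: f(x) = (1, x, x^2)
    return np.array([1.0, x, x**2])

def covariance(points, sigma2=1.0, rho=0.5):
    # assumed exponential (Ornstein-Uhlenbeck type) covariance kernel
    d = np.abs(np.subtract.outer(points, points))
    return sigma2 * np.exp(-d / rho)

def information_matrix(points):
    # Fisher information of the exact design {x_1, ..., x_n} under GLS:
    # M = F' C^{-1} F, where F stacks f(x_i)' and C is the error covariance.
    F = np.array([regressors(x) for x in points])
    C = covariance(points)
    return F.T @ np.linalg.solve(C, F)

# Compare two candidate exact designs by the D-criterion log det M
design_a = np.array([0.0, 0.3, 0.6, 1.0])
design_b = np.array([0.0, 0.1, 0.2, 0.3])
for name, pts in [("spread-out", design_a), ("clustered", design_b)]:
    M = information_matrix(pts)
    print(name, "log det M =", np.log(np.linalg.det(M)))
```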
Journal of Statistical Planning and Inference | 1998
Werner G. Müller; Andrej Pázman
The concept of design measures is fundamental in the classical setup of experiments without restrictions on replications of observations (the resulting designs are called continuous or approximate). In this paper we extend this concept to experiments in which no such replications are allowed. The resulting interpretation of the design measure is different from the classical case. We introduce a corresponding information matrix, which approximates the Fisher information matrix for exact designs. The basic aim of the paper is to find an optimum exact design with a restricted total number of observations. The usefulness of the developed approach is illustrated by an algorithm that is employed to calculate optimal or improved designs numerically.
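As an illustration of the kind of computational problem addressed (not the authors' algorithm), the sketch below runs a generic one-point exchange heuristic for a D-optimal exact design with a fixed total number of observations and no replications; the candidate grid, the quadratic model and the uncorrelated-error assumption are all illustrative choices.

```python
import numpy as np

# Generic one-point exchange heuristic for a D-optimal exact design with a
# fixed total number n of observations and no replications.  The candidate
# grid and the linear model are assumptions for illustration; this is not the
# algorithm developed in the paper.

grid = np.linspace(-1.0, 1.0, 41)            # candidate design points
f = lambda x: np.array([1.0, x, x**2])       # assumed regression functions
n = 6                                        # restricted number of observations

def logdet_information(points):
    F = np.array([f(x) for x in points])
    M = F.T @ F                              # uncorrelated, unit-variance errors
    sign, logdet = np.linalg.slogdet(M)
    return logdet if sign > 0 else -np.inf

design = list(grid[np.linspace(0, len(grid) - 1, n, dtype=int)])  # initial design
improved = True
while improved:
    improved = False
    for i in range(n):
        for cand in grid:
            if cand in design:
                continue                     # no replications allowed
            trial = design.copy()
            trial[i] = cand
            if logdet_information(trial) > logdet_information(design) + 1e-10:
                design, improved = trial, True

print("exchange-improved exact design:", sorted(design))
```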
Statistics | 1984
Andrej Pázman
In the model considered, the error term is a zero-mean random vector with covariance matrix σ²I (σ² unknown).
Journal of Statistical Planning and Inference | 1992
Andrej Pázman; Luc Pronzato
Nonlinear experimental design relying on a basis less approximate than the asymptotic normality of the estimates is considered. Constraints on the model parameters are taken into account, and a new approximation for the density of constrained least-squares estimates is derived. This density is used to define an optimality criterion for experimental design, the optimization of which is carried out using stochastic approximation techniques. Various examples are considered.
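To indicate how a design criterion evaluated by simulation can be optimized with stochastic approximation, here is a minimal Kiefer-Wolfowitz-style sketch. The one-parameter exponential model, the Monte Carlo mean-squared-error criterion and the step-size schedules are assumptions for illustration, not the criterion derived in the paper.

```python
import numpy as np

# Kiefer-Wolfowitz-style stochastic approximation: the design is parameterized
# by one scalar x2 (second observation location), and the criterion is a noisy
# Monte Carlo estimate of the mean squared error of the least-squares estimate
# in an assumed model eta(x, theta) = exp(-theta * x).

rng = np.random.default_rng(0)
theta_true, sigma, x1 = 1.0, 0.1, 0.5

def noisy_criterion(x2, n_mc=200):
    # Monte Carlo estimate of E[(theta_hat - theta_true)^2] for design (x1, x2)
    xs = np.array([x1, x2])
    losses = []
    for _ in range(n_mc):
        y = np.exp(-theta_true * xs) + sigma * rng.standard_normal(2)
        # crude grid search for the least-squares estimate (keeps the sketch short)
        grid = np.linspace(0.1, 3.0, 300)
        rss = ((y[None, :] - np.exp(-grid[:, None] * xs[None, :]))**2).sum(axis=1)
        losses.append((grid[np.argmin(rss)] - theta_true)**2)
    return np.mean(losses)

x2 = 2.5                                      # initial second design point
for k in range(1, 51):
    a_k, c_k = 0.5 / k, 0.2 / k**0.25         # standard step-size schedules
    grad = (noisy_criterion(x2 + c_k) - noisy_criterion(x2 - c_k)) / (2 * c_k)
    x2 = np.clip(x2 - a_k * grad, 0.1, 5.0)   # project back onto the design region

print("stochastically approximated second design point:", round(float(x2), 3))
```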
Applications of Mathematics | 1999
Jean-Baptiste Denis; Andrej Pázman
General results giving the approximate bias for nonlinear models with constrained parameters are applied to bilinear models in the ANOVA framework, called biadditive models. Known results on the information matrix and the asymptotic variance matrix of the parameters are summarized, and the Jacobians and Hessians of the response and of the constraints are derived. These intermediate results are the basis for any subsequent second-order study of the model. Despite the large number of parameters involved, the bias formulae turn out to be quite simple due to the orthogonal structure of the model. In particular, the response estimators are shown to be approximately unbiased. Some simulations assess the validity of the approximations.
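As a generic illustration of a second-order bias approximation checked against simulation (much simpler than the constrained biadditive setting of the paper), the sketch below applies a Box-type bias formula to an assumed one-parameter exponential regression model and compares it with a Monte Carlo estimate of the bias.

```python
import numpy as np

# Box-type second-order bias approximation for nonlinear least squares,
# checked by simulation.  The model eta_i(theta) = exp(theta * x_i), the
# design and sigma are assumptions for illustration only.

rng = np.random.default_rng(1)
xs = np.array([0.2, 0.5, 0.8, 1.1, 1.4])
theta_true, sigma = 1.0, 0.3

def eta(theta):
    return np.exp(theta * xs)

F = (xs * np.exp(theta_true * xs))[:, None]          # Jacobian, n x 1
H = xs**2 * np.exp(theta_true * xs)                  # Hessians H_i (scalars, p = 1)

# Approximate bias: -(sigma^2/2) (F'F)^{-1} F' d,  with d_i = tr[(F'F)^{-1} H_i]
m_inv = 1.0 / float(F[:, 0] @ F[:, 0])
d = m_inv * H
bias_approx = -(sigma**2 / 2) * m_inv * float(F[:, 0] @ d)

# Monte Carlo bias of the least-squares estimate (crude grid search keeps it short)
grid = np.linspace(0.5, 1.5, 2001)
preds = np.exp(grid[:, None] * xs[None, :])          # 2001 x n matrix of fitted means
estimates = []
for _ in range(20_000):
    y = eta(theta_true) + sigma * rng.standard_normal(len(xs))
    estimates.append(grid[np.argmin(((y - preds)**2).sum(axis=1))])

print("approximate bias :", round(bias_approx, 5))
print("Monte Carlo bias :", round(float(np.mean(estimates) - theta_true), 5))
```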
Statistics & Probability Letters | 1996
Andrej Pázman; Luc Pronzato
We consider new approximations for the marginal density of parameter estimates in nonlinear regression, and more generally for the density of any smooth scalar function G(y) with y normally distributed. These approximations are derived via a Dirac-function technique.
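A minimal sketch of the object being approximated, assuming G(y) = ||y||²/2 with y ~ N(μ, I) in two dimensions: it contrasts a crude first-order (delta-method) normal approximation of the density of G(y) with a Monte Carlo estimate. This baseline is not the Dirac-function technique of the paper; it only shows what such density approximations are compared against.

```python
import numpy as np

# Density of a smooth scalar function G(y) of a normal vector y, estimated by
# Monte Carlo and compared with the crude first-order (delta-method) normal
# approximation.  G and mu are assumptions for illustration.

rng = np.random.default_rng(2)
mu = np.array([1.0, 0.5])

def G(y):
    return 0.5 * np.sum(y**2, axis=-1)

# Delta-method approximation: G(y) ~ N(G(mu), g' Sigma g) with g = grad G(mu)
g = mu                                    # gradient of ||y||^2 / 2 at mu is mu itself
mean_lin, var_lin = G(mu), float(g @ g)   # Sigma = I

# Monte Carlo reference for the density at a few points t
ys = mu + rng.standard_normal((200_000, 2))
samples = G(ys)
for t in (0.4, 0.8, 1.2):
    mc = np.mean(np.abs(samples - t) < 0.05) / 0.10          # histogram-bin estimate
    lin = np.exp(-(t - mean_lin)**2 / (2 * var_lin)) / np.sqrt(2 * np.pi * var_lin)
    print(f"t={t:.1f}  Monte Carlo density ~ {mc:.3f}   delta-method ~ {lin:.3f}")
```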
Archive | 1998
Andrej Pázman; Werner G. Müller
With the help of design measures, well known from classical design theory (Kiefer (1959)), we construct a smooth (differentiable) reformulation of the discontinuous problem of obtaining optimum exact designs for observations without replications and with known covariances. Typically such a problem appears when we have to observe a random field with a parametrized mean. Since replications are not allowed, the design measures must be used and interpreted in a new way. In this sense we reexpress the information matrices, and we give formulae for directional derivatives of criterion functions. These may serve as a basis for the iterative computation of optimum designs.
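For orientation only: in the classical uncorrelated setting with design measure ξ, the directional derivative of the D-criterion Φ(ξ) = log det M(ξ) towards a one-point measure δ_x is f(x)'M(ξ)⁻¹f(x) − p. The sketch below, with an assumed quadratic regression model, checks this formula numerically; the correlated no-replication setting of the paper uses the reexpressed information matrices described above.

```python
import numpy as np

# Directional derivative of the D-criterion Phi(xi) = log det M(xi) towards a
# one-point measure delta_x, compared with a finite-difference check.  The
# quadratic regression model and the design measure are assumed examples.

f = lambda x: np.array([1.0, x, x**2])
support = np.array([-1.0, 0.0, 1.0])
weights = np.array([0.2, 0.5, 0.3])

def info(support, weights):
    return sum(w * np.outer(f(x), f(x)) for x, w in zip(support, weights))

M = info(support, weights)
x_new, p, alpha = 0.5, 3, 1e-6

# analytic directional derivative towards delta_{x_new}
analytic = f(x_new) @ np.linalg.solve(M, f(x_new)) - p

# numerical check: differentiate Phi((1 - a) xi + a delta_x) at a = 0
M_shift = (1 - alpha) * M + alpha * np.outer(f(x_new), f(x_new))
numerical = (np.linalg.slogdet(M_shift)[1] - np.linalg.slogdet(M)[1]) / alpha

print("analytic:", round(float(analytic), 4), " numerical:", round(float(numerical), 4))
```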
Statistics | 1990
Andrej Pázman
We consider Gaussian nonlinear regression models with a constant information matrix (= models with constant asymptotic variance) and models which become such after a reparametrization (= "flat models"), including all one-dimensional nonlinear regression models. It is shown that a recently obtained nonasymptotic approximation of the probability density of the maximum likelihood (= least squares) estimator is particularly good in flat models. It is proved that under this approximate density the gradient of the squared distance between the true and the estimated means of the observed vector is nearly a normal random vector in models with a constant information matrix. This makes it possible to construct almost exact confidence regions in flat models, and to obtain approximate moments of the estimators.
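A small numerical illustration of the "flat model" idea, assuming the one-parameter model η_i(θ) = exp(−θx_i): the Fisher information in θ varies, but reparametrizing by the arc length of the curve θ ↦ η(θ) makes it constant, which is why all one-dimensional nonlinear regression models become flat after reparametrization.

```python
import numpy as np

# One-dimensional nonlinear regression: the Fisher information is
# ||d eta / d theta||^2 / sigma^2, and reparametrizing by arc length
# beta(theta) = integral of ||d eta / d t|| dt makes it constant.
# The model eta_i(theta) = exp(-theta * x_i) is an assumed example.

xs = np.linspace(0.2, 2.0, 10)
sigma = 0.1
deta = lambda th: -xs * np.exp(-th * xs)              # d eta / d theta

thetas = np.linspace(0.5, 2.0, 200)
speed = np.array([np.linalg.norm(deta(t)) for t in thetas])

info_theta = speed**2 / sigma**2                      # information in theta: varies
beta = np.concatenate([[0.0], np.cumsum(0.5 * (speed[1:] + speed[:-1])
                                        * np.diff(thetas))])   # arc length

# information in beta:  d eta/d beta = (d eta/d theta) / (d beta/d theta)
info_beta = (speed / np.gradient(beta, thetas))**2 / sigma**2

print(f"information in theta varies from {info_theta.min():.0f} to {info_theta.max():.0f}")
print(f"information in beta is nearly constant: {info_beta.min():.1f} to {info_beta.max():.1f}")
```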
Statistics | 1987
Andrej Pázman
Explicit formulas approximating the density of the least squares estimates in Gaussian nonlinear regression are discussed. A new, simpler proof is presented for the formula given in A. PAZMAN [6], and the terms evaluating the difference between the true and the approximate formulas are corrected. The investigation presented is nonasymptotic (i.e. for a fixed sample size).
Statistics | 1980
Andrej Pázman
In this survey paper the regression model with uncorrelated observations and its infinite-dimensional generalization are considered. The emphasis is on a functional approach to the experiment in both models. Results of several authors on the convex properties of the decision problem connected with experimental design are collected. The iterative methods of computing optimal designs for convex criterion functions are considered from a geometrical aspect. Singular designs and designs for nonlinear estimation are considered as well.
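To illustrate the iterative methods for convex criterion functions that the survey discusses, here is a minimal sketch of a classical Wynn-Fedorov type vertex-direction iteration for a D-optimal continuous design; the quadratic regression model and the candidate grid are assumed for illustration.

```python
import numpy as np

# Wynn-Fedorov type iteration for a D-optimal continuous design: at each step
# move a small amount of design weight towards the candidate point with the
# largest variance function d(x, xi) = f(x)' M(xi)^{-1} f(x).

f = lambda x: np.array([1.0, x, x**2])
grid = np.linspace(-1.0, 1.0, 101)
F = np.array([f(x) for x in grid])
p = F.shape[1]

w = np.full(len(grid), 1.0 / len(grid))       # start from the uniform design measure
for k in range(1, 301):
    M = F.T @ (w[:, None] * F)                # information matrix of the measure
    var = np.einsum("ij,jk,ik->i", F, np.linalg.inv(M), F)   # variance function
    j = int(np.argmax(var))                   # most informative candidate point
    alpha = 1.0 / (k + p)                     # vanishing step length
    w = (1 - alpha) * w
    w[j] += alpha

support = grid[w > 1e-3]
print("approximate D-optimal support:", np.round(support, 2))
print("max variance (approaches p = 3 at the optimum):", round(float(var.max()), 3))
```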