Xavier Guyon
University of Paris
Publication
Featured research published by Xavier Guyon.
Archive | 1992
Xavier Guyon; Hans R. Künsch
Because of their use as priors in image analysis, interest in parameter estimation for Gibbs random fields has risen recently. Gibbs fields form an exponential family, so maximum likelihood would be the estimator of first choice. Unfortunately, it is extremely difficult to compute. Other estimators which are easier to compute have been proposed: the coding and the pseudo-maximum likelihood estimators (Besag, 1974), a minimum chi-square estimator (Glotzl and Rauchenschwandtner, 1981; Possolo, 1986a) and the conditional least squares estimator (Lele and Ord, 1986); cf. the definitions in Section 2.2. These estimators are all known to be consistent. Hence it is natural to compare efficiency among these simple estimators and with respect to the maximum likelihood estimator. We do this here in the simplest non-trivial case, the d-dimensional nearest-neighbour isotropic Ising model with external field. We show that both the pseudo-maximum likelihood and the conditional least squares estimator are asymptotically equivalent to a minimum chi-square estimator when the weight matrix for the latter is chosen appropriately (Corollary 2). These weight matrices differ from the optimal matrix; hence we expect the resulting estimators to differ as well, although in all our examples the maximum pseudo-likelihood and the minimum chi-square estimator with optimal weights turned out to be asymptotically equivalent. In particular, our results do not confirm the superior behaviour of minimum chi-square over pseudo-maximum likelihood reported in Possolo (1986a). By way of example, we show that conditional least squares and minimum chi-square with the identity matrix as weights can be worse than the optimal minimum chi-square estimator. Compared with maximum likelihood, the easily computable estimators are not bad if the interaction is weak, but much worse if the interaction is strong. Our results suggest that their asymptotic efficiency tends to zero as one approaches the critical point.
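The pseudo-maximum likelihood idea discussed in this abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: a 2-d nearest-neighbour Ising field with spins in {-1, +1}, interaction parameter beta and external field h (the parameter names, the periodic boundary and the toy data are assumptions).

```python
# Pseudo-maximum likelihood for a 2-d nearest-neighbour Ising field with
# external field: an illustrative sketch, not the paper's implementation.
import numpy as np
from scipy.optimize import minimize

def neighbour_sum(x):
    """Sum of the four nearest-neighbour spins (periodic boundary)."""
    return (np.roll(x, 1, 0) + np.roll(x, -1, 0) +
            np.roll(x, 1, 1) + np.roll(x, -1, 1))

def neg_log_pseudo_likelihood(theta, x):
    """-log PL(beta, h) = -sum_i log P(x_i | neighbours of i)."""
    beta, h = theta
    s = neighbour_sum(x)
    a = h + beta * s
    # log P(x_i | .) = x_i * a_i - log(2 cosh a_i) for spins in {-1, +1}
    return -(x * a - np.logaddexp(a, -a)).sum()

rng = np.random.default_rng(0)
x = rng.choice([-1, 1], size=(64, 64))   # toy field; real data would come from a Gibbs sampler
fit = minimize(neg_log_pseudo_likelihood, x0=[0.0, 0.0], args=(x,))
beta_hat, h_hat = fit.x
print(beta_hat, h_hat)
```

Because the toy field above is pure noise, the fitted beta_hat should be close to zero; fitting a field simulated at moderate interaction strength would recover the simulation parameters, while near the critical point the estimate becomes much more variable, in line with the efficiency loss described in the abstract.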
Probability Theory and Related Fields | 1984
Xavier Guyon; Sylvia Richardson
Summary: We study the rate of convergence in the central limit theorem for weakly dependent random fields on $\mathbb{Z}^d$: m-dependent or α-strongly mixing. As soon as the field is in $L^{2+\delta}$, $\delta > 0$, the rate of convergence obtained is $\sigma_n^{-(\tau \wedge 1)}$, with a factor $(\log \sigma_n)^a$ that appears when α decays exponentially and, in the m-dependent case, when $\delta \ge 1$. The case where α decays at a polynomial rate is also studied. These results involve neither stationarity nor the geometry of the domains over which the CLT is studied.
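Schematically, and with notation that is assumed here rather than taken from the paper ($S_n$ the sum of the field over a finite domain $D_n$, $\sigma_n^2 = \operatorname{Var}(S_n)$, $\Phi$ the standard normal distribution function), the rate described above can be written as follows.

```latex
% Schematic statement of the convergence rate; notation and the Kolmogorov
% distance are assumptions, and the constant C is left unspecified.
\[
  \sup_{x \in \mathbb{R}}
  \left| \mathbb{P}\!\left( \frac{S_n}{\sigma_n} \le x \right) - \Phi(x) \right|
  \;\le\; C \, \sigma_n^{-(\tau \wedge 1)} \,(\log \sigma_n)^{a},
\]
% where the logarithmic factor appears only when the mixing coefficient
% $\alpha$ decays exponentially, or in the $m$-dependent case when $\delta \ge 1$.
```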
Statistics | 2001
Xavier Guyon; C. Hardouin
This study deals with the time dynamics of Markov fields defined on a finite set of sites with state space E, focusing on Markov Chain Markov Field (MCMF) evolution. Such a model is characterized by two families of potentials: the instantaneous interaction potentials and the time-delay potentials. Four models are specified: auto-exponential dynamics (E = R+), auto-normal dynamics (E = R), auto-Poissonian dynamics (E = N) and auto-logistic dynamics (E qualitative and finite). Sufficient conditions ensuring ergodicity and a strong law of large numbers are given using a Lyapunov stability criterion, and the conditional pseudo-likelihood statistics are summarized. We discuss the identification procedure for the two Markovian graphs and derive validation tests using martingale central limit theorems. An application to meteorological data illustrates this modelling.
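A minimal simulation sketch of one MCMF flavour, the auto-logistic dynamics on a binary lattice, is given below. The parametrisation (an instantaneous neighbour potential a, a one-step time-delay potential b, an intercept c) and the Gibbs-sweep sampling of the conditional field are assumptions for illustration, not the paper's specification.

```python
# Auto-logistic Markov Chain Markov Field: illustrative one-step-memory dynamics
# on a binary (0/1) lattice; parametrisation and sampling scheme are assumptions.
import numpy as np

rng = np.random.default_rng(1)

def gibbs_sweep(y, y_prev, c, a, b):
    """One sequential Gibbs sweep of the field at time t given the field at t-1."""
    n, m = y.shape
    for i in range(n):
        for j in range(m):
            s = (y[(i - 1) % n, j] + y[(i + 1) % n, j] +
                 y[i, (j - 1) % m] + y[i, (j + 1) % m])   # instantaneous neighbours
            logit = c + a * s + b * y_prev[i, j]           # time-delay term from t-1
            p = 1.0 / (1.0 + np.exp(-logit))
            y[i, j] = rng.random() < p
    return y

def simulate(T=50, shape=(32, 32), c=-1.0, a=0.4, b=0.8, sweeps=5):
    """Simulate T time steps of the dynamics, returning the list of fields."""
    fields = [rng.random(shape) < 0.5]
    for _ in range(T):
        y = fields[-1].astype(float).copy()
        for _ in range(sweeps):                            # approximate the conditional law
            y = gibbs_sweep(y, fields[-1].astype(float), c, a, b)
        fields.append(y.astype(bool))
    return fields

fields = simulate()
print(np.mean(fields[-1]))   # fraction of occupied sites at the final time
```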
Probability Theory and Related Fields | 1987
Xavier Guyon
Summary: Let $X_t, t \in \mathbb{R}^d$, be a stationary Gaussian random field with covariance R. For d = 1 and d = 2, families of variations are described. The convergence in mean square of these variations, and the subsequent identification of a model for X, are studied. Under suitable global conditions on R, the behaviour of these variations depends on the local behaviour of R near the origin. The differences between the cases d = 1 and d = 2 are particularly emphasised: for d = 1 there exists only one variation, whereas for d = 2 several families of variations are available, which provide a useful tool for identifying different models; for example, Ornstein-Uhlenbeck processes can be identified in mean square on $\mathbb{R}$.
Archive | 2010
Carlo Gaetan; Xavier Guyon
Statistics & Probability Letters | 2000
Xavier Guyon; Olivier Perrin
Statistics | 2007
Xavier Guyon; Besnik Pumo
Annales de l'Institut Henri Poincaré, Probabilités et Statistiques | 2002
Sandie Souchet; Xavier Guyon
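To make the variation-based identification described in the 1987 abstract above concrete for d = 1, here is a minimal numerical sketch. The Ornstein-Uhlenbeck construction, the covariance $R(h) = \sigma^2 e^{-\theta |h|}$ and the claim that the quadratic variation over [0, 1] identifies the product $\sigma^2 \theta$ are standard facts used for illustration, not details taken from the paper.

```python
# Quadratic variation of a stationary Ornstein-Uhlenbeck process on [0, 1]:
# an illustrative check that the variation identifies sigma^2 * theta.
import numpy as np

rng = np.random.default_rng(2)
theta, sigma = 2.0, 1.5
n = 200_000
dt = 1.0 / n

# Exact stationary OU recursion: X_{k+1} = rho X_k + sqrt(sigma^2 (1 - rho^2)) eps_k
rho = np.exp(-theta * dt)
x = np.empty(n + 1)
x[0] = sigma * rng.standard_normal()
noise = sigma * np.sqrt(1.0 - rho**2) * rng.standard_normal(n)
for k in range(n):
    x[k + 1] = rho * x[k] + noise[k]

# Since E[(X_{t+h} - X_t)^2] = 2 sigma^2 (1 - e^{-theta h}) ~ 2 sigma^2 theta h,
# the quadratic variation over [0, 1] converges in mean square to 2 sigma^2 theta.
qv = np.sum(np.diff(x) ** 2)
print(qv / 2.0, sigma**2 * theta)   # the two numbers should be close
```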