Publication


Featured research published by Peter Ney.


Stochastic Processes and their Applications | 1981

A refinement of the coupling method in renewal theory

Peter Ney

An overjump Markov chain associated with a pair of random walks is used to obtain a sharp estimate of their coupling time, and thence of the convergence rate of a renewal process. Lattice-valued and non-singular cases are treated.
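
The coupling idea behind this estimate can be illustrated with a small simulation that is not taken from the paper: two independent lattice renewal processes are run, the one lagging behind is advanced at each step, and the first common renewal epoch serves as an empirical coupling time. The interarrival law and the function names below are assumptions made for the demo.

```python
import random

def coupling_time(interarrival, max_epoch=100_000, seed=0):
    """Advance whichever of two independent lattice renewal processes lags behind
    until their renewal epochs coincide; return that first common epoch.
    `interarrival` is a callable returning a positive integer step."""
    rng = random.Random(seed)
    s, t = 0, 1                      # the second process starts with a unit delay
    while s != t and max(s, t) < max_epoch:
        if s < t:
            s += interarrival(rng)
        else:
            t += interarrival(rng)
    return s if s == t else None

# Aperiodic lattice interarrivals taking the values 1 or 2 with equal probability.
law = lambda rng: rng.choice([1, 2])
times = [coupling_time(law, seed=k) for k in range(1000)]
times = [x for x in times if x is not None]
print("empirical mean coupling epoch:", sum(times) / len(times))
```

Bounding the tail of this coupling time is what translates into a rate of convergence for the associated renewal process.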


Ergodic Theory and Dynamical Systems | 2000

Cesàro mean distribution of group automata starting from measures with summable decay

Pablo A. Ferrari; Alejandro Maass; Servet Martínez; Peter Ney

Consider a finite Abelian group (G, +) with |G| = p^r, p a prime number, and let F: G^N -> G^N be the cellular automaton given by {F(x)}_n = A x_n + B x_{n+1} for every n in N, where A and B are integers relatively prime to p. We prove that if P is a translation-invariant probability measure on G^Z determining a chain with complete connections and summable decay of correlations, then for any w = (w_i : i < 0) the Cesàro mean distribution of the time iterates of the automaton with initial distribution P_w (the law P conditioned to w on the left of the origin) converges to the uniform product measure on G^N. The proof uses a regeneration representation of P.
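
As a purely illustrative sketch (not the paper's argument), one can take the simplest instance G = Z_2 with A = B = 1, start from a biased i.i.d. Bernoulli measure, which satisfies the hypotheses, and estimate by Monte Carlo the Cesàro mean of the law of the coordinate at the origin; it should drift toward the uniform value 1/2. The bias q, the horizon T, and the trial count are arbitrary choices for the demo.

```python
import random

def cesaro_origin_law(q=0.8, T=256, trials=300, seed=0):
    """Estimate the Cesàro average over times 0..T-1 of P(x_0(t) = 1) for the
    automaton F(x)_n = x_n + x_{n+1} (mod 2) started from i.i.d. Bernoulli(q)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        # initial window long enough to survive T applications of F
        x = [1 if rng.random() < q else 0 for _ in range(T + 1)]
        ones = 0
        for _ in range(T):
            ones += x[0]
            x = [(x[n] + x[n + 1]) % 2 for n in range(len(x) - 1)]
        total += ones / T
    return total / trials

print(cesaro_origin_law())   # expected to be close to 0.5 despite the biased start
```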


Annals of Applied Probability | 2004

Local limit theory and large deviations for supercritical branching processes

Peter Ney; Anand N. Vidyashankar

In this paper we study several aspects of the growth of a supercritical Galton-Watson process {Z_n:n\ge1}, and bring out some criticality phenomena determined by the Schröder constant. We develop the local limit theory of Z_n, that is, the behavior of P(Z_n=v_n) as v_n\nearrow \infty, and use this to study conditional large deviations of {Y_{Z_n}:n\ge1}, where Y_n satisfies an LDP, particularly of {Z_n^{-1}Z_{n+1}:n\ge1} conditioned on Z_n\ge v_n.
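
A rough numerical companion (an assumption-laden sketch, not the paper's method): simulate a supercritical Galton-Watson process and look at the ratio Z_{n+1}/Z_n on the event that Z_n is large; it concentrates near the offspring mean m, the typical behaviour against which the conditional large deviations are measured. The offspring law, the threshold 50, and the horizon n below are arbitrary.

```python
import random

def gw_path(offspring_probs, n, seed):
    """Simulate Z_0 = 1, ..., Z_n for a Galton-Watson process whose offspring
    distribution is given by offspring_probs[k] = P(offspring = k)."""
    rng = random.Random(seed)
    ks = list(range(len(offspring_probs)))
    z, path = 1, [1]
    for _ in range(n):
        z = sum(rng.choices(ks, weights=offspring_probs, k=z)) if z > 0 else 0
        path.append(z)
    return path

# Hypothetical supercritical offspring law: P(0)=0.2, P(1)=0.3, P(2)=0.5, mean m = 1.3.
probs, m, n = [0.2, 0.3, 0.5], 1.3, 20
ratios = []
for s in range(2000):
    path = gw_path(probs, n + 1, seed=s)
    if path[n] >= 50:                       # keep paths on which Z_n is already large
        ratios.append(path[n + 1] / path[n])
print(f"mean of Z_(n+1)/Z_n given Z_n >= 50: {sum(ratios)/len(ratios):.3f}  (m = {m})")
```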


Journal of Mathematical Analysis and Applications | 1965

The convergence of a random distribution function associated with a branching process

Peter Ney

This paper presents an extension of the results of [1] to age-dependent branching processes. At the same time (and without much additional difficulty) we deal with a branching process on a general state space, rather than with the special model of the binary cascade treated in [1]. We start with a “standard” age-dependent process, defined as in Chapter VI of Harris [2]. An initial “parent” particle splits after a random time T into a random number of “offspring” particles. Each of the offspring acts independently as a parent, and after random times (each independently distributed as T) produces the next generation of offspring, etc. Let N_t denote the number of particles existing at time t. The process to be studied here is constructed from the standard one by associating with each particle a “type,” namely, a point x in a d-dimensional Euclidean space Q. Thus at any given time, each particle existing at that time is to be considered as located at a point in Q. (In various applications the coordinates of x will be such quantities as the energy, size, age, or location of the associated particle.) Our purpose is to study the diffusion of the particles throughout Q. Let A be a subset of Q, N_t(A) be the number of particles in A at time t, and M_t(A) = N_t(A)/N_t be the proportion of particles in A at t. Note that M_t(·) is a random measure; i.e., for each sample path (realization) of the branching process, M_t(·) is a measure for each t. To obtain a nondegenerate limit law we shall let the set A vary with time, and consider a process of the form M_t(A_t). We shall show that by letting A_t grow in a suitable manner, we can attain the convergence (in mean square) of M_t(A_t) to a Gaussian probability function. This, essentially, is the content of Theorem 3 and Remark 3 below.
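
The flavour of the result can be illustrated with a much simpler discrete-generation stand-in (an assumption, not the paper's age-dependent setup): a binary branching random walk in which each offspring is displaced from its parent by an independent standard normal step. The proportion of generation-n particles lying in the growing set A_n = (-inf, c*sqrt(n)] is then compared with the Gaussian probability Phi(c).

```python
import math
import random

def brw_positions(n, seed=0):
    """Binary branching random walk: every particle has two offspring, each displaced
    from its parent by an independent N(0, 1) step; return generation-n positions."""
    rng = random.Random(seed)
    positions = [0.0]
    for _ in range(n):
        positions = [x + rng.gauss(0.0, 1.0) for x in positions for _ in range(2)]
    return positions

n, c = 16, 1.0
pos = brw_positions(n)
threshold = c * math.sqrt(n)                      # the growing set A_n = (-inf, c*sqrt(n)]
M_n = sum(1 for x in pos if x <= threshold) / len(pos)
Phi_c = 0.5 * (1.0 + math.erf(c / math.sqrt(2)))  # standard normal c.d.f. at c
print(f"M_n(A_n) = {M_n:.3f}   vs   Phi(c) = {Phi_c:.3f}")
# The two numbers should be in the same ballpark; displacements inherited from the
# first few generations shift the whole cloud, so some fluctuation persists.
```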


Journal of Theoretical Probability | 1995

Occupation measures for Markov chains

I. H. Dinwoodie; Peter Ney

We give simple proofs of large deviation theorems for the occupation measure of a Markov chain using a regeneration argument to establish existence and convexity theory to identify the rate function.
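
A minimal numerical illustration (not the paper's regeneration proof): for a small two-state chain, the occupation measure L_n concentrates on the stationary law, and the probability of a fixed deviation shrinks rapidly, roughly exponentially, as the path length n grows. The transition matrix and the deviation size 0.15 are arbitrary choices.

```python
import random

# Hypothetical 2-state transition matrix; its stationary law puts mass 5/6 on state 0.
P = {0: [0.9, 0.1], 1: [0.5, 0.5]}
pi0 = 5 / 6

def occupation_of_state0(n, rng):
    """Fraction of the first n steps spent in state 0 (the occupation measure of 0)."""
    x, visits = 0, 0
    for _ in range(n):
        visits += (x == 0)
        x = rng.choices([0, 1], weights=P[x])[0]
    return visits / n

rng = random.Random(1)
for n in (50, 100, 200):
    hits = sum(occupation_of_state0(n, rng) < pi0 - 0.15 for _ in range(2000))
    print(f"n = {n}: P(L_n(0) < pi(0) - 0.15) ~ {hits / 2000:.4f}")
```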


Archive | 1986

Some limit theorems for Markov additive processes

Peter Ney; Esa Nummelin

At the Semi-Markov Symposium we presented some new results on Markov-additive processes which will be published in Ney and Nummelin (1984), Ney and Nummelin (1985). In their proofs we used regeneration constructions similar to those in several previous papers (Athreya and Ney, 1978; Nummelin, 1978; Iscoe, Ney and Nummelin, 1984). The proof of the particular regeneration used in Ney and Nummelin (1984) was omitted, and we will now provide it here. We also prove a slight extension of the results in Iscoe, Ney and Nummelin (1984); Ney and Nummelin (1984), and summarize the results we announced at the symposium.


Archive | 1991

Branching Random Walk

Peter Ney

In a volume dedicated to Ted Harris, it is appropriate that there should be some discussion of branching processes, a subject of which he is one of the founders. In a series of papers in the 1940’s and 50’s (see references [1] to [9] at the end of this paper), culminating in his famous 1963 book “The Theory of Branching Processes” [10], he helped to lay the rigorous mathematical foundations of the subject, to answer a number of basic questions, and to show the direction of many future lines of research.


Advances in Applied Probability | 1974

Critical Branching Processes

Peter Ney

This paper develops a comparison method for critical branching processes. The method is applied to prove the exponential limit law for the multi-type age-dependent process under second moment conditions.
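
A hedged sketch of the kind of limit involved, using the simplest single-type Galton-Watson instance rather than the paper's multi-type age-dependent setting: for a critical offspring law with variance sigma^2, Z_n/n conditioned on survival is approximately exponential with mean sigma^2/2 (Yaglom's theorem). The offspring law, horizon, and sample size below are arbitrary.

```python
import random

probs = [0.5, 0.0, 0.5]   # critical offspring law P(0) = P(2) = 1/2: mean 1, variance 1
n, samples = 100, 5000

def z_n(rng):
    """Value of Z_n for one realisation of the critical Galton-Watson process."""
    z = 1
    for _ in range(n):
        if z == 0:
            return 0
        z = sum(rng.choices([0, 1, 2], weights=probs, k=z))
    return z

rng = random.Random(0)
scaled = [z / n for z in (z_n(rng) for _ in range(samples)) if z > 0]
print(f"survivors: {len(scaled)}/{samples}, "
      f"mean of Z_n/n on survival: {sum(scaled)/len(scaled):.2f} (limit law has mean 0.50)")
```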


Archive | 1972

Multi-Type Branching Processes

Krishna B. Athreya; Peter Ney

The processes we have studied till now have all consisted of indistinguishable particles. After starting with fixed unit lifetimes for the Galton-Watson process, we considered exponential, and then arbitrary lifetime distributions.


Archive | 1999

The Gibbs Conditioning Principle for Markov Chains

Ana Meda; Peter Ney

Let X_1, X_2, \ldots be an irreducible Markov chain taking values in a measurable space (S, \mathcal{S}). Let u: S^2 \to \mathbb{R}^d, U_n = \sum_{i=1}^{n} u(X_i, X_{i+1}), and let C \subset \mathbb{R}^d be open and convex. Then, conditioned on \{U_n \in nC\} (and under some hypotheses on \{X_n\}), it is shown that \{X_n\} converges to a Markov chain whose transition mechanism is specified.
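
A toy Monte Carlo illustration of the conditioning, with everything below chosen purely for the demo (it is not the paper's construction): take a two-state chain, let u(x, y) = 1{y = 1} so that U_n/n is the fraction of time spent in state 1, condition on U_n/n landing in C = (1/2, 1), and compare the empirical transition frequencies of the retained paths with the unconditioned transition probabilities; conditioning visibly tilts them.

```python
import random
from collections import Counter

P = {0: [0.7, 0.3], 1: [0.6, 0.4]}   # hypothetical unconditioned transition matrix
n, trials = 60, 20000

def sample_path(rng):
    """One path of length n, returned as a list of (state, next_state) transitions."""
    x, steps = 0, []
    for _ in range(n):
        y = rng.choices([0, 1], weights=P[x])[0]
        steps.append((x, y))
        x = y
    return steps

rng = random.Random(0)
cond = Counter()
for _ in range(trials):
    steps = sample_path(rng)
    u_n = sum(1 for _, y in steps if y == 1) / n
    if 0.5 < u_n < 1:                        # the conditioning event {U_n in nC}
        cond.update(steps)

for x in (0, 1):
    total = cond[(x, 0)] + cond[(x, 1)]
    print(f"conditioned P({x} -> 1) ~ {cond[(x, 1)] / total:.2f}   "
          f"(unconditioned: {P[x][1]:.2f})")
```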

Collaboration


Dive into Peter Ney's collaborations.

Top Co-Authors

Krishna B. Athreya

University of Wisconsin-Madison

Ana Meda

University of Wisconsin-Madison
