Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Dan Crisan is active.

Publication


Featured research published by Dan Crisan.


IEEE Transactions on Signal Processing | 2002

A survey of convergence results on particle filtering methods for practitioners

Dan Crisan; Arnaud Doucet

Optimal filtering problems are ubiquitous in signal processing and related fields. Except for a restricted class of models, the optimal filter does not admit a closed-form expression. Particle filtering methods are a set of flexible and powerful sequential Monte Carlo methods designed to solve the optimal filtering problem numerically. The posterior distribution of the state is approximated by a large set of Dirac-delta masses (samples/particles) that evolve randomly in time according to the dynamics of the model and the observations. The particles are interacting; thus, classical limit theorems relying on statistically independent samples do not apply. In this paper, our aim is to present a survey of convergence results on this class of methods to make them accessible to practitioners.
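
For orientation, a minimal sketch of the bootstrap particle filter described above, applied to a hypothetical one-dimensional linear-Gaussian state-space model; the model, noise scales, and particle count are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative state-space model (assumed for this sketch):
#   X_t = 0.9 * X_{t-1} + V_t,  V_t ~ N(0, 1)      (signal dynamics)
#   Y_t = X_t + W_t,            W_t ~ N(0, 0.5^2)  (observation)
T, N = 50, 1000                      # time steps, number of particles
sigma_v, sigma_w = 1.0, 0.5

# Simulate synthetic data from the assumed model
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * x[t - 1] + sigma_v * rng.standard_normal()
y = x + sigma_w * rng.standard_normal(T)

# Bootstrap particle filter: propagate, weight by likelihood, resample
particles = rng.standard_normal(N)
estimates = np.zeros(T)
for t in range(T):
    # 1. Propagate particles through the signal dynamics
    particles = 0.9 * particles + sigma_v * rng.standard_normal(N)
    # 2. Weight by the observation likelihood p(y_t | x_t)
    log_w = -0.5 * ((y[t] - particles) / sigma_w) ** 2
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    # 3. Posterior-mean estimate (Dirac-mass approximation of the filter)
    estimates[t] = np.sum(w * particles)
    # 4. Multinomial resampling: the particles interact through this step,
    #    which is why i.i.d. limit theorems do not apply directly
    particles = rng.choice(particles, size=N, p=w)

print("RMSE of filtered mean vs. true state:", np.sqrt(np.mean((estimates - x) ** 2)))
```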


Archive | 2009

Fundamentals of stochastic filtering

Alan Bain; Dan Crisan

Contents: Filtering Theory; The Stochastic Process π; The Filtering Equations; Uniqueness of the Solution to the Zakai and the Kushner–Stratonovich Equations; The Robust Representation Formula; Finite-Dimensional Filters; The Density of the Conditional Distribution of the Signal; Numerical Algorithms; Numerical Methods for Solving the Filtering Problem; A Continuous Time Particle Filter; Particle Filters in Discrete Time.


Archive | 2001

Particle Filters — A Theoretical Perspective

Dan Crisan

The purpose of this chapter is to present a rigorous mathematical treatment of the convergence of particle filters. In general, we follow the notation and settings suggested by the editors, any extra notation being defined in the next section. Section 2.3.1 contains the main results of the chapter: Theorems 2.3.1 and 2.3.2 provide necessary and sufficient conditions for the convergence of the particle filter to the posterior distribution of the signal. As an application of these results, we prove the convergence of a certain class of particle filters. This class includes several known filters (such as those presented in Carpenter, Clifford and Fearnhead (1999b), Crisan, Del Moral and Lyons (1999), and Gordon et al. (1993)), but is by no means the most general one. Finally, we discuss some of the issues that are relevant in applications and which arise from the theoretical analysis of these methods.
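
Convergence results of this type typically take the following generic shape, shown for orientation only; this is the familiar form of such statements, not a verbatim restatement of Theorems 2.3.1 or 2.3.2:

```latex
% Generic shape of particle-filter convergence results (illustrative only,
% not a verbatim restatement of Theorems 2.3.1/2.3.2): the empirical measure
% \pi_t^N = \frac{1}{N}\sum_{i=1}^N \delta_{x_t^{(i)}} of N particles
% approximates the posterior \pi_t, in the sense that for bounded test
% functions \varphi
\lim_{N\to\infty} \mathbb{E}\bigl[\,\lvert \pi_t^N(\varphi) - \pi_t(\varphi) \rvert\,\bigr] = 0,
% and, under stronger assumptions, with the familiar Monte Carlo rate
\mathbb{E}\bigl[\,\lvert \pi_t^N(\varphi) - \pi_t(\varphi) \rvert\,\bigr]
  \le \frac{c_t\, \lVert \varphi \rVert_\infty}{\sqrt{N}} .
```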


Annals of Applied Probability | 2014

On the stability of sequential Monte Carlo methods in high dimensions

Alexandros Beskos; Dan Crisan; Ajay Jasra

We investigate the stability of a sequential Monte Carlo (SMC) method applied to the problem of sampling from a target distribution on R^d for large d. It is well known [Bengtsson, Bickel and Li, in Probability and Statistics: Essays in Honor of David A. Freedman, D. Nolan and T. Speed, eds. (2008) 316–334 IMS; see also Pushing the Limits of Contemporary Statistics (2008) 318–329 IMS, Mon. Weather Rev. 136 (2009) 4629–4640] that using a single importance sampling step, one produces an approximation for the target that deteriorates as the dimension d increases, unless the number of Monte Carlo samples N increases at an exponential rate in d. We show that this degeneracy can be avoided by introducing a sequence of artificial targets, starting from a “simple” density and moving to the one of interest, using an SMC method to sample from the sequence; see, for example, Chopin [Biometrika 89 (2002) 539–551]; see also [J. R. Stat. Soc. Ser. B Stat. Methodol. 68 (2006) 411–436, Phys. Rev. Lett. 78 (1997) 2690–2693, Stat. Comput. 11 (2001) 125–139]. Using this class of SMC methods with a fixed number of samples, one can produce an approximation for which the effective sample size (ESS) converges to a random variable e_N as d → ∞, with 1 < e_N < N. The convergence is achieved with a computational cost proportional to Nd^2. If e_N ≪ N, we can raise its value by introducing a number of resampling steps, say m (where m is independent of d). In this case, the ESS converges to a random variable e_{N,m} as d → ∞ and lim_{m→∞} e_{N,m} = N. Also, we show that the Monte Carlo error for estimating a fixed-dimensional marginal expectation is of order 1/√N uniformly in d. The results imply that, in high dimensions, SMC algorithms can efficiently control the variability of the importance sampling weights and estimate fixed-dimensional marginals at a cost which is less than exponential in d, and indicate that resampling leads to a reduction in the Monte Carlo error and an increase in the ESS. All of our analysis is made under the assumption that the target density is i.i.d.
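
A minimal sketch of the kind of SMC sampler analysed here, moving through a sequence of tempered artificial targets with ESS-triggered resampling and a Metropolis move at each step; the Gaussian prior and target, the temperature ladder, and all numerical settings are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(1)

d, N = 50, 500                               # dimension and number of samples (illustrative)
n_steps = 40                                 # number of artificial intermediate targets
betas = np.linspace(0.0, 1.0, n_steps + 1)   # temperature ladder from prior to target

# Assumed example: prior N(0, 4 I_d), target N(0, I_d), both with i.i.d.
# coordinates, mirroring the i.i.d.-target assumption in the abstract.
def log_prior(x):  return -0.5 * np.sum(x ** 2, axis=1) / 4.0
def log_target(x): return -0.5 * np.sum(x ** 2, axis=1)

x = 2.0 * rng.standard_normal((N, d))        # samples from the prior
log_w = np.zeros(N)

for b_prev, b in zip(betas[:-1], betas[1:]):
    # Incremental importance weights between consecutive tempered targets
    log_w += (b - b_prev) * (log_target(x) - log_prior(x))
    w = np.exp(log_w - log_w.max()); w /= w.sum()
    ess = 1.0 / np.sum(w ** 2)
    # Resample when the effective sample size degenerates
    if ess < N / 2:
        idx = rng.choice(N, size=N, p=w)
        x, log_w = x[idx], np.zeros(N)
    # One random-walk Metropolis move per particle, invariant for the tempered density
    prop = x + 0.3 * rng.standard_normal((N, d))
    log_pi  = (1 - b) * log_prior(x)    + b * log_target(x)
    log_pip = (1 - b) * log_prior(prop) + b * log_target(prop)
    accept = np.log(rng.random(N)) < (log_pip - log_pi)
    x[accept] = prop[accept]

w = np.exp(log_w - log_w.max()); w /= w.sum()
print("final ESS:", 1.0 / np.sum(w ** 2), "of", N)
```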


SIAM Journal on Applied Mathematics | 1998

Convergence of a branching particle method to the solution of the Zakai equation

Dan Crisan; Jessica G. Gaines; Terry Lyons

We construct a sequence of branching particle systems U_n convergent in distribution to the solution of the Zakai equation. The algorithm based on this result can be used to solve the filtering problem numerically. The result is an improvement of the one presented in a recent paper [D. Crisan and T. Lyons, Probab. Theory Related Fields, 109 (1997), pp. 217–244], because it eliminates the extra degree of randomness introduced there.
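
As an illustration of the branching idea, here is a sketch of one common offspring rule (floor-plus-Bernoulli allocation) used in branching particle approximations; it is given only to convey the mechanism and is not claimed to be the exact construction of the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

def branch(particles, weights):
    """Replace each particle by an integer number of offspring whose mean is
    N times its normalized weight, using a floor-plus-Bernoulli rule.  This
    keeps the expected empirical measure unchanged while using only integer
    particle counts.  (Illustrative rule; not necessarily the exact mechanism
    of the paper, whose construction also controls the total population.)"""
    N = len(particles)
    mean_offspring = N * weights / weights.sum()
    n_floor = np.floor(mean_offspring).astype(int)
    frac = mean_offspring - n_floor
    n_offspring = n_floor + (rng.random(N) < frac)   # Bernoulli(frac) extra offspring
    return np.repeat(particles, n_offspring)

# Tiny usage example with assumed particles and weights
parts = np.array([0.1, 0.5, 1.2, -0.3])
wts = np.array([0.1, 0.4, 0.45, 0.05])
print(branch(parts, wts))
```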


Stochastics: An International Journal of Probability and Stochastic Processes | 2010

Approximate McKean–Vlasov representations for a class of SPDEs

Dan Crisan; Jie Xiong

The solution of a class of linear stochastic partial differential equations is approximated using Clark's robust representation approach. The ensuing approximations are shown to coincide with the time marginals of solutions of a certain McKean–Vlasov type equation. We prove existence and uniqueness of the solution of the McKean–Vlasov equation.
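
For readers unfamiliar with the term, a McKean–Vlasov equation is a stochastic differential equation whose coefficients depend on the law of its own solution; the generic form is sketched below (this is not the specific representation constructed in the paper):

```latex
% Generic form of a McKean--Vlasov SDE (illustrative; not the specific
% representation constructed in the paper): the drift and diffusion
% coefficients depend on the law \mu_t of the solution itself.
dX_t = b(X_t, \mu_t)\,dt + \sigma(X_t, \mu_t)\,dW_t,
\qquad \mu_t = \mathrm{Law}(X_t).
```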


SIAM Journal on Financial Mathematics | 2012

Solving Backward Stochastic Differential Equations Using the Cubature Method: Application to Nonlinear Pricing

Dan Crisan; Konstantinos Manolarakis

We are concerned with the numerical solution of a class of backward stochastic differential equations (BSDEs), where the terminal condition is a function of $X_T$, where $X=\{X_t, t\in [0,T]\}$ …
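
As background, numerical schemes for BSDEs usually work backwards in time from the terminal condition and approximate conditional expectations by some quadrature rule, cubature being one such choice; a standard backward Euler sketch in conventional BSDE notation (the driver f, terminal function Φ, and time grid are generic, not taken from the paper):

```latex
% Standard backward Euler discretization of a BSDE
%   Y_t = \Phi(X_T) + \int_t^T f(s, X_s, Y_s, Z_s)\,ds - \int_t^T Z_s\,dW_s
% on a grid 0 = t_0 < \dots < t_n = T, with \Delta_i = t_{i+1} - t_i and
% \Delta W_i = W_{t_{i+1}} - W_{t_i}.  Cubature is one way of approximating
% the conditional expectations \mathbb{E}_i[\,\cdot\,] = \mathbb{E}[\,\cdot \mid \mathcal{F}_{t_i}].
Y_{t_n} = \Phi(X_{t_n}), \qquad
Z_{t_i} \approx \frac{1}{\Delta_i}\,\mathbb{E}_i\!\left[ Y_{t_{i+1}}\,\Delta W_i \right], \qquad
Y_{t_i} \approx \mathbb{E}_i\!\left[ Y_{t_{i+1}} \right]
        + \Delta_i\, f\!\left(t_i, X_{t_i}, Y_{t_i}, Z_{t_i}\right).
```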


Monte Carlo Methods and Applications | 2002

Minimal Entropy Approximations and Optimal Algorithms

Dan Crisan; Terry Lyons


Advances in Applied Probability | 2014

Error bounds and normalising constants for sequential Monte Carlo samplers in high dimensions

Alexandros Beskos; Dan Crisan; Ajay Jasra; Nick Whiteley


Bernoulli | 2014

Particle-kernel estimation of the filter density in state-space models

Dan Crisan; Joaquín Míguez


Collaboration


Dive into Dan Crisan's collaborations.

Top Co-Authors

Ajay Jasra

National University of Singapore

Jie Xiong

University of Tennessee

François Delarue

University of Nice Sophia Antipolis

Daniel Paulin

National University of Singapore
