
Publication


Featured research published by James MacLaurin.


SIAM Journal on Applied Dynamical Systems | 2016

A General Framework for Stochastic Traveling Waves and Patterns, with Application to Neural Field Equations

James Inglis; James MacLaurin

In this paper we present a general framework in which to rigorously study the effect of spatio-temporal noise on traveling waves and stationary patterns. In particular, the framework can incorporate versions of the stochastic neural field equation that may exhibit traveling fronts, pulses, or stationary patterns. To do this, we first formulate a local SDE that describes the position of the stochastic wave up until a discontinuity time, at which point the position of the wave may jump. We then study the local stability of this stochastic front, obtaining a result that recovers a well-known deterministic result in the small-noise limit. We finish with a study of the long-time behavior of the stochastic wave.
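The flavor of this setup can be sketched numerically with a toy discretization of a noisy neural field equation; the kernel, firing-rate function, and parameters below are illustrative choices, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy discretized stochastic neural field:
#   du/dt = -u + (w * f(u)) + sigma * noise,
# with an exponential synaptic kernel w and a steep sigmoid firing rate f.
n, dx, dt, steps = 400, 0.1, 0.01, 500
x = (np.arange(n) - n // 2) * dx
w = 0.5 * np.exp(-np.abs(x))                 # kernel with unit integral
f = lambda u: 1.0 / (1.0 + np.exp(-20.0 * (u - 0.3)))
sigma = 0.02

u = (x < 0).astype(float)                    # initial step profile (a front)
front = []
for _ in range(steps):
    conv = dx * np.convolve(f(u), w, mode="same")
    u += dt * (-u + conv) + sigma * np.sqrt(dt) * rng.standard_normal(n)
    front.append(x[np.argmax(u < 0.25)])     # crude front-position tracker
front = np.array(front)
```

The sequence `front` is a crude analogue of the wave-position process studied in the paper; in the rigorous framework the position is defined via a local SDE rather than a level crossing.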


Entropy | 2015

Asymptotic Description of Neural Networks with Correlated Synaptic Weights

Olivier D. Faugeras; James MacLaurin

We study the asymptotic law of a network of interacting neurons when the number of neurons becomes infinite. Given a completely connected network of neurons in which the synaptic weights are Gaussian correlated random variables, we describe the asymptotic law of the network when the number of neurons goes to infinity. We introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network of neurons and the averaged law (with respect to the synaptic weights) of the trajectories of the solutions to the equations of the network of neurons. The main result of this article is that the image law through the empirical measure satisfies a large deviation principle with a good rate function which is shown to have a unique global minimum. Our analysis of the rate function also allows us to characterize the limit measure as the image of a stationary Gaussian measure defined on a transformed set of trajectories.
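A large deviation principle of the kind referred to here has the standard two-sided form below; the notation (empirical measure \(\hat{\mu}_N\), rate function \(H\)) follows the abstract and is schematic, not the paper's exact statement.

```latex
% Schematic LDP for the empirical measure \hat{\mu}_N of an N-neuron network:
\limsup_{N\to\infty} \tfrac{1}{N}\log \mathbb{P}\!\left(\hat{\mu}_N \in F\right)
  \;\le\; -\inf_{\mu\in F} H(\mu) \quad \text{($F$ closed)},
\qquad
\liminf_{N\to\infty} \tfrac{1}{N}\log \mathbb{P}\!\left(\hat{\mu}_N \in G\right)
  \;\ge\; -\inf_{\mu\in G} H(\mu) \quad \text{($G$ open)}.
```

Goodness of the rate function (compact sub-level sets) together with a unique global minimizer \(\mu^*\) then forces \(\hat{\mu}_N \to \mu^*\).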


Entropy | 2014

A Large Deviation Principle and an Expression of the Rate Function for a Discrete Stationary Gaussian Process

Olivier D. Faugeras; James MacLaurin

We prove a large deviation principle for a stationary Gaussian process over R^b, indexed by Z^d (for some positive integers d and b), with positive definite spectral density, and provide an expression of the corresponding rate function in terms of the mean of the process and its spectral density. This result is useful in applications where such an expression is needed.
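For reference, the spectral density of a stationary process indexed by Z^d is the Fourier series of its covariance function (standard definition; C(k) and f(θ) are b×b matrices in the R^b-valued case):

```latex
f(\theta) \;=\; \sum_{k\in\mathbb{Z}^d} C(k)\, e^{-\mathrm{i}\,k\cdot\theta},
\qquad \theta\in[-\pi,\pi]^d,
\qquad C(k) \;=\; \operatorname{Cov}\!\left(X_j,\, X_{j+k}\right).
```

The positive definiteness assumed above means f(θ) is a positive definite matrix for every θ.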


arXiv: Probability | 2018

On uniform propagation of chaos

Jamil Salhi; James MacLaurin; Salwa Toumi

In this paper we obtain a time-uniform propagation of chaos estimate for a system of interacting diffusion processes. Using a well-defined metric function h, our result guarantees a time-uniform estimate for the convergence of a class of interacting stochastic differential equations towards their mean field limit, under conditions that ensure that the decay associated to the internal dynamics term dominates the interaction and noise terms. Our result should have diverse applications, particularly in neuroscience, and allows for models more elaborate than the one of Wilson and Cowan. In particular, the internal dynamics need not be that of linear decay.
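The flavor of such a mean-field limit can be checked numerically. The sketch below uses a linear interaction, much simpler than the paper's setting, so that the limiting mean is available in closed form; all parameter names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# N interacting diffusions with linear mean-field coupling:
#   dX_i = (-lam * X_i + (K/N) * sum_j X_j) dt + sig dW_i.
# The mean-field prediction for the mean is m(t) = m0 * exp((K - lam) * t).
N, lam, K, sig = 2000, 2.0, 0.5, 0.5
dt, steps = 0.005, 400                     # simulate up to T = 2.0
X = np.ones(N)                             # m0 = 1
for _ in range(steps):
    drift = -lam * X + K * X.mean()
    X += drift * dt + sig * np.sqrt(dt) * rng.standard_normal(N)

m_exact = np.exp((K - lam) * 2.0)          # mean-field mean at t = T
emp = X.mean()                             # empirical mean of the N particles
```

Here the internal decay rate `lam` dominates the coupling `K`, mirroring the paper's condition that the internal dynamics dominate the interaction and noise terms.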


Chaos | 2018

A variational method for analyzing limit cycle oscillations in stochastic hybrid systems

Paul C. Bressloff; James MacLaurin

Many systems in biology can be modeled by ordinary differential equations whose right-hand sides switch between a discrete set of states according to a Markov jump process; the resulting process is known as a stochastic hybrid system or piecewise deterministic Markov process (PDMP). In the fast switching limit, the dynamics converge to a deterministic ODE. In this paper, we develop a phase reduction method for stochastic hybrid systems that support a stable limit cycle in the deterministic limit. A classic example is the Morris-Lecar model of a neuron, where the switching Markov process is the number of open ion channels and the continuous process is the membrane voltage. We outline a variational principle for the phase reduction, yielding an exact analytic expression for the resulting phase dynamics. We demonstrate that this decomposition is accurate over timescales that are exponential in the switching rate ε^(-1): we show that, for a constant C, the probability that the time to leave an O(a) neighborhood of the limit cycle is less than T scales as T exp(-Ca/ε).
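A minimal PDMP makes the fast-switching limit concrete. The two-state toy below is far simpler than the Morris-Lecar example (no limit cycle, just a fixed point), and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two-state PDMP: dx/dt = a[s] - x, with s in {0, 1} flipping at rate 1/eps.
# For fast switching the path tracks the averaged ODE dx/dt = a_bar - x.
a = np.array([0.0, 2.0])
eps, T = 1e-3, 3.0
x, s, t = 0.0, 0, 0.0
while t < T:
    tau = min(rng.exponential(eps), T - t)   # holding time in state s
    x = a[s] + (x - a[s]) * np.exp(-tau)     # exact flow of dx/dt = a[s] - x
    t += tau
    s = 1 - s                                # jump to the other state

a_bar = a.mean()                             # stationary average (equal rates)
x_det = a_bar * (1.0 - np.exp(-T))           # averaged ODE from x(0) = 0
```

The gap between `x` and `x_det` shrinks as `eps` decreases; the paper's exit-time result quantifies how long a trajectory stays near the deterministic limit cycle in the analogous oscillatory setting.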


Journal of Mathematical Neuroscience | 2018

Stochastic Hybrid Systems in Cellular Neuroscience

Paul C. Bressloff; James MacLaurin

We review recent work on the theory and applications of stochastic hybrid systems in cellular neuroscience. A stochastic hybrid system or piecewise deterministic Markov process involves the coupling between a piecewise deterministic differential equation and a time-homogeneous Markov chain on some discrete space. The latter typically represents some random switching process. We begin by summarizing the basic theory of stochastic hybrid systems, including various approximation schemes in the fast switching (weak noise) limit. In subsequent sections, we consider various applications of stochastic hybrid systems, including stochastic ion channels and membrane voltage fluctuations, stochastic gap junctions and diffusion in randomly switching environments, and intracellular transport in axons and dendrites. Finally, we describe recent work on phase reduction methods for stochastic hybrid limit cycle oscillators.


Stochastics and Dynamics | 2017

Mean field dynamics of a Wilson–Cowan neuronal network with nonlinear coupling term

James MacLaurin; Jamil Salhi; Salwa Toumi

In this paper we prove the propagation of chaos property for an ensemble of interacting neurons subject to independent Brownian noise. The propagation of chaos property means that in the large network size limit, the neurons behave as if they are probabilistically independent. The model for the internal dynamics of the neurons is taken to be that of Wilson and Cowan, and we consider there to be multiple different populations. The synaptic connections are modeled with a nonlinear “electrical” model. The nonlinearity of the synaptic connections means that our model lies outside the scope of classical propagation of chaos results. We obtain the propagation of chaos result by taking advantage of the fact that the mean-field equations are Gaussian, which allows us to use Borell’s Inequality to prove that its tails decay exponentially.
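"Propagation of chaos" here has the standard meaning that any finite subset of neurons becomes asymptotically i.i.d.; schematically (standard formulation, not quoted from the paper):

```latex
\mathrm{Law}\big(X^{1,N}_t,\dots,X^{k,N}_t\big)
  \;\xrightarrow[\;N\to\infty\;]{}\; \mu_t^{\otimes k}
\qquad \text{for each fixed } k,
```

where \(\mu_t\) is the mean-field law. In this model \(\mu_t\) is Gaussian, which is what makes Borell's inequality applicable.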


Entropy | 2014

A Representation of the Relative Entropy with Respect to a Diffusion Process in Terms of Its Infinitesimal Generator

Olivier D. Faugeras; James MacLaurin

In this paper we derive an integral (with respect to time) representation of the relative entropy (or Kullback-Leibler divergence) R(µ||P), where µ and P are measures on C([0, T]; R^d). The underlying measure P is a weak solution to a martingale problem with continuous coefficients. Our representation takes the form of a time integral involving the infinitesimal generator of P. This representation is of use in statistical inference (particularly involving medical imaging). Since R(µ||P) governs the exponential rate of convergence of the empirical measure (according to Sanov's theorem), this representation is also of use in the numerical and analytical investigation of finite-size effects in systems of interacting diffusions.
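For reference, the relative entropy being represented is the standard Kullback-Leibler divergence:

```latex
R(\mu \,\|\, P) \;=\;
\begin{cases}
\displaystyle\int \log\frac{d\mu}{dP}\, d\mu, & \mu \ll P,\\[1ex]
+\infty, & \text{otherwise},
\end{cases}
```

and Sanov's theorem states that the empirical measure \(\hat{\mu}_N\) of N i.i.d. samples from P satisfies a large deviation principle with rate function \(R(\,\cdot\,\|P)\), so \(\mathbb{P}(\hat{\mu}_N \approx \mu)\) decays like \(\exp(-N\, R(\mu\|P))\).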


Archive | 2014

Large Deviations of an Ergodic Synchronous Neural Network with Learning

Olivier D. Faugeras; James MacLaurin


SIAM Journal on Applied Dynamical Systems | 2018

A Variational Method for Analyzing Stochastic Limit Cycle Oscillators

Paul C. Bressloff; James MacLaurin
