
Publication


Featured research published by Sarah Marzen.


Frontiers in Computational Neuroscience | 2015

Time resolution dependence of information measures for spiking neurons: scaling and universality.

Sarah Marzen; Michael R. DeWeese; James P. Crutchfield

The mutual information between stimulus and spike-train response is commonly used to monitor neural coding efficiency, but neuronal computation broadly conceived requires more refined and targeted information measures of input-output joint processes. A first step toward that larger goal is to develop information measures for individual output processes, including information generation (entropy rate), stored information (statistical complexity), predictable information (excess entropy), and active information accumulation (bound information rate). We calculate these for spike trains generated by a variety of noise-driven integrate-and-fire neurons as a function of time resolution and for alternating renewal processes. We show that their time-resolution dependence reveals coarse-grained structural properties of interspike interval statistics; e.g., τ-entropy rates that diverge less quickly than the firing rate indicate interspike interval correlations. We also find evidence that the excess entropy and regularized statistical complexity of different types of integrate-and-fire neurons are universal in the continuous-time limit in the sense that they do not depend on mechanism details. This suggests a surprising simplicity in the spike trains generated by these model neurons. Interestingly, neurons with gamma-distributed ISIs and neurons whose spike trains are alternating renewal processes do not fall into the same universality class. These results lead to two conclusions. First, the dependence of information measures on time resolution reveals mechanistic details about spike train generation. Second, information measures can be used as model selection tools for analyzing spike train processes.
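
The τ-dependence described above can be probed numerically. Below is a minimal sketch, not the paper's analytic method: it bins a toy Poisson spike train at resolution τ and forms a naive plug-in estimate of the τ-entropy rate from block-entropy differences. The binning scheme, the word length L, and the Poisson toy model are all illustrative assumptions.

```python
# Naive plug-in estimate of the tau-entropy rate of a spike train.
import numpy as np
from collections import Counter

def binarize(spike_times, tau, t_max):
    """Binary sequence at resolution tau (at most one spike per bin assumed)."""
    counts, _ = np.histogram(spike_times, np.arange(0.0, t_max + tau, tau))
    return (counts > 0).astype(np.uint8)

def block_entropy(seq, L):
    """Shannon entropy (bits) of the empirical length-L word distribution."""
    words = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    p = np.array(list(words.values()), dtype=float)
    p /= p.sum()
    return -(p * np.log2(p)).sum()

def entropy_rate(seq, L=8):
    """h(tau) ~ H(L) - H(L - 1), in bits per bin."""
    return block_entropy(seq, L) - block_entropy(seq, L - 1)

# Toy Poisson spike train at 20 Hz observed for about 200 s.
rng = np.random.default_rng(0)
times = np.cumsum(rng.exponential(1.0 / 20.0, size=4000))
for tau in (0.001, 0.005, 0.02):
    h = entropy_rate(binarize(times, tau, times[-1]))
    print(f"tau = {tau:.3f} s: {h:.4f} bits/bin, {h / tau:.1f} bits/s")
```

Plug-in block-entropy estimates are biased for short recordings and long words; the paper's scaling results rest on analytic expressions, so treat this only as a way to visualize the τ-dependence.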


Entropy | 2014

Information Anatomy of Stochastic Equilibria

Sarah Marzen; James P. Crutchfield

A stochastic nonlinear dynamical system generates information, as measured by its entropy rate. Some (the ephemeral information) is dissipated and some (the bound information) is actively stored and so affects future behavior. We derive analytic expressions for the ephemeral and bound informations in the limit of small-time discretization for two classical systems that exhibit dynamical equilibria: first-order Langevin equations (i) where the drift is the gradient of a potential function and the diffusion matrix is invertible and (ii) with a linear drift term (Ornstein-Uhlenbeck) but a noninvertible diffusion matrix. In both cases, the bound information is sensitive only to the drift, while the ephemeral information is sensitive only to the diffusion matrix and not to the drift. Notably, this information anatomy changes discontinuously as any of the diffusion coefficients vanishes, indicating that it is very sensitive to the noise structure. We then calculate the information anatomy of the stochastic cusp catastrophe and of particles diffusing in a heat bath in the overdamped limit, both examples of stochastic gradient descent on a potential landscape. Finally, we use our methods to calculate and compare approximations for the so-called time-local predictive information for adaptive agents.
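
For intuition about the systems analyzed, here is a minimal Euler-Maruyama sketch, assuming an overdamped Langevin equation with gradient drift, dx = -V'(x) dt + sqrt(2D) dW, and the cusp potential V(x) = x^4/4 - a x^2/2 - b x. Parameter values and step sizes are illustrative, not taken from the paper.

```python
import numpy as np

def simulate_langevin(grad_V, D=0.5, dt=1e-3, n_steps=200_000, x0=0.0, seed=1):
    """Euler-Maruyama integration of dx = -V'(x) dt + sqrt(2 D) dW."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps)
    x[0] = x0
    noise = rng.normal(0.0, np.sqrt(2.0 * D * dt), size=n_steps - 1)
    for i in range(n_steps - 1):
        x[i + 1] = x[i] - grad_V(x[i]) * dt + noise[i]
    return x

a, b = 1.0, 0.1  # cusp parameters (illustrative values)
traj = simulate_langevin(lambda x: x**3 - a * x - b)  # V'(x) for the cusp
# For gradient drift the stationary density is proportional to exp(-V(x)/D),
# so the histogram of `traj` concentrates in the potential's wells.
print("mean:", traj.mean(), " fraction in right well:", (traj > 0).mean())
```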


Entropy | 2015

Informational and Causal Architecture of Discrete-Time Renewal Processes

Sarah Marzen; James P. Crutchfield

Renewal processes are broadly used to model stochastic behavior consisting of isolated events separated by periods of quiescence, whose durations are specified by a given probability law. Here, we identify the minimal sufficient statistic for their prediction (the set of causal states), calculate the historical memory capacity required to store those states (statistical complexity), delineate what information is predictable (excess entropy), and decompose the entropy of a single measurement into that shared with the past, future, or both. The causal state equivalence relation defines a new subclass of renewal processes with a finite number of causal states despite having an unbounded interevent count distribution. We use these formulae to analyze the output of the parametrized Simple Nonunifilar Source, generated by a simple two-state hidden Markov model, but with an infinite-state epsilon-machine presentation. All in all, the results lay the groundwork for analyzing processes with infinite statistical complexity and infinite excess entropy.
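
The quantities named in this abstract have closed forms in the standard causal-state picture, in which the count since the last event is the causal state. The sketch below implements those textbook formulas for a finite-support interevent distribution; it is my paraphrase of the construction, not code from the paper, and it assumes every count is a distinct causal state (true when the hazard rates all differ).

```python
import numpy as np

def binary_entropy(q):
    q = np.clip(q, 1e-15, 1 - 1e-15)
    return -(q * np.log2(q) + (1 - q) * np.log2(1 - q))

def renewal_info(p):
    """p[k] = P(interevent interval = k + 1); finite support assumed."""
    p = np.asarray(p, dtype=float)
    p /= p.sum()
    mu = (np.arange(1, len(p) + 1) * p).sum()              # mean interval
    S = np.concatenate(([1.0], 1.0 - np.cumsum(p)[:-1]))   # S[n] = P(T > n)
    pi = S / mu                   # stationary distribution over counts n
    lam = p / S                   # hazard: P(event now | count = n)
    C_mu = -(pi * np.log2(pi)).sum()          # statistical complexity (bits)
    h = (pi * binary_entropy(lam)).sum()      # entropy rate (bits/symbol)
    return C_mu, h

# Truncated geometric interval distribution (truncation makes the hazard
# rates distinct, so no causal states collapse).
C, h = renewal_info([0.5 * 0.5**k for k in range(10)])
print(f"C_mu = {C:.3f} bits, h = {h:.3f} bits/symbol")
```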


Bulletin of the American Physical Society | 2017

The evolution of lossy compression

Sarah Marzen; Simon DeDeo

In complex environments, there are costs to both ignorance and perception. An organism needs to track fitness-relevant information about its world, but the more information it tracks, the more resources it must devote to perception. As a first step towards a general understanding of this trade-off, we use a tool from information theory, rate–distortion theory, to study large, unstructured environments with fixed, randomly drawn penalties for confusing stimuli (‘distortions’). We identify two distinct regimes for organisms in these environments: a high-fidelity regime where perceptual costs grow linearly with environmental complexity, and a low-fidelity regime where perceptual costs are, remarkably, independent of the number of environmental states. This suggests that in environments of rapidly increasing complexity, well-adapted organisms will find themselves able to make, just barely, the most subtle distinctions in their environment.
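
Rate-distortion curves for this kind of setup can be traced numerically with the classic Blahut-Arimoto algorithm. The sketch below assumes a uniform source and a randomly drawn distortion matrix, loosely mirroring the paper's "fixed, randomly drawn penalties"; the sizes, beta values, and tolerances are all illustrative choices of mine.

```python
import numpy as np

def blahut_arimoto(d, beta, n_iter=500, tol=1e-10):
    """One point on R(D): d[x, xhat] is the distortion matrix."""
    n, m = d.shape
    p_x = np.full(n, 1.0 / n)       # uniform source over environmental states
    q_xhat = np.full(m, 1.0 / m)    # reproduction marginal, updated below
    for _ in range(n_iter):
        # Optimal channel for the current marginal: q(xhat|x) ~ q(xhat) e^{-beta d}.
        log_w = np.log(q_xhat)[None, :] - beta * d
        log_w -= log_w.max(axis=1, keepdims=True)   # numerical stability
        q_cond = np.exp(log_w)
        q_cond /= q_cond.sum(axis=1, keepdims=True)
        new_q = p_x @ q_cond
        if np.abs(new_q - q_xhat).max() < tol:
            q_xhat = new_q
            break
        q_xhat = new_q
    D = (p_x[:, None] * q_cond * d).sum()
    R = (p_x[:, None] * q_cond
         * (np.log2(np.maximum(q_cond, 1e-300)) - np.log2(q_xhat)[None, :])).sum()
    return R, D

rng = np.random.default_rng(42)
d = rng.random((64, 64))        # random penalties for confusing stimuli
np.fill_diagonal(d, 0.0)        # correct identification costs nothing
for beta in (0.5, 2.0, 8.0):    # loosely: low beta ~ low fidelity, high ~ high
    R, D = blahut_arimoto(d, beta)
    print(f"beta = {beta:4.1f}: R = {R:.3f} bits, D = {D:.3f}")
```

Sweeping beta traces out the whole R(D) curve; the two regimes in the abstract correspond, loosely, to its high- and low-fidelity ends.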


Journal of Statistical Physics | 2017

Structure and Randomness of Continuous-Time Discrete-Event Processes

Sarah Marzen; James P. Crutchfield

Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process’ intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models (memoryful, state-dependent versions of renewal processes). Calculating these quantities requires introducing novel mathematical objects (ϵ-machines of hidden semi-Markov processes) and applying new information-theoretic methods to stochastic processes.
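
As a concrete picture of the model class, here is a minimal sketch that generates a trajectory from a two-state unifilar hidden semi-Markov model: state-dependent dwell-time distributions with deterministic state transitions. The states, symbols, and dwell distributions are invented for illustration; the definitions are my paraphrase, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(7)

# Each hidden state emits one symbol for a random dwell time drawn from its
# own distribution, then transitions. Because the next state is fixed given
# the current one, the model is unifilar.
dwell = {"A": lambda: rng.gamma(2.0, 0.5),    # non-exponential: semi-Markov
         "B": lambda: rng.exponential(1.0)}
next_state = {"A": "B", "B": "A"}
symbol = {"A": 0, "B": 1}

def generate(t_max, state="A"):
    """Emit (symbol, duration) pairs until total time exceeds t_max."""
    t, events = 0.0, []
    while t < t_max:
        tau = dwell[state]()
        events.append((symbol[state], tau))
        t += tau
        state = next_state[state]
    return events

print(generate(10.0)[:5])
```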


European Signal Processing Conference | 2015

Exploring discrete approaches to lossy compression schemes for natural image patches

Ram Mehta; Sarah Marzen; Christopher J. Hillar

Optimal compressions in a rate-distortion sense are usually discrete random variables, so clever discretizations of natural images might be key to developing better compression schemes. A new image compression method achieved good perceptual coding performance by using as primitives memories of a Hopfield network trained on discretized natural images. Here we explore why Hopfield network fixed-points are good lossy perceptual features even though the implied generative model (a second-order Lenz-Ising model) does not provide a state-of-the-art match to the true probability distribution of discretized natural images. Even so, we demonstrate that this deterministic coding scheme can achieve near-optimality by comparing with the rate-distortion function for discretized natural image patches.
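
The coding scheme described above can be illustrated with a toy Hopfield network: Hebbian weights store patterns, and asynchronous updates map an input to a nearby fixed point, which acts as its discrete codeword. This is a generic sketch of the idea, not the authors' implementation; the pattern sizes and noise level are arbitrary.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian outer-product rule; `patterns` is a (P, N) array of +/-1."""
    P, N = patterns.shape
    W = patterns.T @ patterns / P
    np.fill_diagonal(W, 0.0)
    return W

def encode(W, x, n_sweeps=20):
    """Asynchronous updates until the state stops changing (a fixed point)."""
    s = x.copy()
    for _ in range(n_sweeps):
        changed = False
        for i in range(len(s)):
            new = 1 if W[i] @ s >= 0 else -1
            if new != s[i]:
                s[i], changed = new, True
        if not changed:
            break
    return s  # the fixed point is the codeword for x

rng = np.random.default_rng(3)
memories = rng.choice([-1, 1], size=(5, 64))  # stand-ins for learned patch codes
W = train_hopfield(memories)
noisy = memories[0] * rng.choice([1, -1], p=[0.9, 0.1], size=64)  # flip ~10% of bits
print("recovered memory 0:", np.array_equal(encode(W, noisy), memories[0]))
```

With far fewer stored patterns than units, the dynamics should usually map a noisy input back to the stored pattern, which is what makes fixed points usable as lossy codewords.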


Entropy | 2018

Intrinsic Computation of a Monod-Wyman-Changeux Molecule

Sarah Marzen



Data Compression Conference | 2017

Revisiting Perceptual Distortion for Natural Images: Mean Discrete Structural Similarity Index

Christopher J. Hillar; Sarah Marzen


Journal of Molecular Biology | 2014

Corrigendum to "Statistical Mechanics of Monod-Wyman-Changeux" (J Mol Biol 425 (9) (May 13 2013) 1433-1460)

Sarah Marzen; Hernan G. Garcia; Rob Phillips



BMC Neuroscience | 2013

How efficient coding of binocular disparity statistics in the primary visual cortex influences eye rotation strategy

Sarah Marzen; Joel Zylberberg; Michael R. DeWeese


Collaboration


Dive into Sarah Marzen's collaborations.

Top Co-Authors

Rob Phillips

California Institute of Technology


Dowman P. Varn

University of California


Mike DeWeese

University of California


Ram Mehta

University of California
