Publication


Featured research published by David P. Feldman.


Chaos | 2003

Regularities Unseen, Randomness Observed: Levels of Entropy Convergence

James P. Crutchfield; David P. Feldman

We study how the Shannon entropy of sequences produced by an information source converges to the source's entropy rate. We synthesize several phenomenological approaches to applying information theoretic measures of randomness and memory to stochastic and deterministic processes by using successive derivatives of the Shannon entropy growth curve. This leads, in turn, to natural measures of apparent memory stored in a source and the amounts of information that must be extracted from observations of a source in order for it to be optimally predicted and for an observer to synchronize to it. To measure the difficulty of synchronization, we define the transient information and prove that, for Markov processes, it is related to the total uncertainty experienced while synchronizing to a process. One consequence of ignoring a process's structural properties is that the missed regularities are converted to apparent randomness. We demonstrate that this problem arises particularly for settings where one has access only to short measurement sequences. Numerically and analytically, we determine the Shannon entropy growth curve, and related quantities, for a range of stochastic and deterministic processes. We conclude by looking at the relationships between a process's entropy convergence behavior and its underlying computational structure.
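The entropy growth curve and its successive differences described in this abstract can be illustrated numerically. The following is a minimal Python sketch (not the authors' code, and the function names are my own): it estimates block entropies H(L) from a finite symbol sequence and forms the length-L entropy-rate estimates h(L) = H(L) − H(L−1), whose convergence to the true entropy rate is what the paper analyzes.

```python
import math
from collections import Counter

def block_entropy(seq, L):
    """Shannon entropy (bits) of the length-L blocks observed in seq."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def entropy_rate_estimates(seq, max_L):
    """h(L) = H(L) - H(L-1): successive finite-L estimates of the entropy rate."""
    H = [0.0] + [block_entropy(seq, L) for L in range(1, max_L + 1)]
    return [H[L] - H[L - 1] for L in range(1, max_L + 1)]

# Example: a period-2 sequence. The estimate starts at 1 bit and drops to
# (essentially) zero once one symbol of context is available.
print(entropy_rate_estimates([0, 1] * 500, 4))
```

For genuinely random data the estimates stay near 1 bit at every L; how quickly, and along what curve, h(L) falls toward its asymptote is the structural signature the paper studies.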


Physical Review E | 1997

Statistical Complexity of Simple 1D Spin Systems

James P. Crutchfield; David P. Feldman

We present exact results for two complementary measures of spatial structure generated by 1D spin systems with finite-range interactions. The first, excess entropy, measures the apparent spatial memory stored in configurations. The second, statistical complexity, measures the amount of memory needed to optimally predict the chain of spin values in configurations. These statistics capture distinct properties and are different from existing thermodynamic quantities.


Physical Review E | 2003

Structural information in two-dimensional patterns: Entropy convergence and excess entropy

David P. Feldman; James P. Crutchfield

We develop information-theoretic measures of spatial structure and pattern in more than one dimension. As is well known, the entropy density of a two-dimensional configuration can be efficiently and accurately estimated via a converging sequence of conditional entropies. We show that the manner in which these conditional entropies converge to their asymptotic value serves as a measure of global correlation and structure for spatial systems in any dimension. We compare and contrast entropy convergence with mutual-information and structure-factor techniques for quantifying and detecting spatial structure.
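A conditional-entropy estimate of the kind mentioned in this abstract can be sketched for a binary 2D configuration stored as a list of lists. This is an illustration only, with hypothetical names: the neighborhood here is a small window of raster-order predecessors within range r, whereas the paper's converging sequence uses templates that grow toward a half-plane.

```python
import math
from collections import Counter

def conditional_entropy_2d(grid, r):
    """H(site | raster-order predecessors within range r), in bits.
    Simplified neighborhood; an illustration, not the paper's estimator."""
    n, m = len(grid), len(grid[0])
    # Offsets of sites that precede (0, 0) in raster order, within the window.
    offsets = [(di, dj) for di in range(-r, 1) for dj in range(-r, r + 1)
               if (di, dj) < (0, 0)]
    joint, ctx = Counter(), Counter()
    for i in range(r, n):
        for j in range(r, m - r):
            c = tuple(grid[i + di][j + dj] for di, dj in offsets)
            ctx[c] += 1
            joint[c + (grid[i][j],)] += 1
    total = sum(ctx.values())
    # H(X | C) = sum over (c, x) of p(c, x) * log2(p(c) / p(c, x))
    return sum((joint[k] / total) * math.log2(ctx[k[:-1]] / joint[k])
               for k in joint)

# Example: horizontal stripes are fully determined by the row above,
# so the conditional entropy is zero even at r = 1.
stripes = [[i % 2] * 8 for i in range(8)]
print(conditional_entropy_2d(stripes, 1))
```

As r grows this quantity decreases toward the entropy density, and (per the abstract) the manner of that convergence is itself a measure of spatial structure.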


Advances in Complex Systems | 2001

Synchronizing to the Environment: Information Theoretic Constraints on Agent Learning

James P. Crutchfield; David P. Feldman

Using an information-theoretic framework, we examine how an intelligent agent, given an accurate model of its environment, synchronizes to the environment — i.e., comes to know in which state the environment is. We show that the total uncertainty experienced by the agent during the process is closely related to the transient information, a new quantity that captures the manner in which the environment's entropy growth curve converges to its asymptotic form. We also discuss how an agent's estimates of its environment's structural properties are related to its estimate of the environment's entropy rate. If structural properties are ignored, the missed regularities are converted to apparent randomness. Conversely, using representations that assume too much memory results in false predictability.


Advances in Complex Systems | 2004

Synchronizing to Periodicity: The Transient Information and Synchronization Time of Periodic Sequences

David P. Feldman; James P. Crutchfield

We analyze how difficult it is to synchronize to a periodic sequence whose structure is known, when an observer is initially unaware of the sequence's phase. We examine the transient information T, a recently introduced information-theoretic quantity that measures the uncertainty an observer experiences while synchronizing to a sequence. We also consider the synchronization time τ, which is the average number of measurements required to infer the phase of a periodic signal. We calculate T and τ for all periodic sequences up to and including period 23. We show which sequences of a given period have the maximum and minimum possible T and τ values, develop analytic expressions for the extreme values, and show that in these cases the transient information is the product of the total phase information and the synchronization time. Despite the latter result, our analyses demonstrate that the transient information and synchronization time capture different and complementary structural properties of individual periodic sequences — properties, moreover, that are distinct from source entropy rate and mutual information measures, such as the excess entropy.
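For a periodic sequence given as one full period, both quantities discussed in this abstract can be computed directly. The sketch below uses my own helper names and one common convention (phase uniformly distributed; H(L) the block entropy over phases; T the sum of the gaps log2(p) − H(L) over block lengths L ≥ 0); the paper's exact definitions may differ in such conventions.

```python
import math
from collections import Counter

def block_entropy_over_phases(s, L):
    """Entropy (bits) of length-L words read from the p phases of periodic s."""
    p = len(s)
    words = Counter(tuple(s[(i + j) % p] for j in range(L)) for i in range(p))
    return -sum((c / p) * math.log2(c / p) for c in words.values())

def transient_information(s):
    """T = sum over L of [log2(p) - H(L)]; finite, since H(L) saturates."""
    p = len(s)
    E, T, L = math.log2(p), 0.0, 0
    while True:
        gap = E - block_entropy_over_phases(s, L)
        if gap < 1e-12:
            return T
        T += gap
        L += 1

def synchronization_time(s):
    """Average word length needed to determine the phase uniquely."""
    p = len(s)
    total = 0
    for i in range(p):
        L = 0
        while True:
            w = tuple(s[(i + j) % p] for j in range(L))
            matches = sum(all(s[(k + j) % p] == w[j] for j in range(L))
                          for k in range(p))
            if matches == 1:
                break
            L += 1
        total += L
    return total / p

# Example: for the period-2 sequence, T = 1 bit and τ = 1 symbol,
# consistent with T being the product of the phase information
# (log2 2 = 1 bit) and the synchronization time in the extremal cases.
print(transient_information((0, 1)), synchronization_time((0, 1)))
```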


Chaos | 2011

Local entropy and structure in a two-dimensional frustrated system

Matthew D. Robinson; David P. Feldman; Susan R. McKay

We calculate the local contributions to the Shannon entropy and excess entropy and use these information theoretic measures as quantitative probes of the order arising from quenched disorder in the diluted Ising antiferromagnet on a triangular lattice. When one sublattice is sufficiently diluted, the system undergoes a temperature-driven phase transition, with the other two sublattices developing magnetizations of equal magnitude and opposite sign as the system is cooled [1]. The diluted sublattice has no net magnetization but exhibits spin glass ordering. The distribution of local entropies shows a dramatic broadening at low temperatures; this indicates that the system's total entropy is not shared equally across the lattice. The entropy contributions from some regions exhibit local reentrance, although the entropy of the system decreases monotonically as expected. The average excess entropy shows a sharp peak at the critical temperature, showing that the excess entropy is sensitive to the structural changes that occur as a result of the spin glass ordering.


Physics Letters A | 1998

Measures of Statistical Complexity: Why?

David P. Feldman; James P. Crutchfield


Chaos | 2008

The organization of intrinsic computation: Complexity-entropy diagrams and the diversity of natural information processing

David P. Feldman; Carl S. McTague; James P. Crutchfield


Physical Review E | 2000

Comments on "Simple Measure for Complexity"

James P. Crutchfield; David P. Feldman; Cosma Rohilla Shalizi


Archive | 2012

Chaos and Fractals: An Elementary Introduction

David P. Feldman
