Ryan G. James
University of California, Davis
Publications
Featured research published by Ryan G. James.
Physics Letters A | 2014
Ryan G. James; Korana Burke; James P. Crutchfield
The hallmark of deterministic chaos is that it creates information, the rate being given by the Kolmogorov-Sinai metric entropy. Since its introduction half a century ago, the metric entropy has been used as a unitary quantity to measure a system's intrinsic unpredictability. Here, we show that it naturally decomposes into two structurally meaningful components: a portion of the created information, the ephemeral information, is forgotten, and a portion, the bound information, is remembered. The bound information is a new kind of intrinsic computation that differs fundamentally from information creation: it measures the rate of active information storage. We show that it can be directly and accurately calculated via symbolic dynamics, revealing a hitherto unknown richness in how dynamical systems compute.
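As an illustration of this decomposition, the following is a minimal sketch (not the paper's code) that estimates the three rates from finite blocks of a symbolic time series: the entropy rate h, the ephemeral component r = H[X_0 | past, future], and the bound component b = h - r. The logistic-map parameter, partition, and block lengths are illustrative assumptions.

```python
# A minimal sketch of the decomposition h_mu = r_mu + b_mu, estimated from
# finite blocks of a symbolic time series.  The series comes from the logistic
# map with a binary partition; r = 3.9 is an arbitrary illustrative choice.
from collections import Counter
from math import log2

def logistic_symbols(n, r=3.9, x=0.4, skip=1000):
    """Binary symbolic dynamics of the logistic map (0 if x < 1/2, else 1)."""
    out = []
    for i in range(n + skip):
        x = r * x * (1.0 - x)
        if i >= skip:
            out.append(0 if x < 0.5 else 1)
    return out

def block_entropy(symbols, length):
    """Shannon entropy (bits) of the empirical distribution of length-L words."""
    counts = Counter(tuple(symbols[i:i + length])
                     for i in range(len(symbols) - length + 1))
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

def gapped_entropy(symbols, L):
    """Entropy of (L past symbols, L future symbols) with the present removed."""
    counts = Counter(tuple(symbols[i:i + L]) + tuple(symbols[i + L + 1:i + 2 * L + 1])
                     for i in range(len(symbols) - 2 * L))
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

s = logistic_symbols(200_000)
L = 6  # history/future length; finite-L values only approximate the true rates

h = block_entropy(s, L + 1) - block_entropy(s, L)       # entropy rate   h(L)
r = block_entropy(s, 2 * L + 1) - gapped_entropy(s, L)  # ephemeral      r(L)
b = h - r                                               # bound          b(L)
print(f"h ~ {h:.3f}  r ~ {r:.3f}  b ~ {b:.3f}  (bits/symbol)")
```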
Chaos | 2010
James P. Crutchfield; Christopher J. Ellison; Ryan G. James; John R. Mahoney
We adapt tools from information theory to analyze how an observer comes to synchronize with the hidden states of a finitary, stationary stochastic process. We show that synchronization is determined both by the process's internal organization and by an observer's model of it. We analyze these components using the convergence of state-block and block-state entropies, comparing them to the previously known convergence properties of the Shannon block entropy. Along the way we introduce a hierarchy of information quantifiers as derivatives and integrals of these entropies, which parallels a similar hierarchy introduced for block entropy. We also draw out the duality between synchronization properties and a process's controllability. These tools lead to a new classification of a process's alternative representations in terms of minimality, synchronizability, and unifilarity.
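The block-entropy side of this hierarchy is easy to make concrete. Below is a minimal sketch, assuming the standard Even Process as the example generator, of the "derivative" h(L) = H(L) - H(L-1) and the partial-sum "integral" E(L) = H(L) - L h(L); the state-block and block-state entropies the paper introduces additionally require tracking hidden states, which this sketch omits.

```python
# A minimal sketch of the block-entropy hierarchy: H(L) over word length L,
# its derivative h(L) = H(L) - H(L-1), and the integral E(L) = H(L) - L*h(L),
# which converges to the excess entropy for well-behaved processes.  The Even
# Process generator below is a standard example, not taken from the paper.
import random
from collections import Counter
from math import log2

def even_process(n, seed=1):
    """Sample the Even Process: 1s occur in even-length blocks bounded by 0s."""
    rng, state, out = random.Random(seed), "A", []
    for _ in range(n):
        if state == "A":
            if rng.random() < 0.5:
                out.append(0)               # stay in A
            else:
                out.append(1); state = "B"  # first 1 of a pair
        else:
            out.append(1); state = "A"      # second 1, forced
    return out

def H(symbols, L):
    counts = Counter(tuple(symbols[i:i + L]) for i in range(len(symbols) - L + 1))
    n = sum(counts.values())
    return -sum(c / n * log2(c / n) for c in counts.values())

s = even_process(500_000)
prev = 0.0
for L in range(1, 9):
    HL = H(s, L)
    h = HL - prev     # entropy-rate estimate at length L
    E = HL - L * h    # excess-entropy partial sum
    print(f"L={L}  H(L)={HL:.4f}  h(L)={h:.4f}  E(L)={E:.4f}")
    prev = HL
```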
Physical Review Letters | 2016
Ryan G. James; Nix Barnett; James P. Crutchfield
A central task in analyzing complex dynamics is to determine the loci of information storage and the communication topology of information flows within a system. Over the last decade and a half, diagnostics for the latter have come to be dominated by the transfer entropy. Via straightforward examples, we show that it and a derivative quantity, the causation entropy, do not, in fact, quantify the flow of information. At one and the same time they can overestimate flow or underestimate influence. We isolate why this is the case and propose several avenues to alternate measures for information flow. We also address an auxiliary consequence: The proliferation of networks as a now-common theoretical model for large-scale systems, in concert with the use of transfer-like entropies, has shoehorned dyadic relationships into our structural interpretation of the organization and behavior of complex systems. This interpretation thus fails to include the effects of polyadic dependencies. The net result is that much of the sophisticated organization of complex systems may go undetected.
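A minimal sketch in the spirit of the paper's examples (the specific system is our illustrative choice, not taken from the paper): for X_{t+1} = X_t XOR Y_t with Y an i.i.d. fair coin, the transfer entropy T_{Y->X} = I[X_{t+1}; Y_t | X_t] is a full bit, while the unconditioned mutual information I[X_{t+1}; Y_t] is zero. The bit is synergistic, carried jointly by X_t and Y_t, yet transfer entropy credits it entirely to Y as a dyadic flow.

```python
# Estimate T_{Y->X} = I[X_{t+1}; Y_t | X_t] from data for X_{t+1} = X_t XOR Y_t.
# TE reports a full bit "flowing" from Y to X even though that bit is
# synergistic, requiring X_t and Y_t jointly: the overestimation discussed above.
import random
from collections import Counter
from math import log2

def entropy(counts):
    n = sum(counts.values())
    return -sum(c / n * log2(c / n) for c in counts.values())

def transfer_entropy(x, y):
    """T_{Y->X} with history length 1: H[X'|X] - H[X'|X,Y], from joint counts."""
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))            # (X_{t+1}, X_t, Y_t)
    pairs_xy = Counter((xt, yt) for _, xt, yt in triples.elements())
    pairs_xx = Counter((xn, xt) for xn, xt, _ in triples.elements())
    xs = Counter(xt for _, xt, _ in triples.elements())
    h_xn_given_x = entropy(pairs_xx) - entropy(xs)
    h_xn_given_xy = entropy(triples) - entropy(pairs_xy)
    return h_xn_given_x - h_xn_given_xy

rng = random.Random(0)
y = [rng.randint(0, 1) for _ in range(100_000)]
x = [0]
for t in range(len(y) - 1):
    x.append(x[-1] ^ y[t])

mi_pairs = Counter(zip(x[1:], y[:-1]))                       # (X_{t+1}, Y_t)
mi = entropy(Counter(x[1:])) + entropy(Counter(y[:-1])) - entropy(mi_pairs)
print(f"T_Y->X ~ {transfer_entropy(x, y):.3f} bits   I[X';Y] ~ {mi:.3f} bits")
```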
Chaos | 2011
John R. Mahoney; Christopher J. Ellison; Ryan G. James; James P. Crutchfield
We investigate a stationary process's crypticity, a measure of the difference between its hidden-state information and its observed information, using the causal states of computational mechanics. Here, we motivate crypticity and cryptic order as physically meaningful quantities that monitor how hidden a hidden process is. This is done by recasting previous results on the convergence of block entropy and block-state entropy in a geometric setting, one that is more intuitive and that leads to a number of new results. For example, we connect crypticity to how an observer synchronizes to a process. We show that the block-causal-state entropy is a convex function of block length. We give a complete analysis of spin chains. We present a classification scheme that surveys stationary processes in terms of their possible cryptic and Markov orders. We illustrate related entropy convergence behaviors using a new form of foliated information diagram. Finally, along the way, we provide a variety of interpretations of crypticity and cryptic order to establish their naturalness and pervasiveness. This is also a first step in developing applications in spatially extended and network dynamical systems.
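One way to see crypticity operationally: chi(L) = H[S_0, X_{0:L}] - H[X_{0:L}] estimates H[S_0 | X_{0:L}], the observer's remaining uncertainty about the causal state given L observed symbols, and its large-L limit is the crypticity. The sketch below uses the Even Process epsilon-machine as a stand-in example; it is not one of the paper's spin-chain cases.

```python
# A minimal sketch of the block-state-entropy construction behind crypticity:
# chi(L) = H[S_0, X_{0:L}] - H[X_{0:L}] estimates H[S_0 | X_{0:L}], whose
# large-L limit is the crypticity of the process.
import random
from collections import Counter
from math import log2

def sample_even_process(n, seed=2):
    """Emit symbols from the Even Process epsilon-machine, recording the state."""
    rng, state = random.Random(seed), "A"
    states, symbols = [], []
    for _ in range(n):
        states.append(state)
        if state == "A":
            if rng.random() < 0.5:
                symbols.append(0)
            else:
                symbols.append(1); state = "B"
        else:
            symbols.append(1); state = "A"
    return states, symbols

def entropy(counts):
    n = sum(counts.values())
    return -sum(c / n * log2(c / n) for c in counts.values())

states, symbols = sample_even_process(300_000)
for L in range(1, 9):
    words = [tuple(symbols[i:i + L]) for i in range(len(symbols) - L)]
    joint = Counter(zip(states, words))                # (S_0, X_{0:L}) pairs
    chi_L = entropy(joint) - entropy(Counter(words))   # H[S_0 | X_{0:L}]
    print(f"L={L}  chi(L) = {chi_L:.4f} bits")
```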
Entropy | 2017
Ryan G. James; James P. Crutchfield
Accurately determining dependency structure is critical to discovering a system's causal organization. We recently showed that the transfer entropy fails in a key aspect of this, measuring information flow, due to its conflation of dyadic and polyadic relationships. We extend this observation to demonstrate that this is true of all such Shannon information measures when used to analyze multivariate dependencies. This has broad implications, particularly when employing information to express the organization and mechanisms embedded in complex systems, including the burgeoning efforts to combine complex network theory with information theory. Here, we do not suggest that any aspect of information theory is wrong. Rather, the vast majority of its informational measures are simply inadequate for determining the meaningful dependency structure within joint probability distributions. Therefore, such information measures are inadequate for discovering intrinsic causal relations. We close by demonstrating that such distributions exist across an arbitrary set of variables.
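The phenomenon can be reproduced in a few lines. The sketch below builds a dyadic distribution (three variables pairwise coupled through shared bits) and a triadic one (an XOR constraint plus a shared bit), of the kind the paper analyzes, and checks that their Shannon statistics coincide.

```python
# Two joint distributions, one built from dyadic (pairwise) rules and one from
# a triadic (three-way) rule, that no Shannon-type measure can tell apart.
from itertools import product
from math import log2

# Dyadic: a, b, c are fair bits; X = (a,b), Y = (b,c), Z = (c,a).
dyadic = {((a, b), (b, c), (c, a)): 1 / 8
          for a, b, c in product((0, 1), repeat=3)}

# Triadic: x ^ y ^ z = 0 and w is a shared fair bit; X = (x,w), Y = (y,w), Z = (z,w).
triadic = {((x, w), (y, w), (x ^ y, w)): 1 / 8
           for x, y, w in product((0, 1), repeat=3)}

def H(dist, idx):
    """Entropy (bits) of the marginal over the variable indices in idx."""
    marg = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in idx)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

for name, d in (("dyadic", dyadic), ("triadic", triadic)):
    print(name,
          " H(X) =", H(d, (0,)),
          " I(X;Y) =", H(d, (0,)) + H(d, (1,)) - H(d, (0, 1)),
          " H(X,Y,Z) =", H(d, (0, 1, 2)))
```

Looping H over every subset of variables shows that all entropies, and hence all mutual informations and co-informations, agree across the two distributions, even though one is purely pairwise and the other irreducibly three-way.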
Physical Review E | 2014
Ryan G. James; John R. Mahoney; Christopher J. Ellison; James P. Crutchfield
We consider two important time scales, the Markov and cryptic orders, that monitor how an observer synchronizes to a finitary stochastic process. We show how to compute these orders exactly and that they are most efficiently calculated from the ε-machine, a process's minimal unifilar model. Surprisingly, though the Markov order is a basic concept from stochastic process theory, it is not a probabilistic property of a process. Rather, it is a topological property and, moreover, it is not computable from any finite-state model other than the ε-machine. Via an exhaustive survey, we close by demonstrating that infinite Markov and infinite cryptic orders are a dominant feature in the space of finite-memory processes. We draw out the roles played in statistical mechanical spin systems by these two complementary length scales.
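Empirically, Markov order shows up as the length at which the conditional entropy h(L) = H(L) - H(L-1) plateaus. The sketch below (illustrative processes, not the paper's survey) contrasts the Golden Mean Process, which plateaus immediately, with the Even Process, which has infinite Markov order; as the paper stresses, such finite-data probes are only suggestive, and the exact orders require the ε-machine.

```python
# Probe Markov order via conditional-entropy plateaus: h(L) stops decreasing
# once L-symbol histories suffice.  Golden Mean has Markov order 1; the Even
# Process has infinite Markov order, so h(L) keeps creeping downward.
import random
from collections import Counter
from math import log2

def generate(transitions, n, seed=3):
    """Sample an edge-emitting HMM given {state: [(prob, symbol, next_state)]}."""
    rng, state, out = random.Random(seed), next(iter(transitions)), []
    for _ in range(n):
        u, acc = rng.random(), 0.0
        for p, symbol, nxt in transitions[state]:
            acc += p
            if u < acc:
                out.append(symbol); state = nxt
                break
    return out

GOLDEN_MEAN = {"A": [(0.5, 0, "B"), (0.5, 1, "A")], "B": [(1.0, 1, "A")]}
EVEN = {"A": [(0.5, 0, "A"), (0.5, 1, "B")], "B": [(1.0, 1, "A")]}

def h(symbols, L):
    def H(length):
        c = Counter(tuple(symbols[i:i + length])
                    for i in range(len(symbols) - length + 1))
        n = sum(c.values())
        return -sum(v / n * log2(v / n) for v in c.values())
    return H(L) - H(L - 1)

for name, hmm in (("golden mean", GOLDEN_MEAN), ("even", EVEN)):
    s = generate(hmm, 400_000)
    print(name, ["%.4f" % h(s, L) for L in range(1, 7)])
```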
Chaos | 2011
Christopher J. Ellison; John R. Mahoney; Ryan G. James; James P. Crutchfield; Joerg Reichardt
We study dynamical reversibility in stationary stochastic processes from an information-theoretic perspective. Extending earlier work on the reversibility of Markov chains, we focus on finitary processes with arbitrarily long conditional correlations. In particular, we examine stationary processes represented or generated by edge-emitting, finite-state hidden Markov models. Surprisingly, we find pervasive temporal asymmetries in the statistics of such stationary processes. As a consequence, the computational resources necessary to generate a process in the forward and reverse temporal directions are generally not the same. In fact, an exhaustive survey indicates that most stationary processes are irreversible. We study in detail the ensuing relations between model topology in different representations, the process's statistical properties, and its reversibility. A process's temporal asymmetry is efficiently captured using two canonical unifilar representations of the generating model, the forward-time and reverse-time ε-machines. We analyze example irreversible processes whose ε-machine representations change size under time reversal, including one which has a finite number of recurrent causal states in one direction, but an infinite number in the opposite direction. From the forward-time and reverse-time ε-machines, we are able to construct a symmetrized, but nonunifilar, generator of a process, the bidirectional machine. Using the bidirectional machine, we show how to directly calculate a process's fundamental information properties, many of which are otherwise only poorly approximated via process samples. The tools we introduce and the insights we offer provide a better understanding of the many facets of reversibility and irreversibility in stochastic processes.
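At the level of word statistics, a stationary process is reversible exactly when every word and its reversal are equiprobable. Here is a minimal sketch of that test, using a cyclically biased three-state Markov chain as an illustrative irreversible example, not one of the paper's surveyed processes.

```python
# Word-level test for irreversibility: compare the probability of each
# length-3 word with that of its reversal via a KL divergence, which is zero
# iff the length-3 statistics are time-symmetric.
import random
from collections import Counter
from math import log2

def biased_cycle(n, p=0.9, seed=4):
    """Markov chain on {0,1,2} that steps +1 (mod 3) w.p. p, else -1 (mod 3)."""
    rng, s, out = random.Random(seed), 0, []
    for _ in range(n):
        s = (s + 1) % 3 if rng.random() < p else (s - 1) % 3
        out.append(s)
    return out

def word_dist(symbols, L):
    c = Counter(tuple(symbols[i:i + L]) for i in range(len(symbols) - L + 1))
    n = sum(c.values())
    return {w: v / n for w, v in c.items()}

s = biased_cycle(300_000)
P = word_dist(s, 3)
# D[P(w) || P(reverse(w))]; unseen reversed words are skipped for simplicity.
dkl = sum(p * log2(p / P[w[::-1]]) for w, p in P.items() if w[::-1] in P)
print(f"D[forward || reversed] ~ {dkl:.3f} bits over length-3 words")
```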
Physical Review E | 2016
Pooneh M. Ara; Ryan G. James; James P. Crutchfield
Modeling a temporal process as if it is Markovian assumes that the present encodes all of a process's history. When this occurs, the present captures all of the dependency between past and future. We recently showed that if one randomly samples in the space of structured processes, this is almost never the case. So, how does the Markov failure come about? That is, how do individual measurements fail to encode the past, and how many are needed to capture the dependencies between past and future? Here, we investigate how much information can be shared between the past and the future but not reflected in the present. We quantify this elusive information, give explicit calculational methods, and outline the consequences, the most important of which is that when the present hides past-future correlation or dependency, we must move beyond sequence-based statistics and build state-based models.
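For contiguous blocks, the finite-length version of this elusive information reduces to block entropies alone: sigma(L) = I[X_{-L:0}; X_{1:L+1} | X_0] = 2 H(L+1) - H(1) - H(2L+1). A minimal sketch, using the Even Process as a convenient example of a present that hides past-future correlation:

```python
# Estimate sigma(L) = I[X_{-L:0}; X_{1:L+1} | X_0], the past-future dependency
# that the present symbol fails to capture, from block entropies of a sample.
import random
from collections import Counter
from math import log2

def even_process(n, seed=5):
    rng, state, out = random.Random(seed), "A", []
    for _ in range(n):
        if state == "A":
            if rng.random() < 0.5:
                out.append(0)
            else:
                out.append(1); state = "B"
        else:
            out.append(1); state = "A"
    return out

def H(symbols, L):
    c = Counter(tuple(symbols[i:i + L]) for i in range(len(symbols) - L + 1))
    n = sum(c.values())
    return -sum(v / n * log2(v / n) for v in c.values())

s = even_process(500_000)
for L in range(1, 7):
    sigma = 2 * H(s, L + 1) - H(s, 1) - H(s, 2 * L + 1)
    print(f"L={L}  sigma(L) ~ {sigma:.4f} bits")
```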
Chaos | 2017
Hiroshi Ashikaga; Ryan G. James
A spiral wave is a macroscopic dynamical pattern of excitable media that plays an important role in several distinct systems, including the Belousov-Zhabotinsky reaction, seizures in the brain, and lethal arrhythmia in the heart. Because spiral wave dynamics can exhibit a wide spectrum of behaviors, their precise quantification can be challenging. Here we present a hybrid geometric and information-theoretic approach to quantifying spiral wave dynamics. We demonstrate the effectiveness of our approach by applying it to numerical simulations of a two-dimensional excitable medium with different numbers and spatial patterns of spiral waves. We show that, by defining the information flow over the excitable medium, hidden coherent structures emerge that effectively quantify the information transport underlying the spiral wave dynamics. Most importantly, we find that some coherent structures become more clearly defined over a longer observation period. These findings support the validity of our approach to quantitatively characterizing spiral wave dynamics by focusing on information transport. Our approach is computationally efficient and is applicable to many excitable media of interest in distinct physical, chemical, and biological systems. It could ultimately contribute to improved therapies for clinical conditions such as seizures and cardiac arrhythmia by identifying potential targets of interventional therapies.
Chaos | 2018
Hiroshi Ashikaga; Ryan G. James
A rotor, the rotation center of spiral waves, has been proposed as a causal mechanism that maintains atrial fibrillation (AF) in humans. However, our current understanding of the causality between rotors and spiral waves remains incomplete. One approach to improving our understanding is to determine the relationship between rotors and downward causation from the macro-scale collective behavior of spiral waves to the micro-scale behavior of individual components in a cardiac system. This downward causation is quantifiable as inter-scale information flow that can be used as a surrogate for the mechanism that maintains spiral waves. We used a numerical model of a cardiac system and generated a renormalization group with system descriptions at multiple scales. We found that transfer entropy quantified the upward and downward inter-scale information flow between micro- and macro-scale descriptions of the cardiac system with spiral waves. In addition, because the spatial profiles of transfer entropy and intrinsic transfer entropy were identical, there were no synergistic effects in the system. Furthermore, inter-scale information flow significantly decreased as the description of the system became more macro-scale. Finally, downward information flow was significantly correlated with the number of rotors, but higher numbers of rotors were not necessarily associated with higher downward information flow. This finding contradicts the concept that rotors are the causal mechanism that maintains spiral waves, and may account for the conflicting evidence from clinical studies targeting rotors to eliminate AF.
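As a toy illustration of the two ingredients, multi-scale descriptions and inter-scale information flow, the sketch below coarse-grains a noisy copy dynamics by pairwise OR and estimates upward and downward transfer entropy between a macro cell and one of its micro constituents. The dynamics, block size, and history length are assumptions for illustration; the paper's cardiac model and renormalization group are far richer.

```python
# A toy, not the paper's cardiac model: build a coarser description of a
# spatio-temporal binary field, then estimate inter-scale transfer entropy
# between a macro-cell series and one of its micro constituents.
import random
from collections import Counter
from math import log2

def entropy(counts):
    n = sum(counts.values())
    return -sum(c / n * log2(c / n) for c in counts.values())

def transfer_entropy(src, dst):
    """T_{src->dst} with history length 1: H[dst'|dst] - H[dst'|dst,src]."""
    trip = Counter(zip(dst[1:], dst[:-1], src[:-1]))  # (dst_{t+1}, dst_t, src_t)
    return (entropy(Counter((a, b) for a, b, _ in trip.elements()))
            - entropy(Counter(b for _, b, _ in trip.elements()))
            - entropy(trip)
            + entropy(Counter((b, c) for _, b, c in trip.elements())))

rng = random.Random(6)
W, T = 16, 50_000
field = [[rng.randint(0, 1) for _ in range(W)]]
for _ in range(T - 1):
    prev = field[-1]
    # Toy dynamics: each cell tends to copy its left neighbor, with 10% noise.
    field.append([prev[(i - 1) % W] ^ (rng.random() < 0.1) for i in range(W)])

# Coarse-grain by pairwise OR: a macro block is "excited" if either cell is.
macro = [[max(row[2 * j], row[2 * j + 1]) for j in range(W // 2)] for row in field]

micro_series = [row[1] for row in field]   # micro cell 1 (next value copies cell 0)
macro_series = [row[0] for row in macro]   # macro cell containing cells 0 and 1
print(f"downward T(macro->micro) ~ {transfer_entropy(macro_series, micro_series):.3f} bits")
print(f"upward   T(micro->macro) ~ {transfer_entropy(micro_series, macro_series):.3f} bits")
```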