arXiv [math-ph]

Works on an information geometrodynamical approach to chaos
Carlo Cafaro∗
Department of Physics, State University of New York at Albany-SUNY, 1400 Washington Avenue, Albany, NY 12222, USA
In this paper, I propose a theoretical information-geometric framework suitable to characterize chaotic dynamical behavior of arbitrary complex systems on curved statistical manifolds. Specifically, I present an information-geometric analogue of the Zurek-Paz quantum chaos criterion of linear entropy growth and an information-geometric characterization of regular and chaotic quantum energy level statistics.
PACS numbers: 02.50.Tt - Inference methods; 02.40.Ky - Riemannian geometry; 02.50.Cw - Probability theory; 05.45.-a - Nonlinear dynamics and chaos
I. INTRODUCTION
The study of complexity [1] has created a new set of ideas on how very simple systems may give rise to very complex behaviors. Moreover, in many cases, the "laws of complexity" have been found to hold universally, caring not at all for the details of the system's constituents. Chaotic behavior is a particular case of complex behavior, and it will be the object of the present work.

In this paper I make use of the so-called Entropic Dynamics (ED) [2]. ED is a theoretical framework that arises from the combination of inductive inference (Maximum Entropy methods (ME) [3]) and Information Geometry (IG) [4]. The most intriguing question being pursued in ED stems from the possibility of deriving dynamics from purely entropic arguments. This is clearly valuable in circumstances where the microscopic dynamics may be too far removed from the phenomena of interest, such as in complex biological or ecological systems, or where it may just be unknown or perhaps even nonexistent, as in economics. It has already been shown that entropic arguments account for a substantial part of the formalism of quantum mechanics, a theory that is presumably fundamental [5]. Perhaps the fundamental theories of physics are not so fundamental; they may just be consistent, objective ways to manipulate information. Following this line of thought, I extend the applicability of ED to temporally-complex (chaotic) dynamical systems on curved statistical manifolds and identify relevant measures of chaoticity of such an information geometrodynamical approach to chaos (IGAC).

The layout of the paper is as follows. In the next Section, I give an introduction to the main features of the IGAC. In Section III, I apply my theoretical construct to three complex systems.
First, I study the chaotic behavior of an ED Gaussian model describing an arbitrary system of l degrees of freedom and show that the hyperbolicity of the non-maximally symmetric 2l-dimensional statistical manifold M_s underlying such an ED Gaussian model leads to linear information geometrodynamical entropy (IGE) growth and to exponential divergence of the Jacobi vector field intensity. An information-geometric analogue of the Zurek-Paz quantum chaos criterion of linear entropy growth and an information-geometric characterization of regular and chaotic quantum energy level statistics are presented. I emphasize that I have omitted technical details that will appear elsewhere. However, some applications of my IGAC to low-dimensional chaotic systems can be found in my previous articles [6, 7, 8, 9]. Finally, in Section IV I present my conclusions and suggest further research directions.

II. THE INFORMATION GEOMETRODYNAMICAL APPROACH TO CHAOS: GENERAL FORMALISM
The IGAC is an application of ED to complex systems of arbitrary nature. ED is a form of information-constrained dynamics built on curved statistical manifolds M_S whose elements are probability distributions {P(X|Θ)} that are in one-to-one relation with a suitable set of macroscopic statistical variables {Θ} providing a convenient parametrization of points on M_S. The set {Θ} is called the parameter space D_Θ of the system.

In what follows, I schematically outline the main points underlying the construction of an arbitrary form of entropic dynamics. First, the microstates of the system under investigation must be defined. For the sake of simplicity, I assume the system is characterized by an l-dimensional microspace with microstates {x^k}, where k = 1,..., l. These microstates can be interacting, non-interacting, or I may have no relevant information concerning their microscopic dynamics. Indeed, the main goal of an ED model is that of inferring "macroscopic predictions" in the absence of detailed knowledge of the microscopic nature of arbitrary complex systems. Once the microstates have been defined, I have to select the relevant information about such microstates. In other words, I have to select the macrospace of the system. For the sake of the argument, I assume that the microstates are Gaussian-distributed. They are defined by 2l information constraints, namely their expectation values µ^k and variances σ_k²,

\langle x^{k}\rangle \equiv \mu^{k} \quad \text{and} \quad \langle (x^{k}-\langle x^{k}\rangle)^{2}\rangle \equiv \sigma_{k}^{2}.   (1)

In addition to these information constraints, the Gaussian distribution p_k(x^k|µ^k, σ_k) of each microstate x^k must satisfy the usual normalization condition,

\int_{-\infty}^{+\infty} dx^{k}\, p_{k}(x^{k}|\mu^{k},\sigma_{k}) = 1,   (2)

where

p_{k}(x^{k}|\mu^{k},\sigma_{k}) = \left(2\pi\sigma_{k}^{2}\right)^{-1/2} \exp\!\left(-\frac{(x^{k}-\mu^{k})^{2}}{2\sigma_{k}^{2}}\right).   (3)

∗ Electronic address: [email protected]

Once the microstates have been defined and the relevant (linear or nonlinear) information constraints selected, I am left with a set of probability distributions

p(X|\Theta) = \prod_{k=1}^{l} p_{k}(x^{k}|\mu^{k},\sigma_{k})

encoding the relevant available information about the system, where X is the l-dimensional microscopic vector with components (x^1,..., x^l) and Θ is the 2l-dimensional macroscopic vector with coordinates (µ^1,..., µ^l; σ_1,..., σ_l). The set {Θ} defines the 2l-dimensional space of macrostates of the system, the statistical manifold M_S. A measure of distinguishability among macrostates is obtained by assigning a probability distribution P(X|Θ) ∈ M_S to each macrostate Θ. Assignment of a probability distribution to each state endows M_S with a metric structure. Specifically, the Fisher-Rao information metric g_{µν}(Θ) [4],

g_{\mu\nu}(\Theta) = \int dX\, p(X|\Theta)\,\partial_{\mu}\log p(X|\Theta)\,\partial_{\nu}\log p(X|\Theta),   (4)

with µ, ν = 1,..., 2l and ∂_µ = ∂/∂Θ^µ, defines a measure of distinguishability among macrostates on M_S. The statistical manifold M_S,

M_{S} = \left\{ p(X|\Theta) = \prod_{k=1}^{l} p_{k}(x^{k}|\mu^{k},\sigma_{k}) \right\},   (5)

is defined as the set of probabilities {p(X|Θ)} described above, where X ∈ R^l and Θ ∈ D_Θ = [I_µ × I_σ]^l. The parameter space D_Θ (homeomorphic to M_S) is the direct product of the parameter subspaces I_µ and I_σ, where (unless specified otherwise) I_µ = (−∞, +∞)_µ and I_σ = (0, +∞)_σ. Once M_S and D_Θ are defined, the ED formalism provides the tools to explore dynamics driven on M_S by entropic arguments. Specifically, given a known initial macrostate Θ^(initial) (probability distribution), and assuming that the system evolves to a final known macrostate Θ^(final), the possible trajectories of the system are examined in the ED approach using ME methods. I emphasize that ED can be derived from a standard principle of least action (of Jacobi type).
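For a single Gaussian microstate, the integral in (4) can be carried out exactly, giving g = diag(1/σ², 2/σ²) in the coordinates (µ, σ). As a quick numerical cross-check (an illustrative sketch, not part of the original formalism), the quadrature below reproduces these entries:

```python
import math

def fisher_metric_gaussian(mu, sigma, half_width=8.0, n=200001):
    """Evaluate g_uv = Int dx p(x) d_u log p d_v log p for a 1D Gaussian
    p(x | mu, sigma) by trapezoidal quadrature on [mu - 8 sigma, mu + 8 sigma]."""
    a = mu - half_width * sigma
    dx = 2.0 * half_width * sigma / (n - 1)
    g = [[0.0, 0.0], [0.0, 0.0]]
    for i in range(n):
        x = a + i * dx
        p = math.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2)) / (sigma * math.sqrt(2.0 * math.pi))
        d_mu = (x - mu) / sigma ** 2                         # d log p / d mu
        d_sigma = ((x - mu) ** 2 - sigma ** 2) / sigma ** 3  # d log p / d sigma
        w = dx if 0 < i < n - 1 else 0.5 * dx                # trapezoid weights
        g[0][0] += w * p * d_mu * d_mu
        g[0][1] += w * p * d_mu * d_sigma
        g[1][1] += w * p * d_sigma * d_sigma
    g[1][0] = g[0][1]
    return g

g = fisher_metric_gaussian(mu=1.3, sigma=0.7)
# Analytic result: g = diag(1/sigma^2, 2/sigma^2), vanishing off-diagonal terms.
```

The same computation, repeated for each of the l independent Gaussian factors in (5), assembles the block-diagonal 2l-dimensional metric used in Section III.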
The geodesic equations for the macrovariables of the Gaussian ED model are nonlinear, second-order, coupled ordinary differential equations,

\frac{d^{2}\Theta^{\mu}}{d\tau^{2}} + \Gamma^{\mu}_{\nu\rho}\,\frac{d\Theta^{\nu}}{d\tau}\frac{d\Theta^{\rho}}{d\tau} = 0.   (6)

The geodesic equations in (6) describe a reversible dynamics whose solution is the trajectory between an initial macrostate Θ^(initial) and a final macrostate Θ^(final). The trajectory can be equally well traversed in both directions. Given the Fisher-Rao information metric, I can apply standard methods of Riemannian differential geometry to study the information-geometric structure of the manifold M_S underlying the entropic dynamics. Connection coefficients Γ^ρ_{µν}, the Ricci tensor R_{µν}, the Riemann curvature tensor R_{µνρσ}, sectional curvatures K_{M_S}, the scalar curvature R_{M_S}, the Weyl anisotropy tensor W_{µνρσ}, Killing fields ξ^µ and Jacobi fields J^µ can be calculated in the usual way.

To characterize the chaotic behavior of complex entropic dynamical systems, I am mainly concerned with the signs of the scalar and sectional curvatures of M_S, the asymptotic behavior of the Jacobi fields J^µ on M_S, the existence of Killing vectors ξ^µ (or the existence of a non-vanishing Weyl anisotropy tensor; the anisotropy of the manifold underlying the system dynamics plays a significant role in the mechanism of instability), and the asymptotic behavior of the information-geometrodynamical entropy (IGE) S_{M_S} (see (9)). It is crucial to observe that true chaos is identified by the occurrence of two features: 1) strong dependence on initial conditions and exponential divergence of the Jacobi vector field intensity, i.e., stretching of dynamical trajectories; 2) compactness of the configuration space manifold, i.e., folding of dynamical trajectories. The negativity of the Ricci scalar R_{M_S},

R_{M_S} = R_{\mu\nu\rho\sigma}\,g^{\mu\rho}g^{\nu\sigma} = \sum_{\rho \neq \sigma} K_{M_S}(e_{\rho}, e_{\sigma}),   (7)

implies the existence of expanding directions in the configuration space manifold M_s.
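The geodesic equations and the instability they encode can be made concrete numerically. The sketch below (illustrative initial data, not values from the paper) integrates (6) for a single Gaussian macrostate with the Fisher-Rao metric ds² = (dµ² + 2dσ²)/σ², whose Christoffel symbols were worked out by hand; it checks that the squared speed is conserved along a geodesic and that two nearby geodesics separate rapidly, as expected on a negatively curved manifold:

```python
import math

def rhs(state):
    """Geodesic equations for one Gaussian macrostate with metric
    ds^2 = (dmu^2 + 2 dsigma^2) / sigma^2 (hand-computed Christoffel symbols)."""
    mu, s, v, w = state
    return (v, w, 2.0 * v * w / s, -v * v / (2.0 * s) + w * w / s)

def rk4(state, h, steps):
    """Classical fourth-order Runge-Kutta integration of the geodesic flow."""
    for _ in range(steps):
        k1 = rhs(state)
        k2 = rhs(tuple(x + 0.5 * h * k for x, k in zip(state, k1)))
        k3 = rhs(tuple(x + 0.5 * h * k for x, k in zip(state, k2)))
        k4 = rhs(tuple(x + h * k for x, k in zip(state, k3)))
        state = tuple(x + h / 6.0 * (a + 2 * b + 2 * c + d)
                      for x, a, b, c, d in zip(state, k1, k2, k3, k4))
    return state

def speed2(state):
    """Squared speed g_{uv} dTheta^u dTheta^v, conserved along geodesics."""
    _, s, v, w = state
    return (v * v + 2.0 * w * w) / (s * s)

def metric_sep(p, q):
    """Coordinate separation of two nearby points, measured with the metric at p."""
    return math.sqrt(((p[0] - q[0]) ** 2 + 2.0 * (p[1] - q[1]) ** 2) / p[1] ** 2)

a0 = (0.0, 1.0, 1.0, 0.3)          # (mu, sigma, dmu/dtau, dsigma/dtau)
b0 = (0.0, 1.0 + 1e-6, 1.0, 0.3)   # nearby initial macrostate
a1 = rk4(a0, 5e-3, 2000)
b1 = rk4(b0, 5e-3, 2000)
# speed2 is conserved along each geodesic, while metric_sep(a1, b1) grows by
# orders of magnitude relative to metric_sep(a0, b0): hyperbolicity at work.
```

This single-factor manifold is a rescaled hyperbolic half-plane, which is why the perturbed geodesic peels away exponentially rather than linearly.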
Indeed, since R_{M_S} is the sum of all sectional curvatures of planes spanned by pairs of orthonormal basis elements {e_ρ = ∂_{Θ^ρ}}, the negativity of the Ricci scalar is only a sufficient (not necessary) condition for local instability of the geodesic flow. For this reason, the negativity of the scalar provides a strong criterion of local instability. Scenarios may arise where negative sectional curvatures are present, but the positive ones prevail in the sum, so that the Ricci scalar is non-negative despite the instability of the flow in those directions. Consequently, the signs of K_{M_S} are of primary significance for the proper characterization of chaos.

A powerful mathematical tool to investigate the stability or instability of a geodesic flow is the Jacobi-Levi-Civita (JLC) equation for geodesic spread,

\frac{D^{2}J^{\mu}}{D\tau^{2}} + R^{\mu}_{\;\nu\rho\sigma}\,\frac{\partial\Theta^{\nu}}{\partial\tau}\,J^{\rho}\,\frac{\partial\Theta^{\sigma}}{\partial\tau} = 0.   (8)

The JLC equation covariantly describes how nearby geodesics locally scatter and relates the stability or instability of a geodesic flow to the curvature properties of the ambient manifold. Finally, the asymptotic regime of diffusive evolution describing the possible exponential increase of average volume elements on M_s provides another useful indicator of dynamical chaoticity. The exponential instability characteristic of chaos forces the system to rapidly explore large areas (volumes) of the statistical manifold. It is interesting to note that this asymptotic behavior also appears in the conventional description of quantum chaos, where the (von Neumann) entropy increases linearly at a rate determined by the Lyapunov exponents. The linear increase of entropy as a quantum chaos criterion was introduced by Zurek and Paz [10].
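Linear entropy growth at a rate fixed by the Lyapunov exponents is easy to reproduce in a minimal classical toy model (a sketch for illustration, not part of the paper): iterating the tent map, whose Kolmogorov-Sinai entropy is ln 2, on an ensemble of points initially confined to a single coarse-grained cell makes the Shannon entropy of the occupation probabilities grow roughly linearly before saturating:

```python
import math, random

def tent(x):
    """Tent map on [0, 1]; its Kolmogorov-Sinai entropy equals ln 2."""
    return 2.0 * x if x < 0.5 else 2.0 * (1.0 - x)

def coarse_entropy(points, bins=64):
    """Shannon entropy (in nats) of the coarse-grained occupation probabilities."""
    counts = [0] * bins
    for x in points:
        counts[min(int(x * bins), bins - 1)] += 1
    n = float(len(points))
    return -sum(c / n * math.log(c / n) for c in counts if c)

random.seed(0)
pts = [random.uniform(0.0, 1e-3) for _ in range(20000)]  # one fine-grained cell
entropies = []
for _ in range(12):
    entropies.append(coarse_entropy(pts))
    pts = [tent(x) for x in pts]
# entropies starts at 0, grows roughly linearly (rate ~ ln 2 per iteration) and
# then saturates near log(64) once the ensemble covers the whole interval.
```

The linear-growth window before saturation is the classical analogue of the regime in which the IGE of Eq. (9) grows linearly.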
In my information-geometric approach, a relevant quantity that is useful to study the degree of instability characterizing ED models is the information geometrodynamical entropy (IGE), defined as [7]

S_{M_s} \overset{\text{def}}{=} \lim_{\tau\to\infty} \log \widetilde{V}_{M_s}(\tau), \quad \text{with} \quad \widetilde{V}_{M_s}(\tau) = \frac{1}{\tau}\int_{0}^{\tau} d\tau' \int_{M_s} \sqrt{g}\; d^{2l}\Theta   (9)

and g = |det(g_{µν})|. The IGE is the asymptotic limit of the natural logarithm of the statistical weight defined on M_s, and it represents a measure of the temporal complexity of chaotic dynamical systems whose dynamics is underlain by a curved statistical manifold. In conventional approaches to chaos, the notion of entropy is introduced, in both classical and quantum physics, as the missing information about the system's fine-grained state [11]. For a classical system, suppose that the phase space is partitioned into very fine-grained cells of uniform volume ∆v, labelled by an index j. If one does not know which cell the system occupies, one assigns probabilities p_j to the various cells; equivalently, in the limit of infinitesimal cells, one can use a phase-space density ρ(X_j) = p_j/∆v. Then, in a classical chaotic evolution, the asymptotic expression of the information needed to characterize a particular coarse-grained trajectory out to time τ is given by the Shannon information entropy (measured in bits),

S^{\text{(chaotic)}}_{\text{classical}} = -\int dX\, \rho(X)\log\left(\rho(X)\Delta v\right) = -\sum_{j} p_{j}\log p_{j} \sim K\tau,   (10)

where ρ(X) is the phase-space density and p_j = ρ(X_j)∆v is the probability of the corresponding coarse-grained trajectory. S^(chaotic)_classical is the missing information about which fine-grained cell the system occupies. The quantity K represents the linear rate of information increase and is called the Kolmogorov-Sinai entropy (or metric entropy); K is the sum of the positive Lyapunov exponents, K = Σ_j λ_j, and quantifies the degree of classical chaos.

III. THE INFORMATION GEOMETRODYNAMICAL APPROACH TO CHAOS: APPLICATIONS
In this Section I present three applications of the IGAC. First, I study the chaotic behavior of an ED Gaussian model describing an arbitrary system of l degrees of freedom and show that the hyperbolicity of the non-maximally symmetric 2l-dimensional statistical manifold M_s underlying such an ED Gaussian model leads to linear information geometrodynamical entropy (IGE) growth and to exponential divergence of the Jacobi vector field intensity. I also present an information-geometric analogue of the Zurek-Paz quantum chaos criterion of linear entropy growth. This analogy is drawn by studying the information geometrodynamics of an ensemble of random-frequency macroscopic inverted harmonic oscillators. Finally, I apply the IGAC to study the entropic dynamics on curved statistical manifolds induced by classical probability distributions of common use in the study of regular and chaotic quantum energy level statistics. In doing so, I suggest an information-geometric characterization of regular and chaotic quantum energy level statistics.

As I said in the Introduction, I have omitted technical details that will appear elsewhere. However, my previous works (especially [7]) may be very useful references to clarify the following applications.

A. Chaotic behavior of an entropic dynamical Gaussian model
As a first example, I apply my IGAC to study the dynamics of a system with l degrees of freedom, each one described by two pieces of relevant information, its mean expected value and its variance (Gaussian statistical macrostates). The line element ds² = g_{µν}(Θ)dΘ^µ dΘ^ν on M_s is defined by

ds^{2} = \sum_{k=1}^{l}\left(\frac{1}{\sigma_{k}^{2}}\,d\mu_{k}^{2} + \frac{2}{\sigma_{k}^{2}}\,d\sigma_{k}^{2}\right), \quad \text{with } \mu, \nu = 1,..., 2l.   (11)

This leads to an ED model on a non-maximally symmetric 2l-dimensional statistical manifold M_s. It is shown that M_s possesses a constant negative Ricci scalar curvature proportional to the number of degrees of freedom of the system, R_{M_s} = −l. It is also shown that the system explores statistical volume elements on M_s at an exponential rate. The information geometrodynamical entropy S_{M_s} increases linearly in time (the statistical evolution parameter) and is, moreover, proportional to the number of degrees of freedom of the system, S_{M_s}(τ) ~ lλτ as τ → ∞. The parameter λ characterizes the family of probability distributions on M_s. The asymptotic linear information-geometrodynamical entropy growth may be considered the information-geometric analogue of the von Neumann entropy growth introduced by Zurek and Paz, a quantum feature of chaos. The geodesics on M_s are hyperbolic trajectories. Using the Jacobi-Levi-Civita (JLC) equation for geodesic spread, I show that the Jacobi vector field intensity J_{M_s} diverges exponentially and is proportional to the number of degrees of freedom of the system, J_{M_s}(τ) ~ l exp(λτ) as τ → ∞. The exponential divergence of the Jacobi vector field intensity J_{M_s} is a classical feature of chaos. Therefore, I conclude that

R_{M_s} = -l, \quad J_{M_s}(\tau) \underset{\tau\to\infty}{\sim} l\,e^{\lambda\tau}, \quad S_{M_s}(\tau) \underset{\tau\to\infty}{\sim} l\lambda\tau.   (12)

Thus, R_{M_s}, S_{M_s} and J_{M_s} behave as proper indicators of chaoticity and are proportional to the number of Gaussian-distributed microstates of the system.
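The constant negative curvature behind R_{M_s} = −l can be cross-checked factor by factor (a numerical sketch, not taken from the paper): each two-dimensional Gaussian block of the metric (11) is an orthogonal metric whose Gaussian curvature, evaluated below with the Liouville formula and finite differences, is K = −1/2 everywhere; each factor therefore contributes 2K = −1 to the scalar curvature, and the l-fold product has R_{M_s} = −l:

```python
import math

# One 2D factor of (11) is the orthogonal metric ds^2 = E dmu^2 + G dsigma^2:
E = lambda mu, s: 1.0 / s ** 2   # g_{mu mu}
G = lambda mu, s: 2.0 / s ** 2   # g_{sigma sigma}

def gauss_curvature(mu, s, h=1e-4):
    """Gaussian curvature of an orthogonal 2D metric via the Liouville formula,
    K = -(1 / (2 sqrt(EG))) [ d_sigma(E_sigma / sqrt(EG)) + d_mu(G_mu / sqrt(EG)) ],
    evaluated with central finite differences of step h."""
    sq = lambda m, t: math.sqrt(E(m, t) * G(m, t))
    f1 = lambda m, t: (E(m, t + h) - E(m, t - h)) / (2.0 * h) / sq(m, t)
    f2 = lambda m, t: (G(m + h, t) - G(m - h, t)) / (2.0 * h) / sq(m, t)
    bracket = ((f1(mu, s + h) - f1(mu, s - h)) / (2.0 * h)
               + (f2(mu + h, s) - f2(mu - h, s)) / (2.0 * h))
    return -bracket / (2.0 * sq(mu, s))

K = gauss_curvature(0.0, 1.0)
# K = -1/2 at every point, so each factor has scalar curvature 2K = -1 and the
# l-fold product manifold has R = -l, matching Eq. (12).
```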
This proportionality, even though proven in a very special case, leads to the conclusion that there may be a substantial link among these information-geometric indicators of chaoticity.

B. Ensemble of random-frequency macroscopic inverted harmonic oscillators
In my second example, I employ ED and "Newtonian Entropic Dynamics" (NED) [9]. NED explores the possibility of using well-established principles of inference to derive Newtonian dynamics from relevant prior information codified into an appropriate statistical manifold. The basic assumption is that there is an irreducible uncertainty in the location of particles, so that the state of a particle is defined by a probability distribution. The corresponding configuration space is a statistical manifold M_s whose geometry is defined by the Fisher-Rao information metric. The trajectory follows from a principle of inference, the method of Maximum Entropy. There is no need for additional "physical" postulates such as an action principle or an equation of motion, nor for the concepts of mass, momentum and phase space, not even for the notion of time. The resulting "entropic" dynamics reproduces Newton's mechanics for any number of particles interacting among themselves and with external fields. Both the mass of the particles and their interactions are explained as a consequence of the underlying statistical manifold.

In my specific application, I consider a manifold with a line element ds² = g_{µν}(Θ)dΘ^µ dΘ^ν (with µ, ν = 1,..., l) given by

ds^{2} = \left[1 - \Phi(\Theta)\right]\delta_{\mu\nu}\,d\Theta^{\mu}d\Theta^{\nu}, \quad \Phi(\Theta) = \sum_{k=1}^{l} u_{k}(\theta^{k}),   (13)

where

u_{k}(\theta^{k}) = -\tfrac{1}{2}\,\omega_{k}^{2}\,(\theta^{k})^{2}, \quad \theta^{k} = \theta^{k}(s).   (14)

The geodesic equations for the macrovariables θ^k(s) are strongly nonlinear and their integration is not trivial. However, upon a suitable change of the affine parameter s used in the geodesic equations, I may simplify the differential equations for the macroscopic variables parametrizing points on the manifold M_s with metric tensor g_{µν}. Recalling that the notion of chaos is observer-dependent, and upon changing the affine parameter from s to τ in such a way that ds = 2(1 − Φ)dτ, I obtain new geodesic equations describing a set of macroscopic inverted harmonic oscillators (IHOs).
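The defining instability of a single inverted harmonic oscillator is easy to exhibit numerically. In the sketch below (ω and the initial data are illustrative choices, not values from the paper), two trajectories of θ'' = ω²θ launched a distance d₀ apart separate exactly as d₀ cosh(ωτ), i.e. exponentially at the rate ω:

```python
import math

def iho_theta(theta0, v0, omega, tau, n=4000):
    """RK4 integration of the inverted harmonic oscillator theta'' = omega^2 theta."""
    h = tau / n
    th, v = theta0, v0
    for _ in range(n):
        k1 = (v, omega ** 2 * th)
        k2 = (v + 0.5 * h * k1[1], omega ** 2 * (th + 0.5 * h * k1[0]))
        k3 = (v + 0.5 * h * k2[1], omega ** 2 * (th + 0.5 * h * k2[0]))
        k4 = (v + h * k3[1], omega ** 2 * (th + h * k3[0]))
        th += h / 6.0 * (k1[0] + 2.0 * k2[0] + 2.0 * k3[0] + k4[0])
        v += h / 6.0 * (k1[1] + 2.0 * k2[1] + 2.0 * k3[1] + k4[1])
    return th

omega, tau, d0 = 1.5, 6.0, 1e-8
sep = abs(iho_theta(1.0 + d0, 0.0, omega, tau) - iho_theta(1.0, 0.0, omega, tau))
rate = math.log(sep / d0) / tau
# Because the system is linear, sep = d0 * cosh(omega * tau) exactly, so the
# estimated divergence rate approaches omega for large tau.
```

Each ω_k of the ensemble contributes such a stretching rate, which is the mechanism behind the Λ = Σ_i ω_i growth law in (15) below.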
In order to ensure the compactification of the parameter space of the system (and therefore of M_s itself), I choose a Gaussian-distributed frequency spectrum for the IHOs. With this choice of frequency spectrum, the folding mechanism required for true chaos is restored in a statistical sense (averaging over ω and τ). Upon integrating these differential equations, I obtain the expression for the asymptotic behavior of the IGE S_{M_s}, namely

S_{M_s}(\tau) \underset{\tau\to\infty}{\sim} \Lambda\tau, \quad \Lambda = \sum_{i=1}^{l}\omega_{i}.   (15)

This result may be considered the information-geometric analogue of the Zurek-Paz model used to investigate the implications of decoherence for quantum chaos. In their work, Zurek and Paz considered a chaotic system, a single unstable harmonic oscillator characterized by a potential V(x) = −Ω²x²/2 (Ω is the Lyapunov exponent), coupled to an external environment. In the reversible classical limit [12], the von Neumann entropy of such a system increases linearly at a rate determined by the Lyapunov exponent,

S^{\text{(chaotic)}}_{\text{quantum}}(\tau) \underset{\tau\to\infty}{\sim} \Omega\tau,   (16)

with Ω playing the role of the Lyapunov exponent.

C. Information geometrodynamics of regular and chaotic quantum spin chains
In my final example, I use my IGAC to study the entropic dynamics on curved statistical manifolds induced by classical probability distributions of common use in the study of regular and chaotic quantum energy level statistics. Recall that the theory of quantum chaos (the quantum mechanics of systems whose classical dynamics are chaotic) is not primarily related to few-body physics. Indeed, in real physical systems such as many-electron atoms and heavy nuclei, the origin of complex behavior is the very strong interaction among many particles. To deal with such systems, a famous statistical approach has been developed based upon Random Matrix Theory (RMT). The main idea of this approach is to neglect the detailed description of the motion and to treat these systems statistically, bearing in mind that the interaction among particles is so complex and strong that generic properties are expected to emerge. Once again, this is exactly the philosophy underlying the ED approach to complex dynamics. It is known [13] that integrable and chaotic quantum antiferromagnetic Ising chains are characterized by asymptotic logarithmic and linear growths of their operator space entanglement entropies, respectively. In this last example, I consider the information geometrodynamics of a Poisson distribution coupled to an Exponential bath (spin chain in a transverse magnetic field, the regular case) and that of a Wigner-Dyson distribution coupled to a Gaussian bath (spin chain in a tilted magnetic field, the chaotic case). Remarkably, I show that in the former case the IGE exhibits asymptotic logarithmic growth, while in the latter case it exhibits asymptotic linear growth.
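The qualitative difference between logarithmic and linear IGE growth can be illustrated with a toy version of definition (9) in which the statistical volume growth law is simply prescribed by hand (an assumption for illustration only, not the actual manifolds of this Section): power-law volume growth yields S ~ log τ, while exponential volume growth yields S ~ λτ:

```python
import math

def ige(volume, tau, n=20000):
    """Toy IGE: S(tau) = log( (1/tau) * Int_0^tau volume(t) dt ), trapezoid rule."""
    h = tau / n
    integral = 0.5 * h * (volume(0.0) + volume(tau))
    integral += h * sum(volume(i * h) for i in range(1, n))
    return math.log(integral / tau)

lam = 0.8
regular = lambda t: 1.0 + t             # power-law volume growth (regular case)
chaotic = lambda t: math.exp(lam * t)   # exponential volume growth (chaotic case)

s_reg_50, s_reg_200 = ige(regular, 50.0), ige(regular, 200.0)
s_cha_50, s_cha_200 = ige(chaotic, 50.0), ige(chaotic, 200.0)
# s_reg grows like log(tau) (about 1.4 nats from tau = 50 to tau = 200), while
# s_cha grows like lam * tau (about 119 units over the same range).
```

The contrast mirrors Eqs. (18) and (20) below: sub-linear (logarithmic) IGE growth for the regular case versus linear IGE growth for the chaotic case.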
In the regular case, the line element ds² = ds_A² + ds_B² on the statistical manifold M_s is given by

ds^{2} = \frac{1}{\mu_{A}}\,d\mu_{A}^{2} + \frac{1}{\mu_{B}^{2}}\,d\mu_{B}^{2},   (17)

where the macrovariable µ_A is the average spacing of the energy levels and µ_B is the average intensity of the magnetic energy arising from the interaction of the transverse magnetic field with the magnetic moments of the spin particles. In this case, I show that the asymptotic behavior of S^(integrable)_{M_s} is sub-linear in τ (logarithmic IGE growth),

S^{\text{(integrable)}}_{M_s}(\tau) \underset{\tau\to\infty}{\sim} \log\tau.   (18)

Finally, in the chaotic case, the line element ds² = ds_A² + ds_B² on the statistical manifold M_s is given by

ds^{2} = \frac{4}{\mu_{A}^{\prime\,2}}\,d\mu_{A}^{\prime\,2} + \frac{1}{\sigma_{B}^{\prime\,2}}\,d\mu_{B}^{\prime\,2} + \frac{2}{\sigma_{B}^{\prime\,2}}\,d\sigma_{B}^{\prime\,2},   (19)

where the (non-vanishing) macrovariable µ'_A is the average spacing of the energy levels, while µ'_B and σ'_B are the average intensity and the variance, respectively, of the magnetic energy arising from the interaction of the tilted magnetic field with the magnetic moments of the spin particles. In this case, I show that the asymptotic behavior of S^(chaotic)_{M_s} is linear in τ (linear IGE growth),

S^{\text{(chaotic)}}_{M_s}(\tau) \underset{\tau\to\infty}{\sim} \tau.   (20)

The equations for S^(integrable)_{M_s} and S^(chaotic)_{M_s} are the information-geometric analogues of the entanglement entropies defined in standard quantum information theory in the regular and chaotic cases, respectively. In addition, I emphasize that the statistical volume element Ṽ_{M_s}(τ) (see (9)) may play a role similar to that of the computational complexity in conventional quantum information theory. These results warrant deeper analysis to be fully understood.

IV. CONCLUSION
In this paper I proposed a theoretical information-geometric framework suitable to characterize the chaotic dynamical behavior of arbitrary complex systems on curved statistical manifolds. Specifically, an information-geometric analogue of the Zurek-Paz quantum chaos criterion of linear entropy growth and an information-geometric characterization of regular and chaotic quantum energy level statistics were presented. I hope that my work convincingly shows that this information-geometric approach may be useful in providing a unifying criterion of chaos of both classical and quantum varieties, thus deserving further research and development.

The description of classical chaotic systems of arbitrary interacting degrees of freedom, deviations from Gaussianity, and chaoticity arising from fluctuations of positively curved statistical manifolds are currently under investigation. I am also investigating the possibility of extending the IGAC to quantum Hilbert spaces constructed from classical curved statistical manifolds, and I am considering information-geometric macroscopic versions of the Henon-Heiles and Fermi-Pasta-Ulam β-models to study chaotic geodesic flows on statistical manifolds.

Acknowledgments
I am grateful to Sean Alan Ali, Ariel Caticha and Adom Giffin for very useful comments, discussions and for their previous collaborations. I extend thanks to Cedric Beny, Michael Frey and Jeroen Wouters for their interest and/or useful comments on my research during the NIC@QS07 in Erice, Ettore Majorana Centre.

[1] M. Gell-Mann, "What is Complexity", Complexity, vol. 1, no. 1 (1995).
[2] A. Caticha, "Entropic Dynamics", in Bayesian Inference and Maximum Entropy Methods in Science and Engineering, ed. by R. L. Fry, AIP Conf. Proc., 302 (2002).
[3] A. Caticha and R. Preuss, "Maximum entropy and Bayesian data analysis: Entropic prior distributions", Phys. Rev. E70, 046127 (2004).
[4] S. Amari and H. Nagaoka, Methods of Information Geometry, American Mathematical Society, Oxford University Press, 2000.
[5] A. Caticha, "Insufficient Reason and Entropy in Quantum Theory", Found. Phys., 227 (2000).
[6] C. Cafaro, S. A. Ali and A. Giffin, "An Application of Reversible Entropic Dynamics on Curved Statistical Manifolds", in Bayesian Inference and Maximum Entropy Methods in Science and Engineering, ed. by Ali Mohammad-Djafari, AIP Conf. Proc., 243-251 (2006).
[7] C. Cafaro and S. A. Ali, "Jacobi Fields on Statistical Manifolds of Negative Curvature", Physica D234, 70-80 (2007).
[8] C. Cafaro, "Information Geometry and Chaos on Negatively Curved Statistical Manifolds", in Bayesian Inference and Maximum Entropy Methods in Science and Engineering, ed. by K. Knuth et al., AIP Conf. Proc., 175 (2007).
[9] A. Caticha and C. Cafaro, "From Information Geometry to Newtonian Dynamics", in Bayesian Inference and Maximum Entropy Methods in Science and Engineering, ed. by K. Knuth et al., AIP Conf. Proc., 165 (2007).
[10] W. H. Zurek and J. P. Paz, "Decoherence, Chaos, and the Second Law", Phys. Rev. Lett., 2508 (1994); "Quantum Chaos: a decoherent definition", Physica D83, 300 (1995).
[11] C. M. Caves and R. Schack, "Unpredictability, Information, and Chaos", Complexity, 46-57 (1997); A. J. Scott, T. A. Brun, C. M. Caves, and R. Schack, "Hypersensitivity and chaos signatures in the quantum baker's map", J. Phys. A39, 13405 (2006).
[12] W. H. Zurek, "Preferred States, Predictability, Classicality and Environment-Induced Decoherence", Prog. Theor. Phys., 281 (1993).
[13] T. Prosen and M. Znidaric, "Is the efficiency of classical simulations of quantum dynamics related to integrability?", Phys. Rev. E75, 015202 (2007); T. Prosen and I. Pizorn, "Operator space entanglement entropy in transverse Ising chain", Phys. Rev.