Can chaotic quantum energy levels statistics be characterized using information geometry and inference methods?
C. Cafaro and S. A. Ali
Department of Physics, State University of New York at Albany-SUNY, 1400 Washington Avenue, Albany, NY 12222, USA
In this paper, we review our novel information geometrodynamical approach to chaos (IGAC) on curved statistical manifolds and we emphasize the usefulness of our information-geometrodynamical entropy (IGE) as an indicator of chaoticity in a simple application. Furthermore, knowing that integrable and chaotic quantum antiferromagnetic Ising chains are characterized by asymptotic logarithmic and linear growths of their operator space entanglement entropies, respectively, we apply our IGAC to present an alternative characterization of such systems. Remarkably, we show that in the former case the IGE exhibits asymptotic logarithmic growth while in the latter case the IGE exhibits asymptotic linear growth.
At this stage of its development, IGAC remains an ambitious unifying information-geometric theoretical construct for the study of chaotic dynamics with several unsolved problems. However, based on our recent findings, we believe it could provide an interesting, innovative and potentially powerful way to study and understand the very important and challenging problems of classical and quantum chaos.
PACS numbers: 02.50.Tt, 02.50.Cw, 02.40.-k, 05.45.-a, 05.45.Mt, 03.65.Ta
Keywords: Inductive inference, information geometry, statistical manifolds, entropy, chaos and entanglement.
I. INTRODUCTION
In classical and quantum dynamics there is no unified characterization of chaos. In the Riemannian [1] and Finslerian [2] (a Finsler metric is obtained from a Riemannian metric by relaxing the requirement that the metric be quadratic on each tangent space) geometrodynamical approach to chaos in classical Hamiltonian systems, an active field of research concerns the possibility of finding a rigorous relation among the sectional curvature, the Lyapunov exponents, and the Kolmogorov-Sinai dynamical entropy (i.e. the sum of positive Lyapunov exponents) [3]. The largest Lyapunov exponent characterizes the degree of chaoticity of a dynamical system and, if positive, it measures the mean instability rate of nearby trajectories averaged along a sufficiently long reference trajectory. Moreover, it is known that classical chaotic systems are distinguished by their exponential sensitivity to initial conditions and that the absence of this property in quantum systems has led to a number of different criteria being proposed for quantum chaos. Exponential decay of fidelity, hypersensitivity to perturbation and the Zurek-Paz quantum chaos criterion of linear von Neumann entropy growth [4] are some examples [5]. These criteria accurately predict chaos in the classical limit, but it is not clear that they behave the same far from the classical realm.
The present work makes use of the so-called Entropic Dynamics (ED) [6]. ED is a theoretical framework that arises from the combination of inductive inference (Maximum relative Entropy methods [7]) and Information Geometry (IG, Riemannian geometry applied to probability theory) [8]. As such, ED is constructed on statistical manifolds. It is developed to investigate the possibility that laws of physics - either classical or quantum - might reflect laws of inference rather than laws of nature.
This article is a follow-up of a series of the authors' works [9, 10, 11, 12, 13, 14, 15, 16].
Especially the work presented in [15] will be discussed in more detail. The ED theoretical framework is used to explore the possibility of constructing a unified characterization of classical and quantum chaos. The general formalism of the IGAC is presented by investigating a system with 3l degrees of freedom (microstates), each one described by two pieces of relevant information, its mean expected value and its variance (Gaussian statistical macrostates). This leads us to consider an ED model on a non-maximally symmetric 6l-dimensional statistical manifold M_s. It is shown that M_s possesses a constant negative Ricci curvature that is proportional to the number of degrees of freedom of the system, R_{M_s} = -3l. It is shown that the system explores statistical volume elements on M_s at an exponential rate. We define an information geometrodynamical entropy (IGE) S_{M_s} of the system and we show it increases linearly in time (statistical evolution parameter) and is moreover proportional to the number of degrees of freedom of the system. The geodesics on M_s are hyperbolic trajectories. Using the Jacobi-Levi-Civita (JLC) equation for geodesic spread, it is shown that the Jacobi vector field intensity J_{M_s} diverges exponentially and is proportional to the number of degrees of freedom of the system. Thus, R_{M_s}, S_{M_s} and J_{M_s} are proportional to the number of Gaussian-distributed microstates of the system. This proportionality leads us to conclude that there is a substantial link among these information-geometric indicators of chaoticity. We emphasize that our IGE provides an information-geometric analog of the Zurek-Paz quantum chaos criterion [10].
As a physical application of our general theoretical scheme, which we have called the Information Geometrodynamical Approach to Chaos (IGAC), we provide an information-geometric analogue of quantum energy level statistics for integrable and chaotic quantum spin chains. It is known [17] that in the integrable case, the antiferromagnetic Ising chain is immersed in a transverse homogeneous magnetic field and the level spacing distribution of its spectrum is the Poisson distribution. Instead, in the chaotic case, the antiferromagnetic Ising chain is immersed in a tilted homogeneous magnetic field and the level spacing distribution of its Hamiltonian spectrum is the Wigner-Dyson distribution. The antiferromagnetic Ising spin chain in an external magnetic field is one example of an order-to-chaos transition in a quantum many-body context, and it is used here as a demonstrative example of the conjectured connection between Wigner-Dyson (Poisson) statistics and nonintegrability (integrability) in quantum mechanics. Moreover, it is known that integrable and chaotic quantum antiferromagnetic Ising chains are characterized by asymptotic logarithmic and linear growths of their operator space entanglement entropies [17], respectively. Following the results provided by Prosen, we study the information-geometrodynamics of a Poisson distribution coupled to an Exponential bath (regular case) and that of a Wigner-Dyson distribution coupled to a Gaussian bath (chaotic case). Remarkably, we show that in the former case the IGE exhibits asymptotic logarithmic growth while in the latter case it exhibits asymptotic linear growth.
The layout of this paper is as follows. In Section II, the general formalism of the IGAC is applied to study a simple example, an ED Gaussian statistical model. In Section III, the main indicators of chaoticity within our novel theoretical construct are introduced by studying the ED Gaussian model.
In Section IV, special focus is devoted to the role of the IGE as an indicator of temporal complexity on curved statistical manifolds. In Section V, after presenting the basics of the IG of Poisson and Wigner-Dyson distributions, we briefly review the conventional approach suitable to study the energy level statistics of integrable and chaotic antiferromagnetic Ising chains immersed in external magnetic fields. In Section VI, we present our IGAC-based novel characterization of the quantum energy level statistics of such Ising chains. Finally, in Section VII, we present our final remarks.

II. THEORETICAL STRUCTURE OF THE IGAC: A SIMPLE EXAMPLE
The IGAC arises as a theoretical framework to study chaos in informational geodesic flows describing physical, biological or chemical systems. A geodesic on a curved statistical manifold represents the maximum probability path a complex dynamical system explores in its evolution between the initial and the final macrostates. Each point of the geodesic is parametrized by the macroscopic dynamical variables defining the macrostate of the system. Furthermore, each macrostate is in a one-to-one relation with the probability distribution representing the maximally probable description of the system being considered. The set of macrostates forms the parameter space while the set of probability distributions forms the statistical manifold. The parameter space is homeomorphic to the statistical manifold. IGAC is the information-geometric analogue of conventional geometrodynamical approaches [1, 2] in which the classical configuration space Γ_E is replaced by a statistical manifold M_S, with the additional possibility of considering chaotic dynamics arising from non conformally flat metrics (the Jacobi metric, by contrast, is always conformally flat). It is an information-geometric extension of the Jacobi geometrodynamics (the geometrization of a Hamiltonian system by transforming it to a geodesic flow [18]). The reformulation of dynamics in terms of a geodesic problem allows the application of a wide range of well-known geometrical techniques in the investigation of the solution space and properties of the equation of motion. The power of the Jacobi reformulation is that all of the dynamical information is collected into a single geometric object in which all the available manifest symmetries are retained - the manifold on which the geodesic flow is induced.
Using information-geometric methods, we have investigated in some detail the still open problem of finding a unifying description of classical and quantum chaos [10].
One of our goals in this paper is to take an additional step forward in that research direction.

A. The ED Gaussian Model
Maximum relative Entropy (ME) methods are used to construct an ED model that follows from an assumption about what information is relevant to predict the evolution of the system. Given a known initial macrostate (probability distribution) and assuming that the system evolves to a final known macrostate, the possible trajectories of the system are examined. A notion of distance between two probability distributions is provided by IG. As shown in [19, 20], this distance is quantified by the Fisher-Rao information metric tensor.
In the following example, we consider an ED model whose microstates span a 3l-dimensional space labelled by the variables {X} = {x^(1), x^(2),..., x^(l)} with x^(α) ≡ (x_1^(α), x_2^(α), x_3^(α)), α = 1,..., l and x_a^(α) ∈ R with a = 1, 2, 3. We assume the only testable information pertaining to the 3l degrees of freedom {x_a^(α)} consists of the expectation values ⟨x_a^(α)⟩ and the variances Δx_a^(α) ≡ [⟨(x_a^(α) − ⟨x_a^(α)⟩)²⟩]^(1/2). The set of these expectation values defines the 6l-dimensional space of macrostates of the system. A measure of distinguishability among the macrostates of the ED model is obtained by assigning a probability distribution P(X|Θ) to each macrostate Θ, where {Θ} = {^(1)θ_a^(α), ^(2)θ_a^(α)} with α = 1, 2,..., l and a = 1, 2, 3. The process of assigning a probability distribution to each state endows M_S with a metric structure. Specifically, the Fisher-Rao information metric is a measure of distinguishability among macrostates. It assigns an IG to the space of states. Each macrostate may be viewed as a point of a 6l-dimensional statistical manifold with coordinates given by the numerical values of the expectations ⟨x_a^(α)⟩ = ^(1)θ_a^(α) and Δx_a^(α) ≡ [⟨(x_a^(α) − ⟨x_a^(α)⟩)²⟩]^(1/2) = ^(2)θ_a^(α).
The available information can be written in the form of the following 6l information constraint equations,

⟨x_a^(α)⟩ = ∫_{−∞}^{+∞} dx_a^(α) x_a^(α) P_a^(α)(x_a^(α) | ^(1)θ_a^(α), ^(2)θ_a^(α)),

Δx_a^(α) = [ ∫_{−∞}^{+∞} dx_a^(α) (x_a^(α) − ⟨x_a^(α)⟩)² P_a^(α)(x_a^(α) | ^(1)θ_a^(α), ^(2)θ_a^(α)) ]^(1/2). (1)

The probability distributions P_a^(α) in (1) are constrained by the conditions of normalization,

∫_{−∞}^{+∞} dx_a^(α) P_a^(α)(x_a^(α) | ^(1)θ_a^(α), ^(2)θ_a^(α)) = 1. (2)

Information theory identifies the Gaussian distribution as the maximum entropy distribution if only the expectation value and the variance are known [21]. ME methods allow us to associate a probability distribution P(X|Θ) to each point in the space of states Θ [7]. The distribution that best reflects the information contained in the prior distribution m(X) updated by the information (⟨x_a^(α)⟩, Δx_a^(α)) is obtained by maximizing the relative entropy

S(Θ) = − ∫ d^{3l}X P(X|Θ) log [P(X|Θ)/m(X)]. (3)

As a working hypothesis, the prior m(X) is set to be uniform since we assume the lack of prior available information about the system (postulate of equal a priori probabilities).
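The maximum-entropy status of the Gaussian invoked above can be checked numerically. The following sketch (ours, not part of the original formalism; the integration grid and the comparison densities are arbitrary choices) compares the differential entropy of a Gaussian with that of a Laplace and a uniform density of equal mean and variance:

```python
import numpy as np

def differential_entropy(p, x):
    """Trapezoidal estimate of -∫ p ln p dx on the grid x."""
    f = np.where(p > 0, -p * np.log(p), 0.0)
    return np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x))

x = np.linspace(-30.0, 30.0, 400001)

# Three zero-mean, unit-variance probability densities.
p_gauss = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
p_laplace = np.exp(-np.sqrt(2) * np.abs(x)) / np.sqrt(2)       # Laplace scale b = 1/sqrt(2)
p_uniform = np.where(np.abs(x) <= np.sqrt(3), 1 / (2 * np.sqrt(3)), 0.0)

h_gauss = differential_entropy(p_gauss, x)
h_laplace = differential_entropy(p_laplace, x)
h_uniform = differential_entropy(p_uniform, x)

# The Gaussian attains the largest entropy for fixed mean and variance,
# and matches the closed form (1/2) ln(2*pi*e*sigma^2).
print(h_gauss, h_laplace, h_uniform)
assert h_gauss > h_laplace > h_uniform
assert abs(h_gauss - 0.5 * np.log(2 * np.pi * np.e)) < 1e-6
```

The same ordering holds for any other density with the same first two moments, which is the content of the maximum-entropy property used in deriving (4) below from (3).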
Upon maximizing (3), given the constraints (1) and (2), we obtain

P(X|Θ) = Π_{α=1}^{l} Π_{a=1}^{3} P_a^(α)(x_a^(α) | μ_a^(α), σ_a^(α)), (4)

where

P_a^(α)(x_a^(α) | μ_a^(α), σ_a^(α)) = (2π[σ_a^(α)]²)^(−1/2) exp[ −(x_a^(α) − μ_a^(α))² / (2[σ_a^(α)]²) ] (5)

and, in the standard notation for Gaussians, ^(1)θ_a^(α) := ⟨x_a^(α)⟩ ≡ μ_a^(α) and ^(2)θ_a^(α) := Δx_a^(α) ≡ σ_a^(α). The probability distribution (4) encodes the available information concerning the system. Note that we assumed uncoupled constraints among the microvariables x_a^(α). In other words, we assumed that information about correlations between the microvariables need not be tracked. This assumption leads to the simplified product rule (4). However, coupled constraints would lead to a generalized product rule in (4) and to an information metric tensor with non-trivial off-diagonal elements (covariance terms). For instance, the total probability distribution P(x, y | μ_x, σ_x, μ_y, σ_y) of two dependent Gaussian-distributed microvariables x and y reads

P(x, y | μ_x, σ_x, μ_y, σ_y) = 1/(2πσ_xσ_y√(1 − r²)) × exp{ −1/(2(1 − r²)) [ (x − μ_x)²/σ_x² − 2r(x − μ_x)(y − μ_y)/(σ_xσ_y) + (y − μ_y)²/σ_y² ] }, (6)

where r ∈ (−1, +1) is the correlation coefficient given by

r = ⟨(x − ⟨x⟩)(y − ⟨y⟩)⟩ / [⟨(x − ⟨x⟩)²⟩^(1/2) ⟨(y − ⟨y⟩)²⟩^(1/2)] = (⟨xy⟩ − ⟨x⟩⟨y⟩)/(σ_xσ_y). (7)

The information metric tensor induced by (6), in the coordinates (μ_x, μ_y, σ_x, σ_y) with i, j = 1, 2, 3, 4, is [16]

g_ij = 1/(1 − r²) ×
[ 1/σ_x²          −r/(σ_xσ_y)     0                 0
  −r/(σ_xσ_y)     1/σ_y²          0                 0
  0               0               (2 − r²)/σ_x²     −r²/(σ_xσ_y)
  0               0               −r²/(σ_xσ_y)      (2 − r²)/σ_y² ]. (8)

The Ricci curvature scalar associated with the manifold characterized by (8),

R = g^{ij} R_{ij} = R(r), (9)

is a negative function of the correlation coefficient r alone. It is clear that in the limit r → 0, the off-diagonal elements of g_ij vanish and the scalar R reduces to the result obtained in [10], namely R = −2 < 0. We could have in principle considered a correlated Gaussian process characterized by two correlated Gaussian microvariables x and y. Such a process would lead to an ED on a five-dimensional statistical manifold whose elements are probability distributions of the form P(x, y | μ_x, σ_x, μ_y, σ_y, σ_xy), coordinatized by the expectation values μ_x and μ_y as well as the square-roots of the three independent elements of the symmetric covariance matrix, namely σ_x, σ_y and σ_xy. However, in view of the computational difficulty in obtaining analytical expressions for the elements of a 5 × 5 information metric tensor with σ_xy playing the role of a macro-dynamical variable, we have chosen to consider the ED on a four-dimensional statistical manifold whose elements are given in (6). Correlation terms may be fictitious. They may arise, for instance, from coordinate transformations. On the other hand, correlations may arise from external fields in which the system is immersed. In such situations, correlations among the x_a^(α) effectively describe the interaction between the microvariables and the external fields. Such generalizations would require a more delicate analysis.
We cannot determine the evolution of the microstates of the system since the available information is insufficient. Not only is the available information insufficient, but we also do not know the equation of motion. In fact, there is no standard "equation of motion". Instead we can ask: how close are the two total distributions with parameters (μ_a^(α), σ_a^(α)) and (μ_a^(α) + dμ_a^(α), σ_a^(α) + dσ_a^(α))? Once the states of the system have been defined, the next step concerns the problem of quantifying the notion of change from the macrostate Θ to the macrostate Θ + dΘ. A convenient measure of change is distance.
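That the distance between neighboring distributions is quantified by the Fisher-Rao metric can be illustrated numerically for a single Gaussian macrostate: twice the Kullback-Leibler divergence between two nearby Gaussians reproduces, to leading order, the line element (13). A sketch (the test point and displacement are our own arbitrary choices; the closed-form Gaussian KL divergence is standard):

```python
import numpy as np

def kl_gauss(mu1, s1, mu2, s2):
    """Closed-form KL divergence between univariate Gaussians N(mu1, s1^2) and N(mu2, s2^2)."""
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

mu, sigma = 1.0, 2.0            # hypothetical macrostate
dmu, dsigma = 1e-4, 1e-4        # small displacement of the macrostate

# Fisher-Rao line element of (13): ds^2 = (dmu^2 + 2 dsigma^2) / sigma^2
ds2_fisher = (dmu**2 + 2 * dsigma**2) / sigma**2

# Twice the KL divergence between the two nearby macrostates.
ds2_kl = 2 * kl_gauss(mu, sigma, mu + dmu, sigma + dsigma)

print(ds2_fisher, ds2_kl)
assert abs(ds2_kl / ds2_fisher - 1) < 1e-3
```

The agreement degrades only at third order in the displacement, which is the standard statement that the Fisher-Rao metric is the quadratic (leading) term of the KL divergence between nearby distributions.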
The measure we seek is given by the dimensionless distance ds between P(X|Θ) and P(X|Θ + dΘ),

ds² = g_{μν} dΘ^μ dΘ^ν with μ, ν = 1, 2,..., 6l, (10)

where

g_{μν} = ∫ dX P(X|Θ) [∂ log P(X|Θ)/∂Θ^μ] [∂ log P(X|Θ)/∂Θ^ν] (11)

is the Fisher-Rao information metric. Substituting (4) into (11), the metric g_{μν} on M_s becomes a 6l × 6l matrix M made up of 3l blocks M_{2×2} of dimension 2 × 2,

M_{2×2} = [ 1/(σ_a^(α))²   0
            0              2/(σ_a^(α))² ], (12)

with α = 1, 2,..., l and a = 1, 2, 3. From (11), the "length" element (10) reads

ds² = Σ_{α=1}^{l} Σ_{a=1}^{3} [ (1/(σ_a^(α))²) dμ_a^(α)² + (2/(σ_a^(α))²) dσ_a^(α)² ]. (13)

We bring attention to the fact that the metric structure of M_s is an emergent (not fundamental) structure. It arises only after assigning a probability distribution P(X|Θ) to each state Θ.

III. INFORMATION-GEOMETRIC INDICATORS OF CHAOS WITHIN THE IGAC
The relevant indicators of chaoticity within the IGAC are the Ricci scalar curvature R_{M_s} (or, more correctly, the sectional curvature K_{M_S}), the Jacobi vector field intensity J_{M_S} and the IGE S_{M_s}, once the line element on the curved statistical manifold M_s underlying the entropic dynamics has been specified.

A. Ricci Scalar Curvature, Anisotropy and Compactness
Given the Fisher-Rao information metric, we use standard differential geometry methods applied to the space of probability distributions to characterize the geometric properties of M_s. Recall that the Ricci scalar curvature R is given by

R = g^{μν} R_{μν}, (14)

where g^{μν} g_{νρ} = δ^μ_ρ, so that g^{μν} = (g_{μν})^{−1}. The Ricci tensor R_{μν} is given by

R_{μν} = ∂_γ Γ^γ_{μν} − ∂_ν Γ^λ_{μλ} + Γ^γ_{μν} Γ^η_{γη} − Γ^η_{μγ} Γ^γ_{νη}. (15)

The Christoffel symbols Γ^ρ_{μν} appearing in the Ricci tensor are defined in the standard manner as

Γ^ρ_{μν} = (1/2) g^{ρσ} (∂_μ g_{σν} + ∂_ν g_{μσ} − ∂_σ g_{μν}). (16)

Using (12) and the definitions given above, we can show that the Ricci scalar curvature becomes

R_{M_s} = R^α_α = Σ_{ρ≠σ} K(e_ρ, e_σ) = −3l < 0. (17)

The scalar curvature is the sum of all sectional curvatures of planes spanned by pairs of orthonormal basis elements {e_ρ = ∂_{Θ^ρ}(p)} of the tangent space T_p M_s with p ∈ M_s,

K(a, b) = [R_{μνρσ} a^μ b^ν a^ρ b^σ] / [(g_{μσ} g_{νρ} − g_{μρ} g_{νσ}) a^μ b^ν a^ρ b^σ], a = Σ_ρ ⟨a, h^ρ⟩ e_ρ, (18)

where ⟨e_ρ, h^σ⟩ = δ^σ_ρ. Notice that the sectional curvatures completely determine the curvature tensor. From (17) we conclude that M_s is a 6l-dimensional statistical manifold of constant negative Ricci scalar curvature. A detailed analysis of the calculation of the Christoffel connection coefficients using the ED formalism for a four-dimensional manifold of Gaussians can be found in [10].
It can be shown that M_s is not a pseudosphere (maximally symmetric manifold). The first way this can be understood is from the fact that the Weyl projective curvature tensor [22] (or anisotropy tensor) W_{μνρσ}, defined by

W_{μνρσ} = R_{μνρσ} − [R_{M_s} / (n(n − 1))] (g_{νσ} g_{μρ} − g_{νρ} g_{μσ}), (19)

with n = 6l in the present case, is non-vanishing. In (19), the quantity R_{μνρσ} is the Riemann curvature tensor defined in the usual manner by

R^α_{βρσ} = ∂_σ Γ^α_{βρ} − ∂_ρ Γ^α_{βσ} + Γ^α_{λσ} Γ^λ_{βρ} − Γ^α_{λρ} Γ^λ_{βσ}. (20)

Considerations regarding the negativity of the Ricci curvature as a strong criterion of dynamical instability and the necessity of compactness of M_s in "true" chaotic dynamical systems would require additional investigation.
The issue of the symmetry of M_s can alternatively be understood from consideration of the sectional curvature. In view of (18), the negativity of the Ricci scalar implies the existence of expanding directions in the configuration space manifold M_s. Indeed, from (17) one may conclude that negative principal curvatures (extrema of sectional curvatures) dominate over positive ones. Thus, the negativity of the Ricci scalar is only a sufficient (not necessary) condition for local instability of the geodesic flow. For this reason, the negativity of the scalar provides a strong criterion of local instability. Scenarios may arise where negative sectional curvatures are present, but the positive ones prevail in the sum, so that the Ricci scalar is non-negative despite the instability of the flow in those directions. Consequently, the signs of the sectional curvatures are of primary significance for the proper characterization of chaos.
Yet another useful way to understand the anisotropy of M_s is the following. It is known that in n dimensions there are at most n(n + 1)/2 independent Killing vectors (directions of symmetry of the manifold). Since M_s is not a pseudosphere, the information metric tensor does not admit the maximum number of Killing vectors K_ν, defined by

L_K g_{μν} = D_μ K_ν + D_ν K_μ = 0, (21)

where D_μ, defined as

D_μ K_ν = ∂_μ K_ν − Γ^ρ_{νμ} K_ρ, (22)

is the covariant derivative operator with respect to the connection Γ defined in (16).
The Lie derivative L_K g_{μν} of the tensor field g_{μν} along a given direction K measures the intrinsic variation of the field along that direction (that is, the metric tensor is Lie transported along the Killing vector) [23]. Locally, a maximally symmetric space of Euclidean signature is either a plane, a sphere, or a hyperboloid, depending on the sign of R. In our case, none of these scenarios occur. As will be seen in what follows, this fact has a significant impact on the integration of the geodesic deviation equation on M_s. At this juncture, we emphasize it is known that the anisotropy of the manifold underlying the system dynamics plays a crucial role in the mechanism of instability. In particular, fluctuating sectional curvatures require that the manifold be anisotropic. However, the connection between curvature variations along geodesics and anisotropy is far from clear and is currently under investigation.
Krylov was the first to emphasize [24] the use of R < 0 in the context of an N-body system (a gas) interacting via Van der Waals forces, with the ultimate hope of understanding the relaxation process in a gas. However, Krylov neglected the problem of the compactness of the configuration space manifold, which is important for making inferences about the exponential mixing of geodesic flows [25]. Why is compactness so significant in the characterization of chaos? True chaos should be identified by the occurrence of two crucial features: 1) strong dependence on initial conditions and exponential divergence of the Jacobi vector field intensity, i.e., stretching of dynamical trajectories; 2) compactness of the configuration space manifold, i.e., folding of dynamical trajectories. Compactness [2, 26] is required in order to discard trivial exponential growths due to the unboundedness of the "volume" available to the dynamical system.
In other words, the folding is necessary for the dynamics to actually mix the trajectories, making it practically impossible, after a finite interval of time, to discriminate between trajectories which were very close to each other at the initial time. When the space is not compact, even in the presence of strong dependence on initial conditions, it could be possible in some instances (though not always) to distinguish among different trajectories originating within a small distance and then evolving subject to exponential instability.
As a final remark, we emphasize that it is known from IG [8] that there is a one-to-one relation between elements of the statistical manifold and the parameter space. More precisely, the statistical manifold M_s is homeomorphic to the parameter space D_Θ. This implies the existence of a continuous, bijective map h_{M_s, D_Θ},

h_{M_s, D_Θ}: M_S ∋ P(X|Θ) → Θ ∈ D_Θ, (23)

where h^{−1}_{M_s, D_Θ}(Θ) = P(X|Θ). The inverse image h^{−1}_{M_s, D_Θ} is the so-called homeomorphism map. In addition, since homeomorphisms preserve compactness, it is sufficient to restrict ourselves to a compact subspace of the parameter space D_Θ in order to ensure that M_S is itself compact.

B. Canonical Formalism
The geometrization of a Hamiltonian system by transforming it to a geodesic flow is a well-known technique of classical mechanics associated with the name of Jacobi [18]. The transformation to geodesic motion is obtained in two steps: 1) a conformal transformation of the metric; 2) a rescaling of the time parameter [27]. The reformulation of dynamics in terms of a geodesic problem allows the application of a wide range of well-known geometrical techniques in the investigation of the solution space and properties of the equations of motion. The power of the Jacobi reformulation is that all of the dynamical information is collected into a single geometric object - the manifold on which the geodesic flow is induced - in which all the available manifest symmetries are retained. For instance, the integrability of the system is connected with the existence of Killing vectors and tensors on this manifold [28, 29].
In this Section we study the trajectories of the system on M_s. We emphasize that ED can be derived from a standard principle of least action (of Maupertuis-Euler-Lagrange-Jacobi type) [6, 30]. The main differences are that the dynamics being considered here, namely ED, is defined on a space of probability distributions M_s, not on an ordinary linear space V, and that the standard coordinates q^μ of the system are replaced by statistical macrovariables Θ^μ. The geodesic equations for the macrovariables of the Gaussian ED model are given by

d²Θ^μ/dτ² + Γ^μ_{νρ} (dΘ^ν/dτ)(dΘ^ρ/dτ) = 0, (24)

with μ = 1, 2,..., 6l. Observe that the geodesic equations are nonlinear second-order coupled ordinary differential equations. They describe a reversible dynamics whose solution is the trajectory between an initial and a final macrostate. The trajectory can be equally well traversed in both directions.
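For each pair (μ_a^(α), σ_a^(α)) the metric (12) is diagonal, with g_{μμ} = 1/σ² and g_{σσ} = 2/σ², and (16) then yields only three distinct nonvanishing Christoffel symbols. We sketch the short computation here for the reader's convenience:

```latex
\Gamma^{\mu}{}_{\mu\sigma} = \Gamma^{\mu}{}_{\sigma\mu}
  = \tfrac{1}{2}\,g^{\mu\mu}\,\partial_{\sigma}g_{\mu\mu} = -\frac{1}{\sigma}\,, \qquad
\Gamma^{\sigma}{}_{\mu\mu}
  = -\tfrac{1}{2}\,g^{\sigma\sigma}\,\partial_{\sigma}g_{\mu\mu} = \frac{1}{2\sigma}\,, \qquad
\Gamma^{\sigma}{}_{\sigma\sigma}
  = \tfrac{1}{2}\,g^{\sigma\sigma}\,\partial_{\sigma}g_{\sigma\sigma} = -\frac{1}{\sigma}\,.
```

Substituting these coefficients into (24) reproduces the coupled system (25) for each (μ_a^(α), σ_a^(α)) pair.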
1. Geodesics on M_s

We determine the explicit form of (24) for the pairs of statistical coordinates (μ_a^(α), σ_a^(α)). Substituting the expressions of the Christoffel connection coefficients into (24), the geodesic equations for the macrovariables μ_a^(α) and σ_a^(α) associated to the microstate x_a^(α) become

d²μ_a^(α)/dτ² − (2/σ_a^(α)) (dμ_a^(α)/dτ)(dσ_a^(α)/dτ) = 0,

d²σ_a^(α)/dτ² − (1/σ_a^(α)) (dσ_a^(α)/dτ)² + (1/(2σ_a^(α))) (dμ_a^(α)/dτ)² = 0, (25)

with α = 1, 2,..., l and a = 1, 2, 3. This is a set of coupled ordinary differential equations, whose solutions are

μ_a^(α)(τ) = (B_a^(α))² β_a^(α) / [cosh(2β_a^(α)τ) − sinh(2β_a^(α)τ) + (B_a^(α))²(β_a^(α))²],

σ_a^(α)(τ) = B_a^(α) exp(−β_a^(α)τ) / [exp(−2β_a^(α)τ) + (B_a^(α))²(β_a^(α))²] + C_a^(α). (26)

The quantities B_a^(α), C_a^(α), β_a^(α) are real integration constants that can be evaluated once the boundary conditions are specified. We observe that since every geodesic is well-defined for all temporal parameters τ, M_s constitutes a geodesically complete manifold [31]. It is therefore a natural setting within which one may consider global questions and search for a weak criterion of chaos [2]. Furthermore, since |μ_a^(α)(τ)| < +∞ and |σ_a^(α)(τ)| < +∞ for all τ ∈ R⁺, all a = 1, 2, 3 and all α = 1,..., l, the parameter space {Θ} (homeomorphic to M_s) is compact. The compactness of the configuration space manifold M_s assures the folding mechanism of information-dynamical trajectories (the folding mechanism is a key feature of true chaos [2]).
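The qualitative behavior of the solutions (26) can be cross-checked by integrating the geodesic equations (25) directly. The sketch below (with initial data of our own arbitrary choosing) verifies the charge (dμ/dτ)/σ² conserved by the Killing vector ∂/∂μ, and the asymptotic exponential decay of σ(τ) characteristic of the hyperbolic flow:

```python
import numpy as np

# Right-hand side of the geodesic system (25) for one (mu, sigma) pair,
# with state = (mu, sigma, dmu/dtau, dsigma/dtau).
def rhs(s):
    mu, sigma, dmu, dsig = s
    return np.array([
        dmu,
        dsig,
        (2.0 / sigma) * dmu * dsig,
        dsig**2 / sigma - dmu**2 / (2.0 * sigma),
    ])

def rk4_step(s, h):
    k1 = rhs(s); k2 = rhs(s + 0.5 * h * k1)
    k3 = rhs(s + 0.5 * h * k2); k4 = rhs(s + h * k3)
    return s + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

state = np.array([0.0, 1.0, 0.1, -0.5])   # hypothetical initial macrostate
h, steps = 1e-3, 10000
A0 = state[2] / state[1]**2               # conserved charge mu'/sigma^2
sigmas = [state[1]]
for _ in range(steps):
    state = rk4_step(state, h)
    sigmas.append(state[1])
sigmas = np.array(sigmas)

# mu'/sigma^2 is conserved along the flow ...
assert abs(state[2] / state[1]**2 - A0) < 1e-6 * abs(A0)
# ... and sigma(tau) decays exponentially at late times; the rate approaches
# E/sqrt(2), where E^2 = (mu'^2 + 2 sigma'^2)/sigma^2 is the conserved speed.
rate = (np.log(sigmas[-1]) - np.log(sigmas[-2001])) / (2000 * h)
E = np.sqrt((0.1**2 + 2 * 0.5**2) / 1.0**2)
print(rate, -E / np.sqrt(2))
assert abs(rate + E / np.sqrt(2)) < 1e-2
```

The exponential decay of σ(τ), with μ(τ) converging to a finite value, is the numerical counterpart of the hyperbolic trajectories described by (26).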
It is known [30] that the Riemannian curvature of a manifold is intimately related to the behavior of the geodesics on it. If the Riemannian curvature of a manifold is negative, geodesics (initially parallel) rapidly diverge from one another. For the sake of simplicity, we assume very special initial conditions: B_a^(α) ≡ Ξ, β_a^(α) ≡ λ ∈ R⁺, C_a^(α) = 0, for all α = 1, 2,..., l and a = 1, 2, 3. However, the conclusions drawn can be generalized to more arbitrary initial conditions.

C. Exponential divergence of the Jacobi field intensity
The actual interest of the Riemannian formulation of the dynamics stems from the possibility of studying the instability of natural motions through the instability of the geodesics of a suitable manifold, a circumstance that has several advantages. First of all, a powerful mathematical tool exists to investigate the stability or instability of a geodesic flow: the Jacobi-Levi-Civita (JLC) equation for geodesic spread [32]. The JLC equation describes covariantly how nearby geodesics locally scatter. It is a familiar object both in Riemannian geometry and in theoretical physics (it is of fundamental interest in experimental General Relativity). Moreover, the JLC equation relates the stability or instability of a geodesic flow to the curvature properties of the ambient manifold, thus opening a wide and largely unexplored field of investigation of the connections among geometry, topology and geodesic instability, hence chaos.
Consider the behavior of the one-parameter family of neighboring geodesics F_{G_{M_s}}(λ) ≡ {Θ^μ_{M_s}(τ; λ)}_{μ=1,..,6l} with λ ∈ R⁺, where

μ_a^(α)(τ; λ) = Ξ²λ / [cosh(2λτ) − sinh(2λτ) + Ξ²λ²],

σ_a^(α)(τ; λ) = Ξ [cosh(λτ) − sinh(λτ)] / [cosh(2λτ) − sinh(2λτ) + Ξ²λ²], (27)

with α = 1, 2,..., l and a = 1, 2, 3. The relative geodesic spread on a (non-maximally symmetric) curved manifold such as M_s is characterized by the Jacobi-Levi-Civita equation, the natural tool to tackle dynamical chaos [23, 32],

D²δΘ^μ/Dτ² + R^μ_{νρσ} (∂Θ^ν/∂τ) δΘ^ρ (∂Θ^σ/∂τ) = 0, (28)

where the covariant derivative D²δΘ^μ/Dτ² in (28) is defined as [33]

D²δΘ^μ/Dτ² = d²δΘ^μ/dτ² + 2Γ^μ_{αβ} (dδΘ^α/dτ)(dΘ^β/dτ) + Γ^μ_{αβ} δΘ^α (d²Θ^β/dτ²) + Γ^μ_{αβ,ν} (dΘ^ν/dτ)(dΘ^β/dτ) δΘ^α + Γ^μ_{αβ} Γ^α_{ρσ} (dΘ^σ/dτ)(dΘ^β/dτ) δΘ^ρ, (29)

and the Jacobi vector field J^μ is given by [34]

J^μ ≡ δΘ^μ := δ_λ Θ^μ = (∂Θ^μ(τ; λ)/∂λ)|_{τ=const} δλ. (30)

Notice that the JLC equation appears intractable already at rather small l.
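For constant negative sectional curvature, the geodesic spread follows a sinh law, which can be checked numerically. The following sketch (with an illustrative curvature K = −1 of our own choosing, not the actual sectional curvature of M_s) integrates the scalar Jacobi equation and compares with the closed-form unstable solution:

```python
import numpy as np

K = -1.0        # illustrative constant negative sectional curvature
omega0 = 1.0    # initial conditions: J(0) = 0, dJ/dtau(0) = omega0

# Integrate the scalar Jacobi equation J'' + K J = 0 with RK4,
# state y = (J, dJ/dtau).
f = lambda y: np.array([y[1], -K * y[0]])
y = np.array([0.0, omega0])
tau, n = 5.0, 5000
h = tau / n
for _ in range(n):
    k1 = f(y); k2 = f(y + 0.5 * h * k1)
    k3 = f(y + 0.5 * h * k2); k4 = f(y + h * k3)
    y = y + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Unstable closed-form solution: J(tau) = omega0 sinh(sqrt(-K) tau)/sqrt(-K),
# i.e., exponential divergence of nearby geodesics for K < 0.
J_exact = omega0 * np.sinh(np.sqrt(-K) * tau) / np.sqrt(-K)
print(y[0], J_exact)
assert abs(y[0] - J_exact) < 1e-8 * J_exact
```

This is precisely the behavior exhibited, in the isotropic reduction, by the unstable solutions discussed next.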
For isotropic manifolds, the JLC equation can be reduced to the simple form [32],

$$\frac{D^2 J^{\mu}}{D\tau^2} + K J^{\mu} = 0, \quad \mu = 1,\ldots,6l, \tag{31}$$

where $K$ is the constant value assumed throughout the manifold by the sectional curvature. The sectional curvature of the manifold $\mathcal{M}_s$ is the $6l$-dimensional generalization of the Gaussian curvature of two-dimensional surfaces in $\mathbb{R}^3$. If $K < 0$, unstable solutions of equation (31) are of the form

$$J(\tau) = \frac{1}{\sqrt{-K}}\,\omega(0)\sinh\left(\sqrt{-K}\,\tau\right), \tag{32}$$

once the initial conditions are assigned as $J(0) = 0$, $\frac{dJ(0)}{d\tau} = \omega(0)$ and $K < 0$. Equation (28) forms a system of $6l$ coupled ordinary differential equations, linear in the components of the deviation vector field (30) but nonlinear in the derivatives of the metric (11). It describes the linearized geodesic flow: the linearization ignores the relative velocity of the geodesics. When the geodesics are neighboring but their relative velocity is arbitrary, the corresponding nonlinear geodesic deviation equation is the so-called generalized Jacobi equation [35, 36]. The nonlinearity is due to the existence of velocity-dependent terms in the system. Neighboring geodesics accelerate relative to each other at a rate directly measured by the curvature tensor $R_{\alpha\beta\gamma\delta}$. Substituting (27) in (28) and neglecting the exponentially decaying terms in $\delta\Theta^{\mu}$ and its derivatives, integration of (28) leads to the following asymptotic exponential growth of the Jacobi vector field intensity (a classical feature of chaos),

$$J_{\mathcal{M}_s} = \|J\| = \left(g_{\mu\nu}J^{\mu}J^{\nu}\right)^{1/2} \overset{\tau\to\infty}{\approx} 3l\,e^{\lambda\tau}. \tag{33}$$

Finally, we point out that in our approach the quantity $\lambda_J$,

$$\lambda_J \overset{\text{def}}{=} \lim_{\tau\to\infty}\frac{1}{\tau}\ln\frac{\left\|J_{\mathcal{M}_s}(\tau)\right\|}{\left\|J_{\mathcal{M}_s}(0)\right\|}, \tag{34}$$

would play the role of the conventional Lyapunov exponents.

IV. LINEARITY OF THE INFORMATION GEOMETRODYNAMICAL ENTROPY
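To make the role of (32) and (34) concrete, the short sketch below (ours, not part of the paper; function names are hypothetical) evaluates a finite-time version of the ratio in (34) for the unstable solution (32), using a reference time $\tau_0 > 0$ since $J(0) = 0$, and checks that it approaches $\sqrt{-K}$, the constant playing the role of a Lyapunov exponent.

```python
import math

def jacobi_norm(tau, K=-1.0, omega0=1.0):
    # Unstable solution (32): J(tau) = (omega0/sqrt(-K)) sinh(sqrt(-K) tau), valid for K < 0
    s = math.sqrt(-K)
    return (omega0 / s) * math.sinh(s * tau)

def lyapunov_like(tau, tau0=1.0, K=-1.0):
    # Finite-time version of eq. (34), with tau0 > 0 as reference point since J(0) = 0
    return math.log(jacobi_norm(tau, K) / jacobi_norm(tau0, K)) / tau

print(round(lyapunov_like(200.0), 3))  # approaches sqrt(-K) = 1 for large tau
```

The estimate converges slowly (the $\sinh$ prefactors contribute an $O(1/\tau)$ correction), which is why the limit in (34) is taken at $\tau \to \infty$.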
We investigate the stability of the trajectories of the ED model considered on $\mathcal{M}_s$. It is known [30] that the Riemannian curvature of a manifold is closely connected with the behavior of the geodesics on it. If the Riemannian curvature of a manifold is negative, geodesics (initially parallel) rapidly diverge from one another. For the sake of simplicity, we assume very special initial conditions: $B^{(\alpha)}_a \equiv \Xi$, $\beta^{(\alpha)}_a \equiv \lambda \in \mathbb{R}^{+}$, $C^{(\alpha)}_a = 0$, $\forall\alpha = 1, 2,\ldots, l$ and $a = 1, 2, 3$. However, the conclusion we reach can be generalized to more arbitrary initial conditions. Recall that $\mathcal{M}_s$ is the space of probability distributions $\{P(\vec{X}|\vec{\Theta})\}$ labeled by $6l$ statistical parameters $\vec{\Theta}$. These parameters are the coordinates of the point $P$, and in these coordinates a volume element $dV_{\mathcal{M}_s}$ reads,

$$dV_{\mathcal{M}_s} = \sqrt{g}\,d^{6l}\vec{\Theta} = \prod_{\alpha=1}^{l}\prod_{a=1}^{3}\frac{\sqrt{2}}{\left(\sigma^{(\alpha)}_a\right)^2}\,d\mu^{(\alpha)}_a\,d\sigma^{(\alpha)}_a. \tag{35}$$

The volume of an extended region $\Delta V_{\mathcal{M}_s}(\tau;\lambda)$ of $\mathcal{M}_s$ is defined by,

$$\Delta V_{\mathcal{M}_s}(\tau;\lambda) \overset{\text{def}}{=} \prod_{\alpha=1}^{l}\prod_{a=1}^{3}\int_{\mu^{(\alpha)}_a(0)}^{\mu^{(\alpha)}_a(\tau)}\int_{\sigma^{(\alpha)}_a(0)}^{\sigma^{(\alpha)}_a(\tau)}\frac{\sqrt{2}}{\left(\sigma^{(\alpha)}_a\right)^2}\,d\mu^{(\alpha)}_a\,d\sigma^{(\alpha)}_a, \tag{36}$$

where $\mu^{(\alpha)}_a(\tau)$ and $\sigma^{(\alpha)}_a(\tau)$ are given in (27) and where the scalar $\lambda$ is the chosen quantity used to define the one-parameter family of geodesics $\mathcal{F}_{G_{\mathcal{M}_s}}(\lambda) \overset{\text{def}}{=} \{\Theta^{\mu}_{\mathcal{M}_s}(\tau;\lambda)\}^{\lambda\in\mathbb{R}^{+}}_{\mu=1,\ldots,6l}$. The quantity that encodes relevant information about the stability of neighboring volume elements is the average volume $\bar{V}_{\mathcal{M}_s}(\tau;\lambda)$,

$$\bar{V}_{\mathcal{M}_s}(\tau;\lambda) \equiv \langle\Delta V_{\mathcal{M}_s}(\tau';\lambda)\rangle_{\tau} \overset{\text{def}}{=} \frac{1}{\tau}\int_{0}^{\tau}\Delta V_{\mathcal{M}_s}(\tau';\lambda)\,d\tau' \overset{\tau\to\infty}{\approx} e^{3l\lambda\tau}. \tag{37}$$

The ratio $\bar{V}_{\mathcal{M}_s}(\tau)/\bar{V}_{\mathcal{M}_s}(0)$, with $\bar{V}_{\mathcal{M}_s}(\tau)$ in (37) representing the temporal average of the $3l$-fold integral over trajectories of maximum probability (geodesics), is a measure of the number of accessible macrostates in the configuration (statistical) manifold $\mathcal{M}_s$ after a finite temporal increment $\tau$.
In other words, $\bar{V}_{\mathcal{M}_s}(\tau)$ can be interpreted as the temporal evolution of the system's uncertainty volume $\bar{V}_{\mathcal{M}_s}(0)$. For instance, $\bar{V}_{\mathcal{M}_s}(0)$ may be a spherical volume of initial points whose center is a given point on the attractor and whose surface consists of configuration points from nearby trajectories. An attractor is a subset of the manifold $\mathcal{M}_s$ toward which almost all sufficiently close trajectories converge asymptotically, covering it densely as time goes on. Strange attractors are called chaotic attractors. Chaotic attractors have at least one finite positive Lyapunov exponent [37]. As the center of $\bar{V}_{\mathcal{M}_s}(0)$ and its surface points evolve in time, the spherical volume becomes an ellipsoid with principal axes in the directions of contraction and expansion. The average rates of expansion and contraction along the principal axes are the Lyapunov exponents [38].

The asymptotic regime of diffusive evolution in (37) describes the exponential increase of average volume elements on $\mathcal{M}_s$. The exponential instability characteristic of chaos forces the system to rapidly explore large areas (volumes) of the statistical manifold. From equation (37), we notice that the parameter $\lambda$ characterizes the exponential growth rate of average statistical volumes $\bar{V}_{\mathcal{M}_s}(\tau;\lambda)$ in $\mathcal{M}_s$. This suggests that $\lambda$ may play the same role ordinarily played by Lyapunov exponents [39]. It is interesting to note that this asymptotic behavior appears also in the conventional description of quantum chaos, where the von Neumann entropy increases linearly at a rate determined by the Lyapunov exponents. The linear increase of entropy as a quantum chaos criterion was introduced by Zurek and Paz [4]. In our information-geometric approach, a relevant quantity that can be useful to study the degree of instability characterizing the ED model is the information-geometrodynamical entropy (IGE) defined as [10],

$$\mathcal{S}_{\mathcal{M}_s} \overset{\text{def}}{=} \lim_{\tau\to\infty}\log\bar{V}_{\mathcal{M}_s}(\tau;\lambda). \tag{38}$$

The IGE is intended to capture the temporal complexity (chaoticity) of ED theoretical models on curved statistical manifolds $\mathcal{M}_s$ by considering the asymptotic temporal behavior of the average statistical volumes occupied by the evolving macrovariables labelling points on $\mathcal{M}_s$. Substituting (37) in (38), we obtain

$$\mathcal{S}_{\mathcal{M}_s} = \lim_{\tau\to\infty}\log\left[\frac{1}{\tau}\int_{0}^{\tau}\prod_{\alpha=1}^{l}\prod_{a=1}^{3}\int_{\mu^{(\alpha)}_a(0)}^{\mu^{(\alpha)}_a(\tau')}\int_{\sigma^{(\alpha)}_a(0)}^{\sigma^{(\alpha)}_a(\tau')}\frac{\sqrt{2}}{\left(\sigma^{(\alpha)}_a\right)^2}\,d\mu^{(\alpha)}_a\,d\sigma^{(\alpha)}_a\,d\tau'\right] \overset{\tau\to\infty}{\approx} 3l\lambda\tau. \tag{39}$$

Before discussing the meaning of (39), recall that in conventional approaches to chaos the notion of entropy is introduced, in both classical and quantum physics, as the missing information about the system's fine-grained state [5, 40]. For a classical system, suppose that the phase space is partitioned into very fine-grained cells of uniform volume $\Delta v$, labelled by an index $j$. If one does not know which cell the system occupies, one assigns probabilities $p_j$ to the various cells; equivalently, in the limit of infinitesimal cells, one can use a phase-space density $\rho(X_j) = p_j/\Delta v$. Then, in a classical chaotic evolution, the asymptotic expression of the information needed to characterize a particular coarse-grained trajectory out to time $\tau$ is given by the Shannon information entropy (measured in bits),

$$\mathcal{S}^{(\text{chaotic})}_{\text{classical}} = -\int dX\,\rho(X)\log\left(\rho(X)\Delta v\right) = -\sum_{j}p_j\log p_j \sim K\tau, \tag{40}$$

where $\rho(X)$ is the phase-space density and $p_j = \rho(X_j)\Delta v$ is the probability for the corresponding coarse-grained trajectory. $\mathcal{S}^{(\text{chaotic})}_{\text{classical}}$ is the missing information about which fine-grained cell the system occupies. The quantity $K$ represents the linear rate of information increase and is called the Kolmogorov-Sinai entropy (or metric entropy); $K$ is the sum of the positive Lyapunov exponents, $K = \sum_j \lambda_j$. $K$ quantifies the degree of classical chaos.
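The linear entropy growth in (40) is easy to reproduce numerically. The sketch below (our illustration, not taken from the paper) coarse-grains the chaotic doubling map $x \to 2x \bmod 1$, whose Kolmogorov-Sinai entropy is $\log 2$ per step (one bit), and tracks the Shannon entropy of an ensemble initially confined to a single cell of the partition; the entropy grows by about one bit per iteration until the ensemble saturates the partition.

```python
import math, random

def shannon_bits(samples, cells=4096):
    # Shannon entropy (in bits) of the coarse-grained occupation probabilities p_j, as in eq. (40)
    counts = {}
    for x in samples:
        j = min(int(x * cells), cells - 1)
        counts[j] = counts.get(j, 0) + 1
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
pts = [random.uniform(0.0, 2.0 ** -12) for _ in range(200000)]  # ensemble inside one cell
entropies = []
for t in range(9):
    entropies.append(shannon_bits(pts))
    pts = [(2.0 * x) % 1.0 for x in pts]                         # one doubling-map step
rate = (entropies[8] - entropies[0]) / 8.0
print(round(rate, 2))  # ~1 bit/step: the KS rate of the doubling map is log 2
```

The slope of the entropy versus time is the numerical counterpart of $K$ in (40); for the doubling map it equals the single positive Lyapunov exponent $\ln 2$ (one bit per step).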
The Kolmogorov-Sinai entropy provides a measure of the rate at which information is lost by an evolving chaotic system ($K$ has dimension entropy/time) and has its roots in the definition of the Shannon entropy. It is worthwhile emphasizing that the quantity that grows asymptotically as $K\tau$ is really the average of the information on the left side of equation (40). This distinction can however be ignored provided we assume that the chaotic system has roughly constant Lyapunov exponents over the accessible region of phase space. In quantum mechanics the fine-grained alternatives are normalized state vectors in Hilbert space. From a set of probabilities for various state vectors, one can construct a density operator

$$\hat{\rho} = \sum_{j}\lambda_j\left|\psi_j\right\rangle\left\langle\psi_j\right|, \quad \hat{\rho}\left|\psi_j\right\rangle = \lambda_j\left|\psi_j\right\rangle. \tag{41}$$

The normalization of the density operator, $\text{tr}(\hat{\rho}) = 1$, implies that the eigenvalues make up a normalized probability distribution. The von Neumann entropy (the natural generalization of both Boltzmann's and Shannon's entropy) of the density operator $\hat{\rho}$ (measured in bits) [41],

$$\mathcal{S}^{(\text{chaotic})}_{\text{quantum}} = -\text{tr}\left(\hat{\rho}\log\hat{\rho}\right) = -\sum_{j}\lambda_j\log\lambda_j \sim K_q\tau, \tag{42}$$

can be thought of as the missing information about which eigenvector the system is in. Entropy quantifies the degree of unpredictability about the system's fine-grained state. In quantum mechanics, the von Neumann entropy plays a role analogous to that played by the Shannon entropy in classical probability theory. They are both functionals of the state, are both monotone under a relevant kind of mapping, and can both be singled out uniquely by natural requirements. The von Neumann entropy reduces to the Shannon entropy for diagonal density matrices. However, in general the von Neumann entropy is a subtler object than its classical counterpart.
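Equation (42) says that the von Neumann entropy is just the Shannon entropy of the spectrum of $\hat{\rho}$. The minimal sketch below (ours; a $2\times 2$ example with the eigenvalues obtained in closed form) makes this concrete for a mixed state whose density matrix is not diagonal.

```python
import math

def eig2(rho):
    # Eigenvalues of a 2x2 symmetric density matrix from its trace and determinant
    tr = rho[0][0] + rho[1][1]
    det = rho[0][0] * rho[1][1] - rho[0][1] * rho[1][0]
    d = math.sqrt(max(tr * tr - 4.0 * det, 0.0))
    return (tr + d) / 2.0, (tr - d) / 2.0

def von_neumann_bits(rho):
    # Eq. (42): S = -sum_j lambda_j log2 lambda_j, the Shannon entropy of the spectrum
    return -sum(l * math.log2(l) for l in eig2(rho) if l > 1e-12)

rho = [[0.5, 0.3], [0.3, 0.5]]   # mixed state with eigenvalues 0.8 and 0.2
print(round(von_neumann_bits(rho), 5))  # 0.72193 bits
```

A pure state (one eigenvalue equal to 1) gives zero entropy, reflecting complete knowledge of the fine-grained state.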
The quantity $K_q$ in (42) can be interpreted as the non-commutative (quantum theory is a non-commutative probability theory) quantum analog of the Kolmogorov-Sinai dynamical entropy, the so-called quantum dynamical entropy [43]. Examples of quantum dynamical entropies applied to quantum chaos and quantum information theory are the Alicki-Fannes (AF) entropy [44] and the Connes-Narnhofer-Thirring (CNT) entropy [45]. Both the AF and the CNT entropy coincide with the KS entropy on classical dynamical systems. They also coincide on finite-dimensional quantum systems. However, they differ when moving from finite to infinite quantum systems.

Recall that decoherence is the loss of phase coherence between the set of preferred quantum states in the Hilbert space of the system due to the interaction with the environment. Moreover, decoherence induces transitions from quantum to classical systems. Therefore, classicality is an emergent property of an open quantum system. Motivated by such considerations, Zurek and Paz investigated the implications of the process of decoherence for quantum chaos. They considered a chaotic system, a single unstable harmonic oscillator characterized by a potential $V(x) = -\frac{\lambda^2 x^2}{2}$ ($\lambda$ is the Lyapunov exponent), coupled to an external environment. In the reversible classical limit [42], the von Neumann entropy of such a system increases linearly at a rate determined by the Lyapunov exponent,

$$\mathcal{S}^{(\text{chaotic})}_{\text{quantum}}(\text{Zurek-Paz}) \overset{\tau\to\infty}{\sim} \lambda\tau. \tag{43}$$

Notice that the consideration of $3l$ uncoupled identical unstable harmonic oscillators characterized by potentials $V_i(x) = -\frac{\lambda_i^2 x^2}{2}$ ($\lambda_i = \lambda_j$; $i, j = 1, 2,\ldots, 3l$) would simply lead to

$$\mathcal{S}^{(\text{chaotic})}_{\text{quantum}}(\text{Zurek-Paz}) \overset{\tau\to\infty}{\sim} 3l\lambda\tau. \tag{44}$$

The resemblance of equations (39) and (44) is remarkable, and a more detailed discussion of this analogy is presented in [16], where an information-geometric analogue of the Zurek-Paz quantum chaos criterion in the classical reversible limit is proposed [4].
This analogy is illustrated by applying the IGAC to a set of $n$ uncoupled three-dimensional anisotropic inverted harmonic oscillators characterized by an Ohmic distributed frequency spectrum.

The entropy-like quantity $\mathcal{S}_{\mathcal{M}_s}$ in (39) is the asymptotic limit of the natural logarithm of the statistical weight $\langle\Delta V_{\mathcal{M}_s}\rangle_{\tau}$ defined on $\mathcal{M}_s$, and it grows linearly in time, a quantum feature of chaos. Indeed, equation (39) may be considered the information-geometric analog of the Zurek-Paz chaos criterion. In our chaotic ED Gaussian model, the IGE production rate is determined by the information-geometric parameter $\lambda$ characterizing the exponential growth rate of average statistical volumes $\bar{V}_{\mathcal{M}_s}(\tau;\lambda)$ in $\mathcal{M}_s$.

In conclusion, for the example under investigation, we have

$$\mathcal{R}_{\mathcal{M}_s} = -3l, \quad \mathcal{S}_{\mathcal{M}_s} \overset{\tau\to\infty}{\approx} 3l\lambda\tau, \quad J_{\mathcal{M}_s} \overset{\tau\to\infty}{\approx} 3l\,e^{\lambda\tau}. \tag{45}$$

The IGE grows linearly as a function of the number of Gaussian-distributed microstates of the system. This supports the fact that $\mathcal{S}_{\mathcal{M}_s}$ may be a useful measure of temporal complexity [46]. Furthermore, these three indicators of chaoticity, the Ricci scalar curvature $\mathcal{R}_{\mathcal{M}_s}$, the information-geometric entropy $\mathcal{S}_{\mathcal{M}_s}$ and the Jacobi vector field intensity $J_{\mathcal{M}_s}$, are proportional to $3l$, the dimension of the microspace with microstates $\{\vec{X}\}$ underlying our chaotic ED Gaussian model. This proportionality leads to the conclusion that there is a substantial link among these information-geometric measures of chaoticity, since they are all extensive functions of the dimensionality of the microspace underlying the macroscopic chaotic entropic dynamics (see (45)). Curvature, information-geometrodynamical entropy and Jacobi field intensity are linked within our formalism. We are aware that our findings are reliable only under the restrictive assumption of Gaussianity. However, we believe that with some additional technical machinery more general conclusions can be achieved and this connection among indicators of chaoticity may be strengthened.

V. INFORMATION GEOMETRY OF QUANTUM ENERGY LEVEL STATISTICS: AN APPLICATION TO ENTANGLEMENT IN QUANTUM SPIN CHAINS
In what follows, we apply the IGAC to study the entropic dynamics on curved statistical manifolds induced by classical probability distributions of common use in the study of regular and chaotic quantum energy level statistics. In doing so, we suggest an information-geometric characterization of a special class of regular and chaotic quantum energy level statistics. More precisely, we present an information-geometric analogue of the logarithmic and linear entanglement entropy growth in regular and quantum chaotic spin chains, respectively.
A. The Information Geometry of the Poisson and Wigner-Dyson Distributions
The theory of quantum chaos (the quantum mechanics of systems whose classical dynamics are chaotic) is not primarily related to few-body physics. Indeed, in real physical systems such as many-electron atoms and heavy nuclei, the origin of complex behavior is the quite strong interaction among many particles. To deal with such systems, a famous statistical approach has been developed, based upon Random Matrix Theory (RMT) [47]. The main idea of this approach is to neglect the detailed description of the motion and to treat these systems statistically, bearing in mind that the interaction among particles is so complex and strong that generic properties are expected to emerge. The simplest models of RMT are full random matrices of a given symmetry. One of the main results of RMT is the prediction of a specific kind of correlations in the energy spectra of complex quantum systems. Among the many characteristics of these correlations, the most popular one is the distribution of spacings between nearest energy levels in the spectra. The exact analytical expression of this distribution is very complicated; instead, one uses the so-called Wigner-Dyson surmise (a very simple expression which gives a very good approximation to the exact result). The known manifestation of quantum chaos is the so-called Wigner-Dyson (WD) distribution for spacings between neighboring levels in the spectrum. In the other limiting case of completely integrable (regular) systems, the distribution turns out to be very close to the Poissonian one. A distinctive property of the WD distribution is the repulsion between neighboring levels in the spectra; the degree of this repulsion (linear, quadratic or quartic) depends on the symmetry of the random matrices.
For systems without time reversal invariance, the relevant ensemble of random matrices is the Gaussian Unitary Ensemble (GUE) [47], characterized by the probability distribution

$$p_{\text{GUE}}(\theta) = \frac{32}{\pi^2}\,\theta^2\exp\left(-\frac{4\theta^2}{\pi}\right), \quad \text{(quadratic repulsion)} \tag{46}$$

where $\theta$ is the spacing of neighboring energy levels in units of the mean spacing. For systems invariant with respect to time reversal, the ensemble is the Gaussian Orthogonal Ensemble (GOE) [47],

$$p_{\text{GOE}}(\theta) = \frac{\pi}{2}\,\theta\exp\left(-\frac{\pi\theta^2}{4}\right), \quad \text{(linear repulsion).} \tag{47}$$

For systems with time reversal invariance but with half-integer spin, the energy spectrum is described by the Gaussian Symplectic Ensemble (GSE) of random matrices [47],

$$p_{\text{GSE}}(\theta) = \frac{2^{18}}{3^6\pi^3}\,\theta^4\exp\left(-\frac{64\theta^2}{9\pi}\right), \quad \text{(quartic repulsion).} \tag{48}$$

Equations (46), (47) and (48) are standard accepted conjectures. Besides energy level statistics in the extreme integrable (Poisson) and chaotic (Wigner-Dyson) regimes, there is also energy level statistics in the mixed regime, i.e., for systems having a mixed classical dynamics where regular and chaotic regions coexist in phase space. A convenient and often successful parametrization of the correct probability distribution in the transition region between the Poisson and WD distributions is provided by the Brody interpolation formula [48],

$$p^{(\text{Brody})}_{\beta}(\theta) = \gamma\,(\beta+1)\,\theta^{\beta}\exp\left(-\gamma\theta^{\beta+1}\right), \tag{49}$$

where $\gamma = \left\{\Gamma\left[\frac{\beta+2}{\beta+1}\right]\right\}^{\beta+1}$ and $\Gamma(\beta)$ is the Euler Gamma function. This distribution is normalized and, by construction, has mean spacing $\langle\theta\rangle = 1$. We recover the Poisson case by taking $\beta = 0$, while the Wigner case is recovered for $\beta = 1$. However, a criticism of the Brody distribution is the lack of a first-principles justification for its validity. The fact remains that it does fit the specific results found when considering explicit model systems. It is essentially an ad hoc one-parameter family of distributions and has no deep physical background, but it does interpolate between Poisson and Wigner-Dyson in a simple, effective manner.
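The defining properties of the surmises (46)-(48) and of the Brody formula (49) — normalization and unit mean spacing — can be checked numerically. The sketch below is ours (the quadrature cutoff and step count are arbitrary choices), not part of the paper.

```python
import math

def p_goe(s):   # eq. (47): linear repulsion
    return (math.pi / 2.0) * s * math.exp(-math.pi * s * s / 4.0)

def p_gue(s):   # eq. (46): quadratic repulsion
    return (32.0 / math.pi ** 2) * s * s * math.exp(-4.0 * s * s / math.pi)

def p_gse(s):   # eq. (48): quartic repulsion
    return (2.0 ** 18 / (3.0 ** 6 * math.pi ** 3)) * s ** 4 * math.exp(-64.0 * s * s / (9.0 * math.pi))

def brody(s, beta):   # eq. (49); beta = 0 -> Poisson, beta = 1 -> Wigner (GOE)
    gamma = math.gamma((beta + 2.0) / (beta + 1.0)) ** (beta + 1.0)
    return gamma * (beta + 1.0) * s ** beta * math.exp(-gamma * s ** (beta + 1.0))

def moment(p, k, smax=12.0, n=120000):
    # Midpoint-rule integral of s^k p(s) over [0, smax]; k=0 gives the norm, k=1 the mean spacing
    h = smax / n
    return sum(p((i + 0.5) * h) * ((i + 0.5) * h) ** k for i in range(n)) * h

for p in (p_goe, p_gue, p_gse):
    print(round(moment(p, 0), 4), round(moment(p, 1), 4))  # both ~1 for each surmise
```

For $\beta = 1$ the Brody formula reproduces the GOE surmise exactly, since $\gamma = \Gamma(3/2)^2 = \pi/4$.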
Our objective here is to apply our information-geometric formalism (based on statistical inference methods) to the Wigner-Dyson and Poisson probability distributions.

Most probability distributions arise from the maximum entropy formalism as a result of some simple statements concerning averages. Not all distributions are generated in this way. Some distributions are generated by combining the results of simple cases (the multinomial from a binomial). Other distributions are found as a result of a change of variable (the Cauchy distribution). For instance, the Weibull distribution [49] can be obtained from an exponential distribution as a result of a power law transformation. Assume our knowledge of the microstate $x$ is encoded in an exponential distribution,

$$p(x|\theta) = \frac{1}{\theta}e^{-\frac{x}{\theta}}, \tag{50}$$

where $x$ may be considered the spacing of the energy levels while $\theta$ is the average spacing, $\theta = \langle x\rangle$. Note that the study of probability distributions could, in principle, be restricted to the exponential type, since an arbitrary distribution can be represented in exponential form. It is said that the exponential family of distributions is dense in the totality of probability distributions [50]. We can re-express $x \in \mathcal{X}$ in $p(x|\theta)$ in terms of another random variable $y = f(x) \in \mathcal{Y}$, assuming $f$ is an invertible mapping. For instance, consider the power law transformation

$$x \to y = f(x) = \left(\frac{x}{\zeta}\right)^{\frac{1}{n}}. \tag{51}$$

We clearly have,

$$p_{\text{old}}(x) \to \hat{p}_{\text{new}}(y) = \int_{\mathcal{X}}dx\,p_{\text{old}}(x)\,\delta\left(y - f(x)\right) = \int_{\mathcal{X}}dx\,p_{\text{old}}(x)\,\frac{1}{\left|\frac{\partial f}{\partial x}\right|}\,\delta\left(f^{-1}(y) - x\right) = \left[\left|\frac{\partial f}{\partial x}\right|^{-1}p_{\text{old}}(x)\right]_{x=f^{-1}(y)}. \tag{52}$$

Therefore, considering (50) and (51), equation (52) leads to

$$\hat{p}_{\text{new}}(y) = \frac{n\zeta}{\theta}\,y^{n-1}\,e^{-\frac{\zeta}{\theta}y^{n}}. \tag{53}$$

It is worthwhile emphasizing that since $\left|\frac{\partial f}{\partial x}\right|$ does not depend on $\theta$ and since $\int_{\mathcal{Y}}dy = \int_{\mathcal{X}}dx\left|\frac{\partial f}{\partial x}\right|$, we have

$$\int_{\mathcal{Y}}dy\,\hat{p}_{\text{new}}(y)\,\partial_{\mu}\log\hat{p}_{\text{new}}(y)\,\partial_{\nu}\log\hat{p}_{\text{new}}(y) = \int_{\mathcal{X}}dx\,p_{\text{old}}(x)\,\partial_{\mu}\log p_{\text{old}}(x)\,\partial_{\nu}\log p_{\text{old}}(x). \tag{54}$$

Equation (54) leads us to conclude that the Fisher-Rao information metric $g_{\mu\nu}$ is invariant under transformations of the random variable. For the sake of completeness, let us show that the information metric is also covariant under reparametrization. Suppose that $(\hat{\theta}^{\mu})$ is a new set of coordinates, specified in terms of the old set through the invertible relationship $\hat{\theta}^{\mu} = \hat{\theta}^{\mu}(\theta)$. Defining $\hat{p}_{\hat{\theta}}(x) \equiv p_{\theta(\hat{\theta})}(x)$, we are then able to compute the new metric tensor $\hat{g}_{\mu\nu}(\hat{\theta})$ in terms of $g_{\mu\nu}(\theta)$. Indeed, since $\frac{\partial}{\partial\hat{\theta}^{\mu}}\hat{p}_{\hat{\theta}} = \frac{\partial\theta^{\nu}}{\partial\hat{\theta}^{\mu}}\frac{\partial}{\partial\theta^{\nu}}p_{\theta(\hat{\theta})}$, we obtain

$$\hat{g}_{\mu\nu}\left(\hat{\theta}\right) = \left[\frac{\partial\theta^{\rho}}{\partial\hat{\theta}^{\mu}}\frac{\partial\theta^{\sigma}}{\partial\hat{\theta}^{\nu}}\,g_{\rho\sigma}(\theta)\right]_{\theta=\theta(\hat{\theta})}. \tag{55}$$

Letting $\Lambda = \left(\frac{\theta}{\zeta}\right)^{\frac{1}{n}}$, from (53) we obtain the Weibull probability distribution,

$$p_{\text{Weibull}}(y|\Lambda) = \frac{n}{\Lambda}\left(\frac{y}{\Lambda}\right)^{n-1}e^{-\left(\frac{y}{\Lambda}\right)^{n}}. \tag{56}$$

Moreover, letting $n = 2$, $y = \Delta$ and $\Lambda = \frac{2D}{\sqrt{\pi}}$, from (56) we obtain the standard Wigner-Dyson distribution,

$$p_{\text{Wigner-Dyson}}(\Delta|D) = \frac{\pi\Delta}{2D^2}\,e^{-\frac{\pi\Delta^2}{4D^2}}, \quad D = \frac{\sqrt{\pi}}{2}\left(\frac{\theta}{\zeta}\right)^{\frac{1}{2}}. \tag{57}$$

In conventional notation, $\Delta$ is the spacing between two neighboring energy levels and $D$ is the average spacing [51]. Recall that the Fisher-Rao information metric $G^{(\text{P})}_{\mu\nu}(\theta)$ of the Poissonian probability distribution $p(x|\theta)$ is defined as,

$$G^{(\text{P})}_{\mu\nu}(\theta) = \int dx\,p(x|\theta)\,\partial_{\mu}\log p(x|\theta)\,\partial_{\nu}\log p(x|\theta), \quad \partial_{\mu} = \frac{\partial}{\partial\theta^{\mu}}, \tag{58}$$

where $p(x|\theta)$ is given by,

$$p(x|\theta) = \frac{1}{\theta}\exp\left(-\frac{x}{\theta}\right). \tag{59}$$

The Poisson line element $(ds^2)_{\text{Poisson}}$ is defined as,

$$(ds^2)_{\text{Poisson}} = G^{(\text{P})}_{\mu\nu}(\theta)\,d\theta^{\mu}d\theta^{\nu} = \frac{1}{\theta^2}\,d\theta^2. \tag{60}$$

The Fisher-Rao information metric $G^{(\text{WD})}_{\mu\nu}(\phi)$ of the Wigner-Dyson probability distribution $q(y|\phi)$ is defined as,

$$G^{(\text{WD})}_{\mu\nu}(\phi) = \int dy\,q(y|\phi)\,\partial_{\mu}\log q(y|\phi)\,\partial_{\nu}\log q(y|\phi), \quad \partial_{\mu} = \frac{\partial}{\partial\phi^{\mu}}, \tag{61}$$

where $q(y|\phi)$ is given by,

$$q(y|\phi) = \frac{\pi y}{2\phi^2}\exp\left(-\frac{\pi y^2}{4\phi^2}\right), \quad \phi = \frac{\sqrt{\pi}}{2}\left(\frac{\theta}{\lambda}\right)^{\frac{1}{2}}. \tag{62}$$

Notice that $q(y|\phi)$ in (62) is equivalent to $p_{\text{Wigner-Dyson}}(\Delta|D)$ in (57) with $y = \Delta$ and $\phi = D$. The Wigner-Dyson line element $(ds^2)_{\text{Wigner-Dyson}}$ is defined as,

$$(ds^2)_{\text{Wigner-Dyson}} = G^{(\text{WD})}_{\mu\nu}(\phi)\,d\phi^{\mu}d\phi^{\nu}. \tag{63}$$

Notice that the Poisson and Wigner-Dyson distributions are related through the combination of a change of random variable and a reparametrization, namely

$$q(y|\phi) = p\left(x(y)|\theta(\phi)\right)J(y), \tag{64}$$

where,

$$x(y) = \lambda y^2, \quad \theta(\phi) = \frac{4\lambda\phi^2}{\pi}, \quad J(y) = \left|\frac{\partial x(y)}{\partial y}\right|. \tag{65}$$

Considering equations (54) and (55), the Wigner-Dyson line element $(ds^2)_{\text{Wigner-Dyson}}$ becomes

$$(ds^2)_{\text{Wigner-Dyson}} = G^{(\text{WD})}_{\mu\nu}(\phi)\,d\phi^{\mu}d\phi^{\nu} = G^{(\text{P})}_{\mu\nu}\left(\theta(\phi)\right)\,d\theta^{\mu}(\phi)\,d\theta^{\nu}(\phi) = \frac{4}{\phi^2}\,d\phi^2. \tag{66}$$

Equations (60) and (66) will be used in our IGAC in quantum spin chain systems. Before considering such an information-geometric characterization of quantum energy level statistics for regular and chaotic spin chains immersed in an external magnetic field, we briefly review the main points of the more standard approach to these topics.
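As a numerical sanity check of the line elements (60) and (66), the sketch below (ours; the parameter values, cutoffs and step counts are arbitrary choices) evaluates the one-dimensional Fisher information of the exponential distribution (59) and of the Wigner-Dyson distribution (62) by midpoint quadrature, recovering $1/\theta^2$ and $4/\phi^2$ respectively.

```python
import math

def fisher_exponential(theta, xmax=80.0, steps=200000):
    # G^(P)(theta) = \int p(x|theta) (d_theta log p)^2 dx for p = (1/theta) e^{-x/theta};
    # the score is d_theta log p = (x/theta - 1)/theta
    h = xmax / steps
    total = 0.0
    for i in range(steps):
        x = (i + 0.5) * h
        p = math.exp(-x / theta) / theta
        score = (x / theta - 1.0) / theta
        total += p * score * score * h
    return total

def fisher_wigner_dyson(phi, ymax=None, steps=200000):
    # G^(WD)(phi) = \int q(y|phi) (d_phi log q)^2 dy for q = (pi y / 2 phi^2) e^{-pi y^2 / 4 phi^2};
    # writing v = pi y^2 / (4 phi^2), the score is d_phi log q = 2(v - 1)/phi
    ymax = 12.0 * phi if ymax is None else ymax
    h = ymax / steps
    total = 0.0
    for i in range(steps):
        y = (i + 0.5) * h
        v = math.pi * y * y / (4.0 * phi * phi)
        q = (math.pi * y / (2.0 * phi * phi)) * math.exp(-v)
        total += q * (2.0 * (v - 1.0) / phi) ** 2 * h
    return total

print(round(fisher_exponential(2.5), 4), round(fisher_wigner_dyson(1.7), 4))
```

Both results match the analytic metrics: the factor 4 in (66) is exactly the extra quadratic-repulsion contribution relative to the Poisson case (60).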
B. Entanglement in quantum spin chains: standard formalism
One of the most important concepts in quantum information theory is that of entanglement, an intrinsic property of composite quantum systems. Entanglement plays an essential role in many-body quantum phenomena, such as superconductivity [52] and quantum phase transitions [53]. Moreover, it is an important concept in quantum computation and information processing [54]. An excellent theoretical framework for investigating entanglement properties is offered by spin chains. Quantum spin chains belong to the most studied models of quantum statistical mechanics. However, only for a few types of models have the thermal and ground state structures been determined. This is mainly a consequence of the complicated correlations that can arise among quantum states. These strong correlations can even be present in pure quantum states, while classical pure states can only have a trivial product state structure. Unlike the classical case, the restrictions of pure states on the quantum spin chain to local subsystems are typically mixed states. This type of correlation between subsystems is commonly referred to as entanglement. The von Neumann entropy, defined as,

$$\mathcal{S}_{\text{von Neumann}} = -\text{tr}\left(\rho\log\rho\right), \tag{67}$$

is a standard measure of the nonpurity of the reduced density matrix $\rho$; thus it is a very useful quantity in the description of entanglement [55]. Several simple models of spin chains can be studied analytically, and there also exist efficient numerical techniques. There are two widely used methods of characterizing entanglement in spin chains. The first of these describes the entanglement between two spins in the chain with a quantity called concurrence [56]. The other one measures the entanglement of a block of spins with the rest of the chain via the von Neumann entropy when the chain is in its ground state [57].
As a side remark, we emphasize that entanglement entropy does not refer exclusively to the characterization of quantum systems in their ground states but, more generally, to any many-particle quantum dynamical state undergoing a unitary time evolution. The method used in [57] is known as the density matrix renormalization group method (DMRG) [58]. It is based on the fact that many degrees of freedom are redundant in the description of a quantum state; therefore, the system is adequately described by taking into account maximally entangled components only. The von Neumann entropy is supposed to play an important role in quantifying the essential subspace of a reduced density matrix. The possibility of compressing such density matrices from their full dimension to a much smaller subspace without significant loss of information is the starting point of the DMRG analysis. The classical complexity of quantum states can be characterized by a mixed state entanglement entropy. For instance, the von Neumann entropy of a block of $L$ neighboring spins in an XX chain, describing the entanglement of the block with the rest of the chain, is given by [59],

$$\mathcal{S}_L = -\text{tr}\left(\rho_L\log\rho_L\right) \overset{L\to\infty}{\propto} \log L. \tag{68}$$

The reduced density matrix $\rho_L$ is obtained from the ground state $|\Psi_g\rangle$ of the chain by tracing out the external degrees of freedom,

$$\rho_L = \text{tr}_{N-L}\left|\Psi_g\right\rangle\left\langle\Psi_g\right|, \quad H\left|\Psi_g\right\rangle = E_g\left|\Psi_g\right\rangle. \tag{69}$$

The Hamiltonian $H$ in (69) is given by [57, 59],

$$H = -\sum_{l=1}^{N}\left(s^x_l s^x_{l+1} + s^y_l s^y_{l+1}\right) - h\sum_{l=1}^{N}s^z_l, \tag{70}$$

where $s^{\alpha}_l$ ($\alpha = x, y, z$) are the spin operators at sites $l = 1, 2,\ldots, N$ of a periodic chain and $h$ is the magnetic field. The logarithmic growth of the entanglement entropy is a general consequence of the fact that in one-dimensional systems near quantum phase transitions the entropy is a logarithmic function of the size of the system [60]. Furthermore, there is also a time-dependent version of this DMRG, known as $\tau$-DMRG [61].
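A minimal concrete instance of the partial-trace construction (69) and the entropy (67) — using a two-spin singlet rather than a spin chain, and none of the DMRG machinery — is the sketch below (ours; all variable names are our choices). Tracing out one spin of a maximally entangled pure state leaves a maximally mixed $\rho_L$ and one bit of entanglement entropy.

```python
import numpy as np

# Two-spin singlet (|01> - |10>)/sqrt(2): a maximally entangled pure state
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2.0)
rho = np.outer(psi, psi.conj())                 # pure-state density matrix |psi><psi|
# Partial trace over the second spin, as in eq. (69): rho_L = tr_{N-L} |psi><psi|
rho_L = np.einsum('ikjk->ij', rho.reshape(2, 2, 2, 2))
evals = np.linalg.eigvalsh(rho_L)
S = -sum(p * np.log2(p) for p in evals if p > 1e-12)   # eq. (67), in bits
print(round(float(S), 6))  # 1.0 bit: the reduced state is maximally mixed
```

The same reshape-and-contract pattern extends to a block of $L$ spins in an $N$-site chain by reshaping the state vector to dimensions $(2^L, 2^{N-L})$.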
This method is used to study the evolution of pure states, density matrices and operators. Note that the classical complexity of quantum operators can be characterized using the operator space entanglement entropy of a density operator, not the state entanglement entropy of a mixed state. For the evolution of density matrices and operators, one considers a superket corresponding to an operator $O$ expanded in the computational basis of products of local operators. For instance, for a chain of $n$ qubits, a basis of $4^n$ Pauli operators is used, $\sigma^{s_0}_0 \otimes \cdots \otimes \sigma^{s_{n-1}}_{n-1}$, with $s_j \in \{0, x, y, z\}$ and $\sigma^0 = \mathbb{1}$. The key idea of $\tau$-DMRG is to represent any operator in a matrix product form [62],

$$O = \sum_{\{s_j\}}\text{tr}\left(A^{s_0}_0\cdots A^{s_{n-1}}_{n-1}\right)\sigma^{s_0}_0\otimes\cdots\otimes\sigma^{s_{n-1}}_{n-1}, \tag{71}$$

in terms of the $4n$ matrices $A^{s_j}_j$ of fixed dimension $D$. The number of parameters in the matrix product operator (MPO) representation of the operator is $4nD^2$. The minimal $D$ required, $D_{\varepsilon}(\tau)$, is equal to the maximal rank of the reduced superdensity matrix over bipartitions of the chain. The way entropy can be computed in the space of operators can be found in [62]. The $\tau$-DMRG method is very efficient in classical simulations of many-body quantum dynamics, requiring that the computational costs grow polynomially in time and, consequently, that the entanglement entropy grow no faster than logarithmically. However, it is known that the asymptotic behaviors of the computational costs and entanglement entropies of integrable and chaotic Ising spin chains are very different [17]. There, Prosen considered the question of time efficiency, implementing an up-to-date version of the $\tau$-DMRG for a family of Ising spin chains in an arbitrarily oriented magnetic field, which undergoes a transition from the integrable (transverse Ising) to the nonintegrable chaotic regime as the magnetic field is varied.
An integrable (regular) Ising chain in a general homogeneous transverse magnetic field is defined through the Hamiltonian $H^{(\text{regular})} \equiv H(0, 2)$, where

$$H(h_x, h_y) = \sum_{j=0}^{n-2}\sigma^x_j\sigma^x_{j+1} + \sum_{j=0}^{n-1}\left(h_x\sigma^x_j + h_y\sigma^y_j\right). \tag{72}$$

In this case, the computational cost shows a polynomial growth in time, $D^{(\text{regular})}_{\varepsilon}(\tau) \overset{\tau\to\infty}{\propto} \tau$, while the entanglement entropy is characterized by logarithmic growth,

$$\mathcal{S}^{(\text{regular})} \overset{\tau\to\infty}{\propto} c\log\tau + c'. \tag{73}$$

The constant $c$ depends exclusively on the value of the fixed transverse magnetic field intensity $B_{\perp}$, while $c'$ depends on $B_{\perp}$ and on the choice of the initial local operators of finite index used to calculate the operator space entanglement entropy. Instead, a quantum chaotic Ising chain in a general homogeneous tilted magnetic field is defined through the Hamiltonian $H^{(\text{chaotic})} \equiv H(1, 1)$, where $H$ is defined in (72). In this case, the computational cost shows an exponential growth in time, $D^{(\text{chaotic})}_{\varepsilon}(\tau) \overset{\tau\to\infty}{\propto} \exp(K_q\tau)$, while the entanglement entropy is characterized by linear growth,

$$\mathcal{S}^{(\text{chaotic})} \overset{\tau\to\infty}{\propto} K_q\tau. \tag{74}$$

The quantity $K_q$ is a constant, asymptotically independent of the number of indexes of the initial local operators used to calculate the operator space entropy, which depends only on the Hamiltonian evolution and not on the details of the initial state, observable or error measures, and can be interpreted as a kind of quantum dynamical entropy.

It is well known that the quantum description of chaos is characterized by a radical change in the statistics of quantum energy levels [63]. The transition to chaos in the classical limit is associated with a drastic change in the statistics of the nearest-neighbor spacings of quantum energy levels. In the regular regime the distribution agrees with Poisson statistics, while in the chaotic regime the Wigner-Dyson distribution works very well.
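For concreteness, the Hamiltonian (72) can be assembled explicitly for a small chain. The sketch below is ours (the chain length `n = 6`, the open boundary conditions, and all names are our choices, and it does not attempt the level-statistics analysis of [17]); it builds $H(0,2)$ and $H(1,1)$ from Kronecker products and verifies hermiticity.

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)

def site_op(single, j, n):
    # Embed a single-site operator at site j of an n-site chain via Kronecker products
    out = np.eye(1, dtype=complex)
    for k in range(n):
        out = np.kron(out, single if k == j else np.eye(2, dtype=complex))
    return out

def ising_chain(hx, hy, n):
    # Eq. (72): open-chain sigma^x sigma^x coupling plus a homogeneous (hx, hy) field
    H = sum(site_op(sx, j, n) @ site_op(sx, j + 1, n) for j in range(n - 1))
    H = H + sum(hx * site_op(sx, j, n) + hy * site_op(sy, j, n) for j in range(n))
    return H

H_reg = ising_chain(0.0, 2.0, 6)   # integrable (transverse) case H(0, 2)
H_cha = ising_chain(1.0, 1.0, 6)   # chaotic (tilted) case H(1, 1)
print(H_reg.shape, np.allclose(H_reg, H_reg.conj().T), np.allclose(H_cha, H_cha.conj().T))
```

Dense matrices of dimension $2^n$ suffice for such toy sizes; the $\tau$-DMRG machinery of [17] is needed precisely because this construction scales exponentially in $n$.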
Uncorrelated energy levels are characteristic of quantum systems corresponding to a classically regular motion, while level repulsion (a suppression of small energy level spacings) is typical of systems which are classically chaotic. A standard quantum example is provided by the study of the energy level statistics of a hydrogen atom in a strong magnetic field. It is known that the level spacing distribution (LSD) is a standard indicator of quantum chaos [64]. It displays characteristic level repulsion for strongly nonintegrable quantum systems, whereas for integrable systems there is no repulsion, due to the existence of conservation laws and quantum numbers. In [17], the authors calculate the LSD of the spectra of $H^{(\text{regular})}$ and $H^{(\text{chaotic})}$. They find that for $H^{(\text{regular})}$ the nearest-neighbor LSD is described by a Poisson distribution, while for $H^{(\text{chaotic})}$ the nearest-neighbor LSD is described by a Wigner-Dyson distribution. Therefore, they conclude that $H^{(\text{regular})}$ and $H^{(\text{chaotic})}$ indeed represent generic regular and quantum chaotic systems, respectively.

In the next paragraph, we will encode the relevant information about the spin chain in a suitable composite probability distribution taking into account the quantum spin chains and the configurations of the external magnetic field in which they are immersed.

VI. AN INFORMATION GEOMETRIC MODEL OF REGULAR AND CHAOTIC QUANTUM SPIN CHAINS

A. Integrable Statistical Model: Poisson coupled to an Exponential Bath
Recall that in the ME method [7], the selection of relevant variables is made on the basis of intuition guided by experiment; it is essentially a matter of trial and error. The variables should include those that can be controlled or experimentally observed, but there are cases where others must also be considered. Our objective here is to choose the relevant microvariables of the system and select the relevant information concerning each one of them. In the integrable case, the Hamiltonian $H^{(\text{regular})}$ describes an antiferromagnetic Ising chain immersed in a transverse homogeneous magnetic field $\vec{B}_{\text{transverse}} = B_{\perp}\hat{B}_{\perp}$, with the level spacing distribution of its spectrum given by the Poisson distribution

$$p^{(\text{Poisson})}_A(x_A|\mu_A) = \frac{1}{\mu_A}\exp\left(-\frac{x_A}{\mu_A}\right), \tag{75}$$

where the microvariable $x_A$ is the spacing of the energy levels and the macrovariable $\mu_A$ is the average spacing. The chain is immersed in the transverse magnetic field, which enters the Hamiltonian $H^{(\text{regular})}$ through the single component $B_{\perp}$. We translate this piece of information into our IGAC formalism by coupling the probability (75) to an exponential bath $p^{(\text{exponential})}_B(x_B|\mu_B)$ given by

$$p^{(\text{exponential})}_B(x_B|\mu_B) = \frac{1}{\mu_B}\exp\left(-\frac{x_B}{\mu_B}\right), \tag{76}$$

where the microvariable $x_B$ is the intensity of the magnetic field and the macrovariable $\mu_B$ is the average intensity. More correctly, $x_B$ should be the energy arising from the interaction of the magnetic field with the magnetic moment of the spin particle, $x_B = |-\vec{\mu}\cdot\vec{B}| = |-\mu B\cos\varphi|$, where $\varphi$ is the tilt angle. For the sake of simplicity, let us set $\mu = 1$; then in the transverse case $\varphi = 0$ and therefore $x_B = B \equiv B_{\perp}$.
This is our best guess, and we justify it by noticing that the magnetic field intensity is indeed a relevant quantity in this experiment (see equation (73)) and its components (intensity) are quantities that are varied during the transitions from integrable to chaotic regimes. In the regular regime, we say the magnetic field intensity is set to a well-defined value ⟨x_B⟩ = μ_B. Furthermore, notice that the exponential distribution is identified by information theory as the maximum entropy distribution when only one piece of information (the expectation value) is known. Finally, the chosen composite probability distribution P^(integrable)(x_A, x_B|μ_A, μ_B) encoding the relevant information about the system is given by

$$P^{(\mathrm{integrable})}(x_A, x_B|\mu_A, \mu_B) = p_A^{(\mathrm{Poisson})}(x_A|\mu_A)\, p_B^{(\mathrm{exponential})}(x_B|\mu_B) = \frac{1}{\mu_A \mu_B}\exp\left[-\left(\frac{x_A}{\mu_A} + \frac{x_B}{\mu_B}\right)\right]. \qquad (77)$$

Again, we point out that our probability (77) is our best guess and, of course, must be consistent with numerical simulations and experimental data in order to have some merit. We note that equation (77) is not fully justified from a theoretical point of view, a situation that arises from the lack of a systematic way to select the relevant microvariables of the system (and to choose the appropriate information about such microvariables). Let us denote by M_S^(integrable) the two-dimensional curved statistical manifold underlying our information geometrodynamics. The line element (ds)²_integrable on M_S^(integrable) is given by

$$(ds)^2_{\mathrm{integrable}} = \frac{1}{\mu_A^2}\, d\mu_A^2 + \frac{1}{\mu_B^2}\, d\mu_B^2. \qquad (78)$$

Applying our IGAC to the line element (78) and following the steps provided in the ED Gaussian model of Sections II and III of this paper, we obtain polynomial growth of V_{M_S}^(integrable) and logarithmic IGE growth,

$$V_{M_S}^{(\mathrm{integrable})}(\tau) \underset{\tau\to\infty}{\propto} \exp(c'_{\mathrm{IG}})\, \tau^{c_{\mathrm{IG}}}, \qquad S_{M_S}^{(\mathrm{integrable})}(\tau) \underset{\tau\to\infty}{\propto} c_{\mathrm{IG}} \log \tau + c'_{\mathrm{IG}}.$$
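The diagonal entries of the line element (78) can be cross-checked numerically: for an exponential distribution p(x|μ) = exp(−x/μ)/μ, the Fisher information E[(∂_μ log p)²] equals 1/μ². The following Python sketch (our illustrative choice of language; the numerical check is not part of the original derivation, and the sample sizes are arbitrary) estimates it by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(0)

def fisher_info_exponential(mu, n=200_000):
    """Monte Carlo estimate of E[(d/dmu log p(x|mu))^2] for p(x|mu) = exp(-x/mu)/mu."""
    x = rng.exponential(scale=mu, size=n)
    score = -1.0 / mu + x / mu**2          # d/dmu log p(x|mu), the score function
    return float(np.mean(score**2))

# Both diagonal entries of (78) have this 1/mu^2 form; mu_A, mu_B values are arbitrary here.
for mu in (2.0, 0.5):
    est, exact = fisher_info_exponential(mu), 1.0 / mu**2
    assert abs(est - exact) / exact < 0.05
```

Since the Poisson LSD (75) is itself exponential in form, the same check covers both the μ_A and μ_B blocks of the metric.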
(79)

The quantity c_IG is a constant proportional to the number of exponential probability distributions in the composite distribution used to calculate the IGE, and c'_IG is a constant that depends on the values assumed by the statistical macrovariables μ_A and μ_B. Equations (79) may be interpreted as the information-geometric analogue of the computational complexity D_ε^(regular)(τ) and the entanglement entropy S^(regular) defined in standard quantum information theory, respectively. We cannot state that they are the same, since we have not fully justified, from a theoretical standpoint, our choice of the composite probability (77).

B. Chaotic Statistical Model: Wigner-Dyson coupled to a Gaussian Bath
In the chaotic case, the Hamiltonian H^(chaotic) describes an antiferromagnetic Ising chain immersed in a tilted homogeneous magnetic field $\vec{B}_{\mathrm{tilted}} = B_{\perp}\hat{B}_{\perp} + B_{\parallel}\hat{B}_{\parallel}$, with the level spacing distribution of its spectrum given by the Wigner-Dyson distribution p_A^(Wigner-Dyson)(x'_A|μ'_A),

$$p_A^{(\mathrm{Wigner\text{-}Dyson})}(x'_A|\mu'_A) = \frac{\pi x'_A}{2\mu'^{2}_A}\exp\left(-\frac{\pi x'^{2}_A}{4\mu'^{2}_A}\right), \qquad (80)$$

where the microvariable x'_A is the spacing of the energy levels and the macrovariable μ'_A is the average spacing. The chain is immersed in the tilted magnetic field, which has two components B_⊥ and B_∥ in the Hamiltonian H^(chaotic). We translate this piece of information in our IGAC formalism by coupling the probability (80) to a Gaussian p_B^(Gaussian)(x'_B|μ'_B, σ'_B) given by

$$p_B^{(\mathrm{Gaussian})}(x'_B|\mu'_B, \sigma'_B) = \frac{1}{\sqrt{2\pi\sigma'^{2}_B}}\exp\left(-\frac{(x'_B - \mu'_B)^2}{2\sigma'^{2}_B}\right), \qquad (81)$$

where the microvariable x'_B is the intensity of the magnetic field, the macrovariable μ'_B is the average intensity, and σ'_B is its standard deviation: during the transition from the integrable to the chaotic regime, the magnetic field intensity is varied (experimentally). It is tilted, and its two components (B_⊥ and B_∥) are varied as well. Our best guess, based on the experimental mechanism that drives the transitions between the two regimes, is that the magnetic field intensity (actually the microvariable μB cos φ) is Gaussian-distributed (two macrovariables) during this change. In the chaotic regime, we say the magnetic field intensity is set to a well-defined value ⟨x'_B⟩ = μ'_B with standard deviation $\sigma'_B = \sqrt{\left\langle \left(x'_B - \langle x'_B\rangle\right)^2 \right\rangle}$. Furthermore, the Gaussian distribution is identified by information theory as the maximum entropy distribution when only the expectation value and the variance are known.
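For completeness, the Wigner-Dyson surmise (80) can be sampled by inverting its cumulative distribution, F(s) = 1 − exp(−πs²/(4μ²)); the mean spacing then comes out as μ, and small spacings are suppressed (level repulsion). A brief Python sketch (illustrative only; the sampler and parameter values are our own choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_wigner_dyson(mu, n):
    """Draw spacings from p(s|mu) = (pi*s/(2*mu**2)) * exp(-pi*s**2/(4*mu**2))
    by inverting the CDF F(s) = 1 - exp(-pi*s**2/(4*mu**2))."""
    u = rng.random(n)
    return 2.0 * mu / np.sqrt(np.pi) * np.sqrt(-np.log1p(-u))

s = sample_wigner_dyson(1.0, 500_000)
assert abs(s.mean() - 1.0) < 0.01   # average spacing equals mu
assert (s < 0.01).mean() < 1e-3     # repulsion: a Poisson LSD would put ~1% of spacings here
```

The second assertion is the quantitative content of "level repulsion" in the text: p(s) vanishes linearly at s = 0, unlike the Poisson LSD, which is maximal there.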
Therefore, the chosen composite probability distribution P^(chaotic)(x'_A, x'_B|μ'_A, μ'_B, σ'_B) encoding the relevant information about the system is given by

$$P^{(\mathrm{chaotic})}(x'_A, x'_B|\mu'_A, \mu'_B, \sigma'_B) = p_A^{(\mathrm{Wigner\text{-}Dyson})}(x'_A|\mu'_A)\, p_B^{(\mathrm{Gaussian})}(x'_B|\mu'_B, \sigma'_B) = \frac{\pi \left(2\pi\sigma'^{2}_B\right)^{-\frac{1}{2}}}{2\mu'^{2}_A}\, x'_A \exp\left[-\left(\frac{\pi x'^{2}_A}{4\mu'^{2}_A} + \frac{(x'_B - \mu'_B)^2}{2\sigma'^{2}_B}\right)\right]. \qquad (82)$$

Let us denote by M_S^(chaotic) the three-dimensional curved statistical manifold underlying our information geometrodynamics. The line element (ds)²_chaotic on M_S^(chaotic) is given by

$$(ds)^2_{\mathrm{chaotic}} = \frac{4}{\mu'^{2}_A}\, d\mu'^{2}_A + \frac{1}{\sigma'^{2}_B}\, d\mu'^{2}_B + \frac{2}{\sigma'^{2}_B}\, d\sigma'^{2}_B. \qquad (83)$$

Applying our IGAC to the line element (83) and following the steps provided in the ED Gaussian model of Sections II and III of this paper, we obtain exponential growth of V_{M_S}^(chaotic) and linear IGE growth,

$$V_{M_S}^{(\mathrm{chaotic})}(\tau) \underset{\tau\to\infty}{\propto} C_{\mathrm{IG}} \exp(K_{\mathrm{IG}}\tau), \qquad S_{M_S}^{(\mathrm{chaotic})}(\tau) \underset{\tau\to\infty}{\propto} K_{\mathrm{IG}}\tau. \qquad (84)$$

The constant C_IG encodes information about the initial conditions of the statistical macrovariables parametrizing elements of M_S^(chaotic). The constant K_IG,

$$K_{\mathrm{IG}} \underset{\tau\to\infty}{\approx} \frac{dS_{M_S}(\tau)}{d\tau}\bigg|_{\tau\to\infty} \approx \lim_{\tau\to\infty}\left[\frac{1}{\tau}\log\left(\left\|\frac{J_{M_S}(\tau)}{J_{M_S}(0)}\right\|\right)\right] \overset{\mathrm{def}}{=} \lambda_J, \qquad (85)$$

is the model parameter of the chaotic system and depends on the temporal evolution of the statistical macrovariables. It plays the role of the standard Lyapunov exponent of a trajectory and is, in principle, an experimentally observable quantity. The quantity J_{M_S}(τ) is the Jacobi field intensity, and λ_J may be considered the information-geometric analogue of the leading Lyapunov exponent in conventional Hamiltonian systems. Given an explicit expression of K_IG in terms of the observables μ'_A, μ'_B and σ'_B, a clear understanding of the relation between the IGE (or K_IG) and the entanglement entropy (or K_q) becomes the key point that deserves further study.
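Equation (85) suggests a direct numerical recipe: given a record of the Jacobi field intensity J_{M_S}(τ), the exponent λ_J is the large-τ limit of (1/τ) log‖J(τ)/J(0)‖. A small Python sketch on a synthetic Jacobi field (the instability rate 0.7 and the bounded oscillatory modulation are invented purely for illustration; this is not a geodesic-spread computation on the actual manifold):

```python
import numpy as np

def lambda_J(J, tau):
    """Estimate lambda_J = (1/tau) * log|J(tau)/J(0)| at the last recorded time, cf. (85)."""
    return float(np.log(np.abs(J[-1] / J[0])) / tau[-1])

# Synthetic Jacobi field intensity: exponential instability at rate 0.7,
# modulated by a bounded oscillation that should wash out as tau grows.
tau = np.linspace(0.0, 40.0, 4001)
J = 0.3 * np.exp(0.7 * tau) * (1.0 + 0.05 * np.sin(3.0 * tau))
assert abs(lambda_J(J, tau) - 0.7) < 0.01
```

The assertion illustrates why the limit in (85) is robust: any bounded prefactor contributes O(1/τ) to the estimate and disappears asymptotically.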
Equations (84) are the information-geometric analogue of the computational complexity D_ε^(chaotic)(τ) and the entanglement entropy S^(chaotic) defined in standard quantum information theory, respectively. This result is remarkable, but it deserves a deeper analysis in order to be fully understood.

One of the major limitations of our findings is the lack of a detailed comparison of the theory with experiment. This point will be one of our primary concerns in future works. However, some considerations may be carried out at the present stage. The experimental observables in our theoretical models are the statistical macrovariables characterizing the composite probability distributions. In the integrable case, where the coupling between a Poisson distribution and an exponential one is considered, μ_A and μ_B are the experimental observables. In the chaotic case, where the coupling between a Wigner-Dyson distribution and a Gaussian is considered, μ'_A, μ'_B and σ'_B play the role of the experimental observables. We believe one way to test our theory may be to determine a numerical estimate of the leading Lyapunov exponent λ_max or the Lyapunov spectrum for the Hamiltonian systems under investigation directly from experimental data (measurement of a time series) and compare it to our theoretical estimate for λ_J [65]. However, we are aware that it may be extremely hard to evaluate Lyapunov exponents numerically. Alternatively, knowing that the mean values of the positive Lyapunov exponents are related to the Kolmogorov-Sinai (KS) dynamical entropy, we suggest measuring the KS entropy K directly from a time signal associated with a suitable combination of our experimental observables and comparing it to our indirect theoretical estimate for K_IG obtained from the asymptotic behavior of our statistical macrovariables [66]. We are aware that our discussion here is rather qualitative.
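As a toy illustration of the kind of time-series estimate mentioned above (our own stand-in example; the spin-chain Hamiltonians of the paper are not simulated here), the largest Lyapunov exponent of the logistic map x → r x(1−x) can be obtained as the orbit average of log|f′(x)|; for r = 4 the exact value is ln 2:

```python
import math

def lyapunov_logistic(r=4.0, n=100_000, x0=0.123):
    """Largest Lyapunov exponent of the logistic map as the time average of log|f'(x)|
    along the orbit, f'(x) = r*(1 - 2x)."""
    x, total = x0, 0.0
    for _ in range(n):
        total += math.log(abs(r * (1.0 - 2.0 * x)) + 1e-300)  # guard against x = 1/2
        x = r * x * (1.0 - x)
    return total / n

assert abs(lyapunov_logistic() - math.log(2.0)) < 0.02
```

For genuinely experimental data, where the map f is unknown, one would instead use the tangent-space and nearest-neighbor methods of [65], or the KS-entropy estimator of [66]; the present sketch only shows the underlying time-average structure of such estimates.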
However, we hope that with additional study, especially in clarifying the relation between the IGE and the entanglement entropy, the theoretical characterization presented in this paper will find experimental support in the future. Therefore, the statement that our findings may be relevant to experiments verifying the existence of chaoticity and related dynamical properties, on a macroscopic level, in the energy level statistics of chaotic and regular quantum spin chains is purely a conjecture at this stage.

VII. FINAL REMARKS
In this paper, we reviewed our novel information-geometrodynamical approach to chaos (IGAC) on curved statistical manifolds and we emphasized the usefulness of our information-geometrodynamical entropy (IGE) as an indicator of chaoticity in a simple application. Furthermore, knowing that integrable and chaotic quantum antiferromagnetic Ising chains are characterized by asymptotic logarithmic and linear growths of their operator space entanglement entropies, respectively, we applied our IGAC to present an alternative characterization of such systems. Remarkably, we have shown that in the former case the IGE exhibits asymptotic logarithmic growth while in the latter case the IGE exhibits asymptotic linear growth.

It is worthwhile emphasizing the following points: the statements that spectral correlations of classically integrable systems are well described by Poisson statistics and that quantum spectra of classically chaotic systems are universally correlated according to Wigner-Dyson statistics are conjectures, known as the BTG (Berry-Tabor-Gutzwiller, [69]) and BGS (Bohigas-Giannoni-Schmit, [68]) conjectures, respectively. These two conjectures are very important in the study of quantum chaos; however, their validity admits some exceptions. Several other cases may be considered. For instance, chaotic systems having a spectrum that does not obey a Wigner-Dyson distribution may be considered. A chaotic system can also have a spectrum following a Poisson, semi-Poisson, or other types of critical statistics [70]. Moreover, integrable systems having a spectrum that does not obey a Poisson distribution may be considered as well. For instance, the Harper model would represent such a situation. Moreover, it is worthwhile pointing out that not every chaotic system characterized by entropy-like quantities growing linearly in time has a spectrum described by a Wigner-Dyson distribution.
Well-known examples presenting such a situation are the cat maps [71] and the famous kicked rotator [72], whose spectrum follows a Poisson distribution in the cylinder representation and a Wigner-Dyson distribution in the torus representation, while the properties of entropy-like quantities are the same in both representations (at least classically). None of these cases is discussed in our characterization.

Therefore, at the present stage, because of the above considerations and because of the lack of experimental evidence in support of our theoretical construct, we can only conclude that the IGAC might find some potential applications in certain regular and chaotic dynamical systems; this remains only a conjecture. However, we hope that our work convincingly shows that this information-geometric approach may be considered a serious attempt to provide a unifying criterion of chaos of both classical and quantum varieties, thus deserving further research.
Acknowledgments
The authors are grateful to Prof. Ariel Caticha and Dr. Adom Giffin for useful comments. We thank an anonymous Referee for constructive criticism that led to concrete improvement of this work.

[1] L. Casetti, C. Clementi, and M. Pettini, "Riemannian theory of Hamiltonian chaos and Lyapunov exponents", Phys. Rev. E, 5969-5984 (1996).
[2] M. Di Bari and P. Cipriani, "Geometry and Chaos on Riemann and Finsler Manifolds", Planet. Space Sci., 1543 (1998).
[3] T. Kawabe, "Indicator of chaos based on the Riemannian geometric approach", Phys. Rev. E71, 017201 (2005); T. Kawabe, "Chaos based on Riemannian geometric approach to Abelian-Higgs dynamical system", Phys. Rev. E67, 016201 (2003).
[4] W. H. Zurek and J. P. Paz, "Decoherence, Chaos, and the Second Law", Phys. Rev. Lett., 2508 (1994); "Quantum Chaos: a decoherent definition", Physica D83, 300 (1995).
[5] C. M. Caves and R. Schack, "Unpredictability, Information, and Chaos", Complexity, 46-57 (1997); A. J. Scott, T. A. Brun, C. M. Caves, and R. Schack, "Hypersensitivity and chaos signatures in the quantum baker's map", J. Phys. A39, 13405 (2006).
[6] A. Caticha, "Entropic Dynamics", in Bayesian Inference and Maximum Entropy Methods in Science and Engineering, ed. by R. L. Fry, AIP Conf. Proc., 302 (2002).
[7] A. Caticha, "Relative Entropy and Inductive Inference", in Bayesian Inference and Maximum Entropy Methods in Science and Engineering, ed. by G. Erickson and Y. Zhai, AIP Conf. Proc., 75 (2004); A. Caticha and A. Giffin, "Updating Probabilities", in Bayesian Inference and Maximum Entropy Methods in Science and Engineering, ed. by Ali Mohammad-Djafari, AIP Conf. Proc., 31-42 (2006); A. Caticha and R. Preuss, "Maximum entropy and Bayesian data analysis: Entropic prior distributions", Phys. Rev. E70, 046127 (2004).
[8] S. Amari and H. Nagaoka, Methods of Information Geometry, American Mathematical Society/Oxford University Press (2000); S. Amari, Differential-Geometrical Methods in Statistics, Springer-Verlag (1985).
[9] C. Cafaro, S. A. Ali and A. Giffin, "An Application of Reversible Entropic Dynamics on Curved Statistical Manifolds", in Bayesian Inference and Maximum Entropy Methods in Science and Engineering, ed. by Ali Mohammad-Djafari, AIP Conf. Proc., 243-251 (2006).
[10] C. Cafaro and S. A. Ali, "Jacobi Fields on Statistical Manifolds of Negative Curvature", Physica D234, 70-80 (2007).
[11] C. Cafaro, "Information Geometry and Chaos on Negatively Curved Statistical Manifolds", in Bayesian Inference and Maximum Entropy Methods in Science and Engineering, ed. by K. Knuth et al., AIP Conf. Proc., 175 (2007).
[12] A. Caticha and C. Cafaro, "From Information Geometry to Newtonian Dynamics", in Bayesian Inference and Maximum Entropy Methods in Science and Engineering, ed. by K. Knuth et al., AIP Conf. Proc., 165 (2007).
[13] C. Cafaro, "Works on an Information Geometrodynamical Approach to Chaos", DOI: 10.1016/j.chaos.2008.04.017, Chaos, Solitons & Fractals (2008).
[14] C. Cafaro, "Information-Geometric Indicators of Chaos in Gaussian Models on Statistical Manifolds of Negative Ricci Curvature", DOI: 10.1007/s10773-008-9726-x, Int. J. Theor. Phys. (2008).
[15] C. Cafaro, "Information Geometry, Inference Methods and Chaotic Energy Levels Statistics", accepted for publication in Mod. Phys. Lett. B (2008).
[16] C. Cafaro and S. A. Ali, "Geometrodynamics of Information on Curved Statistical Manifolds and its Applications to Chaos", EJTP, 139-162 (2008).
[17] T. Prosen and M. Znidaric, "Is the efficiency of classical simulations of quantum dynamics related to integrability?", Phys. Rev. E75, 015202 (2007); T. Prosen and I. Pizorn, "Operator space entanglement entropy in transverse Ising chain", Phys. Rev. A76, 032316 (2007).
[18] C. G. J. Jacobi, Vorlesungen über Dynamik, Reimer, Berlin (1866).
[19] R. A. Fisher, "Theory of statistical estimation", Proc. Cambridge Philos. Soc., 700 (1925).
[20] C. R. Rao, "Information and accuracy attainable in the estimation of statistical parameters", Bull. Calcutta Math. Soc., 81 (1945).
[21] E. T. Jaynes, Probability Theory: The Logic of Science, Cambridge University Press (2003).
[22] S. I. Goldberg, Curvature and Homology, Academic Press Inc. (1962).
[23] F. De Felice and J. S. Clarke, Relativity on Curved Manifolds, Cambridge University Press (1990); M. P. do Carmo, Riemannian Geometry, Birkhauser, Boston (1992).
[24] N. S. Krylov, Works on the Foundations of Statistical Physics, Princeton University Press, Princeton (1979).
[25] M. Pellicott, "Exponential Mixing for the Geodesic Flow on Hyperbolic Three-Manifolds", Journal of Statistical Physics, 667 (1992).
[26] J. Jost, Compact Riemann Surfaces: An Introduction to Contemporary Mathematics, Springer-Verlag (1997).
[27] M. Biesiada, "The Power of the Maupertuis-Jacobi Principle - Dreams and Reality", Chaos, Solitons & Fractals, 869 (1994).
[28] M. Biesiada, "Searching for an invariant description of chaos in general relativity", Class. Quantum Grav., 715 (1995).
[29] C. Uggla, K. Rosquist and R. T. Jantzen, "Geometrizing the dynamics of Bianchi cosmology", Phys. Rev. D42, 404 (1990).
[30] V. I. Arnold, Mathematical Methods of Classical Mechanics, Springer-Verlag (1989).
[31] J. M. Lee, Riemannian Manifolds: An Introduction to Curvature, Springer-Verlag (1997).
[32] M. P. do Carmo, Riemannian Geometry, Birkhauser, Boston (1992).
[33] H. C. Ohanian and R. Ruffini, Gravitation and Spacetime, W. W. Norton & Company (1994).
[34] F. De Felice and J. S. Clarke, Relativity on Curved Manifolds, Cambridge University Press (1990).
[35] C. Chicone and B. Mashhoon, "The generalized Jacobi equation", Class. Quantum Grav. 19, 4231-4248 (2002).
[36] D. E. Hodgkinson, "A modified equation of geodesic deviation", Gen. Rel. Grav., 351 (1972).
[37] T. Tel and M. Gruiz, Chaotic Dynamics: An Introduction Based on Classical Mechanics, Cambridge University Press (2006).
[38] A. Wolf, "Quantifying chaos with Lyapunov exponents", in Chaos, ed. A. V. Holden, Princeton University Press, Princeton, pp. 273-290 (1986).
[39] J. P. Eckmann and D. Ruelle, "Ergodic theory of chaos and strange attractors", Rev. Mod. Phys., 617-656 (1985).
[40] E. T. Jaynes, "Information theory and statistical mechanics, I", Phys. Rev., 620 (1957); E. T. Jaynes, "Information theory and statistical mechanics, II", Phys. Rev., 171 (1957).
[41] S. Stenholm and K. Suominen, Quantum Approach to Informatics, Wiley-Interscience (2005).
[42] W. H. Zurek, Phys. Today, No. 10, 36 (1991); No. 12, 81 (1993); Prog. Theor. Phys., 281 (1993).
[43] F. Benatti, Deterministic Chaos in Infinite Quantum Systems, Springer-Verlag, Berlin (1993); F. Benatti, "Classical and Quantum Entropies: Dynamics and Information", in Entropy, ed. by A. Greven et al., Princeton Series in Applied Mathematics (2003).
[44] R. Alicki and M. Fannes, "Defining Quantum Dynamical Entropy", Lett. Math. Phys., 75-82 (1994); R. Alicki and M. Fannes, Quantum Dynamical Systems, Oxford University Press (2001).
[45] A. Connes et al., "Dynamical Entropy of C* Algebras and von Neumann Algebras", Commun. Math. Phys., 691-719 (1987).
[46] D. P. Feldman and J. P. Crutchfield, "Measures of complexity: Why?", Phys. Lett. A238, 244-252 (1998); A. Manning, "Topological entropy for geodesic flows", Annals of Mathematics, 567-573 (1979).
[47] C. E. Porter, Statistical Theories of Spectra: Fluctuations, Academic Press, New York (1965); M. L. Mehta, Random Matrices and the Statistical Theory of Energy Levels, Academic Press, New York (1991).
[48] T. A. Brody et al., "Random-matrix physics: spectrum and strength fluctuations", Rev. Mod. Phys., 385 (1981); T. Prosen and M. Robnik, "Semiclassical energy level statistics in the transition region between integrability and chaos: transition from Brody-like to Berry-Robnik behavior", J. Phys. A27, 8059-8077 (1994); T. Prosen and M. Robnik, "Energy level statistics in the transition region between integrability and chaos", J. Phys. A26, 2371-2387 (1993).
[49] M. Tribus, Rational Descriptions, Decisions and Designs, Pergamon Press Inc., New York (1969).
[50] D. C. Brody, "Notes on exponential families of distributions", arXiv: cond-mat/0705.2173 (2007).
[51] T. S. Biro et al., Chaos and Gauge Field Theory, World Scientific Publishing Co., Singapore (1994).
[52] M. Tinkham, Introduction to Superconductivity, McGraw-Hill, New York (1996).
[53] S. Sachdev, Quantum Phase Transitions, Cambridge University Press, Cambridge (2001).
[54] M. A. Nielsen and I. L. Chuang, Quantum Computation and Quantum Information, Cambridge University Press, Cambridge (2000).
[55] C. H. Bennett et al., "Concentrating partial entanglement by local operations", Phys. Rev. A53, 2046 (1996).
[56] T. J. Osborne and M. A. Nielsen, "Entanglement in a simple quantum phase transition", Phys. Rev. A66, 032110 (2002).
[57] J. P. Keating and F. Mezzadri, "Random Matrix Theory and Entanglement in Quantum Spin Chains", Commun. Math. Phys., 543 (2004).
[58] S. R. White, "Density Matrix Formulation for Quantum Renormalization Groups", Phys. Rev. Lett., 2863 (1992).
[59] V. Eisler and Z. Zimboras, "Entanglement in the XX spin chain with an energy current", Phys. Rev. A71, 042318 (2005).
[60] P. Calabrese and J. Cardy, "Entanglement Entropy and Quantum Field Theory", J. Stat. Mech. Theor. Exp. P06002 (2004).
[61] S. R. White and A. E. Feiguin, "Real-Time Evolution using the Density Matrix Renormalization Group", Phys. Rev. Lett. 93, 076401 (2004); G. Vidal, "Efficient Classical Simulations of Slightly Entangled Quantum Computations", Phys. Rev. Lett., 147902 (2003).
[62] T. Prosen, "Chaos and complexity of quantum motion", J. Phys. A40, 7881-7918 (2007).
[63] G. Casati and B. Chirikov, Quantum Chaos, Cambridge University Press (1995); M. V. Berry, in Chaotic Behavior in Dynamical Systems, ed. G. Casati (New York: Plenum, 1985); M. Robnik and T. Prosen, "Comment on energy level statistics in the mixed regimes", arXiv: chao-dyn/9706023 (1997).
[64] F. Haake, Quantum Signatures of Chaos, Springer-Verlag, Berlin (1991); 2nd enlarged edition (2000).
[65] A. Wolf et al., "Determining Lyapunov Exponents from Time Series", Physica D16, 285-317 (1985); J. Wright, "Method for calculating a Lyapunov exponent", Phys. Rev. A29, 2924-2927 (1984).
[66] P. Grassberger and I. Procaccia, "Estimation of the Kolmogorov entropy from a chaotic signal", Phys. Rev. A28, 2591-2593 (1983).
[67] B. Efron, "Defining the curvature of a statistical problem", Annals of Statistics 3, 1189 (1975).
[68] O. Bohigas et al., "Characterization of Chaotic Quantum Spectra and Universality of Level Fluctuation Laws", Phys. Rev. Lett., 1 (1984).
[69] M. C. Gutzwiller, Chaos in Classical and Quantum Mechanics, Springer-Verlag, New York (1990).
[70] A. M. Garcia-Garcia and J. Wang, "Universality in quantum chaos and the one parameter scaling theory", arXiv: 0707.3964 (2007); "Anderson Localization in Quantum Chaos: Scaling and Universality", Acta Physica Polonica A112, 635-653 (2007).
[71] Y. Gu, "Evidences of classical and quantum chaos in the time evolution of nonequilibrium ensembles", Phys. Lett. A149, 95-100 (1990); J. P. Keating, "Asymptotic properties of the periodic orbits of the cat maps", Nonlinearity, 277-307 (1991); J. P. Keating, "The cat maps: quantum mechanics and classical motion", Nonlinearity, 309-341 (1991).
[72] F. M. Izrailev, "Simple models of quantum chaos: spectrum and eigenfunctions", Phys. Rep. 196