Characterization of Time Series Via Rényi Complexity-Entropy Curves
Max Jauregui, Luciano Zunino, Ervin K. Lenzi, Renio S. Mendes, Haroldo V. Ribeiro
M. Jauregui a, L. Zunino b,c, E. K. Lenzi d, R. S. Mendes a, H. V. Ribeiro a

a Departamento de Física, Universidade Estadual de Maringá, Maringá, PR 87020-900, Brazil
b Centro de Investigaciones Ópticas (CONICET La Plata - CIC), C.C. 3, 1897 Gonnet, Argentina
c Departamento de Ciencias Básicas, Facultad de Ingeniería, Universidad Nacional de La Plata (UNLP), 1900 La Plata, Argentina
d Departamento de Física, Universidade Estadual de Ponta Grossa, Ponta Grossa, PR 84030-900, Brazil
Abstract
One of the most useful tools for distinguishing between chaotic and stochastic time series is the so-called complexity-entropy causality plane. This diagram involves two complexity measures: the Shannon entropy and the statistical complexity. Recently, this idea has been generalized by considering the Tsallis monoparametric generalization of the Shannon entropy, yielding complexity-entropy curves. These curves have proven to enhance the discrimination among different time series related to stochastic and chaotic processes of numerical and experimental nature. Here we further explore these complexity-entropy curves in the context of the Rényi entropy, which is another monoparametric generalization of the Shannon entropy. By combining the Rényi entropy with the proper generalization of the statistical complexity, we associate a parametric curve (the Rényi complexity-entropy curve) with a given time series. We explore this approach in a series of numerical and experimental applications, demonstrating the usefulness of this new technique for time series analysis. We show that the Rényi complexity-entropy curves enable the differentiation among time series of chaotic, stochastic, and periodic nature. In particular, time series of stochastic nature are associated with curves displaying positive curvature in a neighborhood of their initial points, whereas curves related to chaotic phenomena have a negative curvature; finally, periodic time series are represented by vertical straight lines.
Keywords: time series, Rényi entropy, complexity measures, ordinal patterns probabilities
1. Introduction
Email address: [email protected] (H. V. Ribeiro)

Preprint submitted to Physica A, January 18, 2018

Quantifying the degree of complexity of a system is a common task when studying the most diverse complex systems. This task usually starts by constructing a time series and then considering a complexity measure. Since there is an inherent difficulty in defining the concept of complexity, researchers have employed several approaches/theories as possible complexity measures. A non-exhaustive list includes entropies [1], relative entropies [2], algorithmic complexities [3], fractal dimensions [4], Lyapunov exponents [5], and other traditional nonlinear time series methods [6]. However, most of these approaches suffer from the drawback of being strongly sensitive to tuning parameters, hindering the reproducibility of results.

A possible way to overcome these difficulties is to employ the permutation entropy, introduced by Bandt and Pompe [7]. This complexity measure is basically the Shannon entropy of the distribution of the permutations associated with d-dimensional partitions (x_k, x_{k+1}, ..., x_{k+d-1}) of a time series (x_1, ..., x_m). It is common to choose d ∈ {3, 4, 5, 6, 7} in most practical applications, in such a way that the number of permutations d! is much smaller than m. For this reason, the computational cost of computing the permutation entropy is usually lower than that of other complexity measures. Also, the idea of associating permutations with finite-dimensional partitions of a time series allows the application of this method to time series of arbitrary nature. These remarks agree with the fact that the method of Bandt and Pompe is already widely spread over the scientific community [8, 9, 10, 11, 12, 13, 14, 15, 16, 17].

In some cases, the permutation entropy can distinguish among time series of regular, chaotic and stochastic behavior. However, Rosso et al. [18] have demonstrated that this complexity measure alone is not enough for properly performing this task. For instance, they have shown that time series related to the logistic map at fully developed chaos and time series associated with power-law correlated noises can display practically the same value of permutation entropy. Mainly because of that, Rosso et al. have employed the joint use of the permutation entropy and another complexity measure, called the statistical complexity [19, 20, 21].
The statistical complexity is basically the product of the permutation entropy and a distance between the distribution of the permutations and the uniform distribution. Having the values of the permutation entropy H and the statistical complexity C associated with a given time series, Rosso et al. have represented this series by a point (H, C) in a diagram of C versus H. This diagram is the so-called complexity-entropy causality plane, where the term causality refers to the fact that temporal correlations are taken into account by the Bandt and Pompe approach. In this representation space, time series of chaotic and stochastic nature are represented by points located in different regions; that is, noise and chaos can be distinguished by using the complexity-entropy causality plane.

However, we have recently depicted several situations in which the values of H and C are not enough for distinguishing among time series of distinct nature [22]; for instance, the points (H, C) can become very close to each other for time series displaying different periodic and chaotic behaviors. Motivated by this fact, we have extended the causality plane by considering the Tsallis [23] monoparametric generalization of the Shannon entropy [22]. The values of the parameter of the Tsallis entropy give different weights to the probabilities associated with the permutations; consequently, different dynamical scales of the system are accessed by varying the entropy parameter. In that article, we associated parametric curves with time series based on the different values of (H, C) obtained by changing the Tsallis entropy parameter, a representation that we call the complexity-entropy curve. These curves have proven to enhance the differentiation of time series of regular, chaotic and stochastic nature even in cases in which the usual complexity-entropy causality plane does not provide useful information.

On the other hand, and similarly to what happens with the concept of complexity, there are several other entropy definitions in the context of information theory [24, 25, 26]. These different entropies allow us to explore, capture and quantify different forms of complexity, leading to more suitable descriptions for characterizing the most diverse complex systems addressed by physicists. Here we further explore this idea by considering the Rényi entropy [27] in place of the Tsallis entropy [23]. The Rényi entropy is also a monoparametric generalization of the Shannon entropy, which has been employed in several contexts such as medical/diagnostic applications [28], time-frequency analysis [29], quantum entanglement measures [30, 31], and image thresholding [32]. Therefore, in analogy with the Tsallis entropy case, we shall associate a parametric curve with a given time series (the Rényi complexity-entropy curve), and by exploring some properties of this curve, we can characterize the time series under study. Among other findings, we show that the curvature of these curves identifies whether a time series is of a stochastic or a chaotic nature, and that periodic time series are represented by vertical straight lines.

The organization of the article is as follows. Section 2 provides the definitions of the Rényi entropy and the Rényi statistical complexity. Section 3 gives a brief description of the method of Bandt and Pompe for defining the ordinal probabilities from a given time series. We also work out a list of general properties of the Rényi complexity-entropy curves. In Section 4, we analyze several time series of chaotic and stochastic nature, obtained by numerical procedures or by experimental measurements. Finally, we conclude in Section 5.
2. A definition of a statistical complexity based on the Rényi entropy
The Rényi entropy of a discrete probability distribution p = (p_1, ..., p_n) is defined as [27]

    S_α(p) = \frac{1}{1 - α} \ln \sum_{i=1}^{n} p_i^α ,   α > 0, α ≠ 1.   (1)

From this definition, we immediately note that S_α(p) recovers the Shannon entropy of p when α tends to 1. We can further verify that the maximum value of the Rényi entropy is equal to ln n (as in the Shannon entropy case), which happens when the uniform distribution u = (1/n, ..., 1/n) is considered. This enables us to define the normalized Rényi entropy of p as

    H_α(p) = \frac{S_α(p)}{\ln n} .   (2)

By following Martin, Plastino and Rosso [33], we define the Rényi statistical complexity of p as

    C_α(p) = \frac{D_α(p) H_α(p)}{D^*_α} ,   (3)

where

    D_α(p) = \frac{1}{2(α - 1)} \left[ \ln \sum_{i=1}^{n} p_i^α \left( \frac{p_i + 1/n}{2} \right)^{1-α} + \ln \sum_{i=1}^{n} n^{-α} \left( \frac{p_i + 1/n}{2} \right)^{1-α} \right]   (4)

and

    D^*_α = \frac{1}{2(α - 1)} \ln \left[ \frac{(n + 1)^{1-α} + n - 1}{n} \left( \frac{n + 1}{4n} \right)^{1-α} \right] .   (5)

The quantity D_α(p) is always non-negative [34] and can be interpreted as a distance between the distribution p = (p_1, ..., p_n) and the uniform distribution u = (1/n, ..., 1/n). In fact, D_α(p) is a generalization of the Jensen-Shannon divergence between p and u [35]. The quantity D^*_α represents the maximum value of D_α(p), which is reached when p has only one non-zero component. By combining this fact with Eq. (3), we have that 0 ≤ C_α(p) ≤ 1.
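The definitions above translate directly into code. The following is a minimal sketch of Eqs. (1)-(5), not the authors' implementation; the function names are ours, the case α = 1 is excluded (as in Eq. (1)), and zero-probability components are handled via the convention 0^α = 0 for α > 0:

```python
import math

def renyi_entropy(p, alpha):
    """S_alpha(p) of Eq. (1); requires alpha > 0, alpha != 1."""
    s = sum(pi ** alpha for pi in p if pi > 0)
    return math.log(s) / (1.0 - alpha)

def normalized_renyi_entropy(p, alpha):
    """H_alpha(p) of Eq. (2), with n = len(p)."""
    return renyi_entropy(p, alpha) / math.log(len(p))

def renyi_divergence_to_uniform(p, alpha):
    """D_alpha(p) of Eq. (4): a Jensen-like distance between p and u."""
    n = len(p)
    t1 = sum(pi ** alpha * ((pi + 1.0 / n) / 2.0) ** (1.0 - alpha)
             for pi in p if pi > 0)
    t2 = sum(n ** (-alpha) * ((pi + 1.0 / n) / 2.0) ** (1.0 - alpha)
             for pi in p)
    return (math.log(t1) + math.log(t2)) / (2.0 * (alpha - 1.0))

def d_star(n, alpha):
    """D*_alpha of Eq. (5): the maximum of D_alpha, reached at p = (1, 0, ..., 0)."""
    inner = (((n + 1) ** (1.0 - alpha) + n - 1) / n
             * ((n + 1) / (4.0 * n)) ** (1.0 - alpha))
    return math.log(inner) / (2.0 * (alpha - 1.0))

def renyi_statistical_complexity(p, alpha):
    """C_alpha(p) of Eq. (3)."""
    return (renyi_divergence_to_uniform(p, alpha)
            * normalized_renyi_entropy(p, alpha) / d_star(len(p), alpha))
```

For the uniform distribution this gives H_α = 1 and C_α = 0 for every admissible α, while a distribution with a single non-zero component gives D_α = D^*_α and C_α = 0, consistent with the bounds 0 ≤ C_α(p) ≤ 1.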
3. Rényi complexity-entropy curves in the framework of Bandt and Pompe
We start by briefly describing the method of Bandt and Pompe [7] for defining the ordinal probabilities from a given time series (x_1, ..., x_m). Fixed an integer d > 1 such that d! ≪ m, we consider the set E of all d-dimensional vectors (x_k, ..., x_{k+d-1}), where k = 1, ..., m - d + 1. We next define a mapping Π from E into the set of all permutations of the set {0, 1, ..., d - 1} such that Π(x_k, ..., x_{k+d-1}) = π, where the permutation π, which can be represented as a vector (π(0), ..., π(d - 1)), satisfies

(i) x_{k+π(0)} ≤ x_{k+π(1)} ≤ ... ≤ x_{k+π(d-1)};
(ii) if x_{k+π(j)} = x_{k+π(j+1)}, then π(j) < π(j + 1).

Finally, we define the probabilities

    p(π) = \frac{\#\{v ∈ E : Π(v) = π\}}{m - d + 1} ,   (6)

where the symbol # denotes the number of elements of a set. As an example, for the time series (3, 5, 1, 6, 6, 4) and d = 2, the vectors in E are (3, 5), (5, 1), (1, 6), (6, 6) and (6, 4). We have Π(3, 5) = Π(1, 6) = Π(6, 6) = (0, 1) and Π(5, 1) = Π(6, 4) = (1, 0). Therefore, p(0, 1) = 3/5 and p(1, 0) = 2/5.

Given a time series (x_1, ..., x_m), we calculate the normalized Rényi entropy and the statistical complexity by using Eqs. (2) and (3) with n = d!. In this manner, for each embedding dimension d, we construct a parametric curve C_α(p) versus H_α(p), considering α as a real parameter that takes values in the interval (0, ∞). We call these curves the Rényi complexity-entropy curves, by analogy with the q-complexity-entropy curves obtained using the Tsallis entropy [22]. Since H_α(p) is a monotonically non-increasing function of α [34, 36], Rényi complexity-entropy curves are never closed, i.e., they do not form loops, in contrast with q-complexity-entropy curves, which are likely to form loops for time series related to stochastic processes.

We can prove the following general properties of the Rényi complexity-entropy curves associated with an arbitrary time series in the framework of Bandt and Pompe, considering an embedding dimension d (see Appendix A for details):

(i) If only one permutation occurs (for instance, if the time series is strictly monotonic), the Rényi complexity-entropy curve reduces to the single point (0, 0).

(ii) If all the d! permutations occur, the curve begins at the point (1, 0) (as α ↓ 0, that is, as α tends to zero from the right) and ends at the point (h_f, c_f), corresponding to α → ∞. From h_f, we obtain the value of the maximum component p_M of the probability distribution by using the simple relation p_M = (d!)^{-h_f}. The ending point (h_f, c_f) also gives us the value of the minimum component p_m of the probability distribution by the relation

    p_m = \frac{4 p_M}{d!\, p_M + 1} \left( \frac{d! + 1}{4\, d!} \right)^{c_f/h_f} - \frac{1}{d!} .   (7)

(iii) If r permutations occur, with 1 < r < d!, then the Rényi complexity-entropy curve begins at a point (h_i, c_i) and ends at a point (h_f, c_f). The starting point gives us the value of r by means of the simple relation r = (d!)^{h_i}. From the ending point, we obtain the value of the maximum component p_M of the probability distribution by using the relation p_M = (d!)^{-h_f}.
In this case, the minimum component of p is clearly zero.

An interesting consequence of item (iii) happens when the r permutations that actually occur have the same probability, namely 1/r. In this case, we have that p_M = 1/r, and from r = (d!)^{h_i} and p_M = (d!)^{-h_f}, we obtain that h_f = h_i = ln r / ln d!. Thus, the corresponding Rényi complexity-entropy curve is a vertical straight line; we can verify straightforwardly that the statistical complexity still depends on the parameter α in this case. This situation can arise from the analysis of a time series displaying a periodic behavior.
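The ordinal-pattern construction above admits a compact implementation; a sketch (the function name is ours, not from the paper). Python's stable sort reproduces the tie-breaking rule (ii) automatically, since equal values keep their temporal order:

```python
from collections import Counter
from itertools import permutations

def ordinal_distribution(x, d):
    """Return the probabilities p(pi) of Eq. (6) over all d! permutations.

    The permutation assigned to a window (x_k, ..., x_{k+d-1}) is the
    tuple of indices that sorts it in ascending order; ties are broken
    by temporal order, i.e. rule (ii).
    """
    m = len(x)
    counts = Counter(
        tuple(sorted(range(d), key=lambda j: x[k + j]))
        for k in range(m - d + 1)
    )
    total = m - d + 1
    return {pi: counts.get(pi, 0) / total for pi in permutations(range(d))}
```

For the worked example above, `ordinal_distribution([3, 5, 1, 6, 6, 4], 2)` assigns probability 3/5 to (0, 1) and 2/5 to (1, 0). Feeding these probabilities (with n = d!) into Eqs. (2) and (3) and sweeping α over a grid in (0, ∞) then traces out the Rényi complexity-entropy curve.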
4. Characterization of time series via Rényi complexity-entropy curves
In this section, we explore the Rényi complexity-entropy curves associated with several time series using the procedure described in the previous section. In our analysis, we consider time series obtained by numerical procedures (such as chaotic maps) and experimental measurements (such as fluctuations of crude oil prices).
4.1. Fractional Brownian motion

A fractional Brownian motion is a stochastic process (B^H_t)_{t≥0}, characterized by a parameter H ∈ (0, 1), whose correlations are given by

    E(B^H_t B^H_s) = \frac{1}{2} \left( t^{2H} + s^{2H} - |t - s|^{2H} \right) .   (8)

The parameter H is usually called the Hurst parameter or the Hurst exponent. If H = 1/2, the increments B^H_{t_1} - B^H_{s_1} and B^H_{t_2} - B^H_{s_2}, with s_1 < t_1 ≤ s_2 < t_2, are independent and the usual Brownian motion is recovered. If H > 1/2 (H < 1/2), the increments are positively (negatively) correlated. For 1/2 < H < 1, the fractional Brownian motion exhibits long-range correlations, in the sense that [39]

    \sum_{n=1}^{∞} \left| E\left[ (B^H_{n+1} - B^H_n)(B^H_1 - B^H_0) \right] \right| = ∞ .   (9)

We numerically generate time series from fractional Brownian motions with different Hurst exponents using Hosking's procedure [40]. Figure 1 shows the Rényi complexity-entropy curves related to fractional Brownian motions with Hurst exponents H ∈ {0.1, 0.2, ..., 0.8}. In the left panels, an embedding dimension d = 3 has been considered, whereas d = 4 has been used for the right panels. Each colored curve in Fig. 1 represents the mean Rényi complexity-entropy curve over 100 realizations associated with a given Hurst parameter, and the shaded areas are obtained by considering two standard deviations in both the H_α and C_α values. The dashed lines were obtained using the analytical expressions for the probabilities of the permutations when d = 3 and d = 4, obtained by Bandt and Shiha [41] (see also Appendix B of Ref. [22]).

Figure 1: The Rényi complexity-entropy curves related to fractional Brownian motions with Hurst exponents
H ∈ {0.1, 0.2, ..., 0.8}. We have considered the embedding dimensions d = 3 for the left panels and d = 4 for the right panels. The colored lines represent the mean Rényi complexity-entropy curves over 100 realizations, and the shaded areas are obtained by considering two standard deviations in both the H_α and C_α values. The dashed lines are the Rényi complexity-entropy curves obtained analytically using the exact probabilities given in Ref. [41]. The markers + and ◦ indicate the beginning and the end of each curve, respectively. The ▶ markers indicate the ordered pair associated with the usual permutation entropy and statistical complexity values obtained when α tends to 1.

We note immediately from Fig. 1 that all Rényi complexity-entropy curves start at the point (1, 0) for both d = 3 and d = 4. We further observe a good agreement between the numerically obtained curves and the exact results. Another common characteristic of these curves is that they have a positive curvature in a neighborhood of the starting point (1, 0), i.e., the derivative dC_α/dH_α is a monotonic increasing function of α in a neighborhood of α = 0, as shown in Fig. 2. Putting H_∞ = lim_{α→∞} H_α, we observe from Fig. 1 that H_∞ is a monotonically decreasing function of the Hurst parameter. Moreover, the points associated with these values (indicated by ◦ markers) are more distant from each other than the ones related to the usual Shannon entropy (indicated by ▶ markers). Thus, these points associated with H_∞ enable a better differentiation of time series related to fractional Brownian motions with different Hurst exponents, as was also discussed in Ref. [42].

Figure 2: Representation of the derivative of the statistical complexity C_α with respect to the entropy H_α as a function of the Rényi parameter α for the fractional Brownian motions with Hurst parameters H ∈ {0.1, 0.2, ..., 0.8}, the same considered in Fig. 1.

4.2. Chaotic maps

In addition to time series obtained from stochastic processes, we also investigate time series associated with chaotic phenomena. In particular, we construct time series by iterating eight different chaotic maps, namely the Burgers, cubic, Gingerbreadman, Hénon, logistic, Ricker, sine, and Tinkerbell maps, at fully developed chaos. Details about each map are provided in Appendix C of Ref. [22]. For the two-dimensional chaotic maps (Burgers, Gingerbreadman, Hénon and Tinkerbell), we have considered the time series of the square of the sum of the two components. To avoid any possible transient behavior, we have removed the initial iterations in all simulations.

Figure 3 shows the Rényi complexity-entropy curves associated with the eight chaotic maps mentioned in the previous paragraph. For each map, we have considered the embedding dimensions d ∈ {3, 4, 5, 6}. In contrast with the curves associated with fractional Brownian motions, the Rényi complexity-entropy curves related to chaotic maps begin at points different from (1, 0), at least for d >
4. This feature indicates that there are permutations that do not occur. We further note that, at least for d = 6, the Rényi complexity-entropy curves have negative curvature in a neighborhood of their initial point, i.e., the derivative dC_α/dH_α is a decreasing function of the Rényi parameter α in the neighborhood of α = 0, as shown in Fig. 4.

Figure 3: Rényi complexity-entropy curves of various chaotic maps (Burgers, cubic, Gingerbreadman, Hénon, logistic, Ricker, sine, and Tinkerbell) at fully developed chaos. The embedding dimensions d ∈ {3, 4, 5, 6} were considered for each chaotic map. The markers + and ◦ indicate the beginning and the end of each curve, respectively. The ▶ markers indicate the ordered pair associated with the usual permutation entropy and statistical complexity obtained when α tends to 1.
Figure 4: Representation of the derivative of the statistical complexity C_α with respect to the entropy H_α as a function of the Rényi parameter α for the chaotic maps considered in Fig. 3.

4.3. The logistic map

Among the chaotic maps considered in the previous subsection, the logistic map is undoubtedly one of the most famous. This map is defined by the recurrence formula

    y_{k+1} = a y_k (1 - y_k) ,   (10)

where a is a real parameter whose values of interest are in the interval [0, 4]. The value of a changes the behavior of the logistic map. For instance, this map exhibits simple periodic behavior for a = 3.05, stable cycles of period 4 for a = 3.5, stable cycles of period 8 for a = 3.55, and chaos for most values of a > 3.56995..., and for a = 4 (fully developed chaos).

Figure 5a shows the Rényi complexity-entropy curves associated with the logistic map for a ∈ {3.05, 3.5, 3.55, 3.593, 4} and embedding dimension d = 4. We note that all curves begin at points different from (1, 0). The curves for a = 3.593 and a = 4, which correspond to two chaotic regimes of the logistic map, show negative curvature in a neighborhood of their starting points. On the other hand, for the other values of a, for which the logistic map has a periodic behavior, the Rényi complexity-entropy curves are almost vertical lines. This indicates that, for each a ∈ {3.05, 3.5, 3.55}, the permutations that actually occur have the same probability. This can be verified directly using the representation of the corresponding time series. Moreover, the Rényi complexity-entropy curves for a = 3.5 and a = 3.55 coincide, suggesting that both time series have the same probability distribution of the permutations. In fact, a detailed inspection of these time series reveals that the permutations that actually occur are the same for both of them. Another curious fact from Fig. 5a is that the points associated with the usual permutation entropy and statistical complexity for a = 3.55 and a = 3.593 almost coincide, even though these two values correspond to completely different regimes of the logistic map.

The logistic map specially attracts our interest because the probabilities of the permutations obtained from its time series within the Bandt and Pompe approach can be found analytically when a = 4 and d = 3. In fact, Amigó et al. [43, 44, 45, 46] have shown that the list (y_k, y_{k+1}, y_{k+2}) corresponds to the permutation (0, 1, 2) if 0 < y_k < 1/4. In the same way, the permutation (0, 2, 1) occurs if 1/4 < y_k < (5 - √5)/8, (2, 0, 1) if (5 - √5)/8 < y_k < 3/4, (1, 0, 2) if 3/4 < y_k < (5 + √5)/8, (1, 2, 0) if (5 + √5)/8 < y_k < 1, and no list (y_k, y_{k+1}, y_{k+2}) corresponds to the permutation (2, 1, 0). Since the logistic map with a = 4 has an invariant distribution concentrated in the interval (0, 1) with density π^{-1} y^{-1/2} (1 - y)^{-1/2} [47], we obtain the probabilities of the permutations by integrating this density over each interval in which each permutation occurs. Thus, we obtain the distribution p = (1/3, 1/15, 4/15, 2/15, 1/5, 0) (following the order of appearance of the permutations), from which we analytically compute H_α(p) and C_α(p). Figure 5b shows a comparison between the numerical and the analytical Rényi complexity-entropy curves associated with the logistic map for a = 4 and embedding dimension d = 3, where we note that both curves practically coincide.

Figure 5: (a) The Rényi complexity-entropy curves associated with the logistic map for typical values of the parameter a (a ∈ {3.05, 3.5, 3.55, 3.593, 4.0}) and embedding dimension d = 4. The + and ◦ markers represent the beginning and the end of the curves, respectively. The ▶ marker identifies the point associated with the usual permutation entropy and statistical complexity, obtained when α tends to 1. (b) The Rényi complexity-entropy curve associated with the logistic map for a = 4 and d = 3. The dotted curve was analytically obtained using the exact values of the probabilities of the permutations. The meanings of the +, ◦ and ▶ markers are the same as in panel (a).

The analyses performed in the previous subsections suggest that the Rényi complexity-entropy curves enable the distinction of time series associated with stochastic processes, chaotic phenomena, and periodic behaviors. More precisely, a positive curvature of this curve in a neighborhood of its starting point indicates that the associated time series is of a stochastic nature, whereas a negative curvature, at least for large embedding dimensions, suggests that the time series may be of a chaotic nature. Also, a vertical Rényi complexity-entropy curve indicates that the time series has a periodic behavior. To further explore these claims, we now consider time series obtained by experimental measurements.
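Before turning to the experimental series, the exact distribution quoted above for the logistic map at a = 4 with d = 3 can be cross-checked by direct iteration of the map; a sketch (the seed, the discarded transient, and the series length are our choices, not the paper's):

```python
# Estimate the d = 3 ordinal distribution of the logistic map at a = 4
# and compare it with the exact values p = (1/3, 1/15, 4/15, 2/15, 1/5, 0).
from collections import Counter

def logistic_series(y0, n, a=4.0, transient=1000):
    y = y0
    for _ in range(transient):      # discard possible transient behavior
        y = a * y * (1.0 - y)
    out = []
    for _ in range(n):
        y = a * y * (1.0 - y)
        out.append(y)
    return out

x = logistic_series(0.123456, 200000)
total = len(x) - 2
freq = Counter(tuple(sorted(range(3), key=lambda j: x[k + j]))
               for k in range(total))

exact = {(0, 1, 2): 1 / 3, (0, 2, 1): 1 / 15, (2, 0, 1): 4 / 15,
         (1, 0, 2): 2 / 15, (1, 2, 0): 1 / 5, (2, 1, 0): 0.0}
estimated = {pi: freq.get(pi, 0) / total for pi in exact}
```

The estimated frequencies agree with the exact values up to the statistical fluctuations of the finite sample, and the forbidden pattern (2, 1, 0) indeed never occurs, since two consecutive decreases would require y_{k+1} > 3/4 and y_{k+1} < 3/4 simultaneously.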
In particular, we consider the time series generated from the intensity pulsations of a laser [48], which is of a chaotic nature, and the one associated with the fluctuations of the crude oil price, which is of a stochastic nature.

The time series associated with the chaotic intensity pulsations of a laser has 9093 terms and is freely available on the Internet [49]. The time series related to the crude oil prices refers to the daily closing spot price of the West Texas Intermediate from January 2, 1986, to July 10, 2012. This time series has 7788 terms and can also be obtained freely on the Internet [50]. The Rényi complexity-entropy curves for both time series and their derivatives as functions of the Rényi parameter α are represented in Fig. 6 for embedding dimensions d ∈ {3, 4, 5, 6}. We note that the Rényi complexity-entropy curves associated with the intensity pulsations of a laser start at points different from (1, 0) for embedding dimensions d > 3, indicating the lack of some permutations. Moreover, the curvature of these curves in a neighborhood of their initial point is negative for embedding dimensions d ≥ 5, in agreement with the chaotic nature of the corresponding time series. The Rényi complexity-entropy curves associated with the fluctuations of the crude oil price begin at the point (1, 0), except for d = 6. A possible justification for this fact is that the length of the series is not great enough; note that 6! = 720, which is not too much less than 7788. On the other hand, the curvature of these curves in a neighborhood of their starting points is positive, in agreement with the stochastic nature of the corresponding time series.

In addition to time series with a well-defined nature, we also analyze the time series of the monthly smoothed sunspot index, for which there is still no consensus about its stochastic or chaotic nature [51, 52, 53, 54, 55, 56, 57]. This time series is freely available on the Internet [58] and was generated by analyzing the 13-month smoothed monthly sunspot index from 1749 to 2016, yielding a time series with 3202 terms. The Rényi complexity-entropy curves associated with this time series and their derivatives as functions of the Rényi parameter α for embedding dimensions d ∈ {3, 4, 5, 6} are represented in Fig. 6. We note that the Rényi complexity-entropy curves start at the point (1, 0), except for d = 5 and d = 6. Moreover, the curvature of these curves is positive for embedding dimensions d ∈ {3, 4, 5} and is negative for d = 6. For these reasons, we are not in a position to state anything definitive about the nature of the corresponding time series, mainly because of its small length.
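The starting-point criterion used throughout this section (as α ↓ 0, H_α → ln r / ln d!, where r is the number of occurring permutations) can be probed with a minimal numerical sketch. The series lengths, the seed, and the choice d = 5 below are ours; white noise stands in for a generic stochastic series and the logistic map for a chaotic one:

```python
# Sketch: a stochastic series realizes all d! ordinal patterns, so its
# Renyi complexity-entropy curve starts at H = 1; a chaotic series with
# forbidden patterns starts at H = ln r / ln d! < 1 (Section 3).
import math
import random
from collections import Counter

def ordinal_probs(x, d):
    m = len(x)
    counts = Counter(tuple(sorted(range(d), key=lambda j: x[k + j]))
                     for k in range(m - d + 1))
    return [c / (m - d + 1) for c in counts.values()]

def h_alpha(p, n, alpha):
    """Normalized Renyi entropy, Eqs. (1)-(2); alpha > 0, alpha != 1."""
    return math.log(sum(pi ** alpha for pi in p)) / ((1.0 - alpha) * math.log(n))

random.seed(7)
noise = [random.gauss(0.0, 1.0) for _ in range(100000)]   # stochastic series
y, chaos = 0.3, []
for _ in range(100000):                                   # chaotic series
    y = 4.0 * y * (1.0 - y)
    chaos.append(y)

d, n = 5, math.factorial(5)                # n = d! = 120 possible patterns
p_noise, p_chaos = ordinal_probs(noise, d), ordinal_probs(chaos, d)
h0_noise = math.log(len(p_noise)) / math.log(n)   # starting entropy, noise
h0_chaos = math.log(len(p_chaos)) / math.log(n)   # starting entropy, chaos
```

Here the noise realizes all 120 patterns (starting entropy exactly 1), while the logistic series misses the patterns containing forbidden ordinal structures and starts at H < 1, mirroring the behavior reported above for the chaotic maps and the laser series.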
5. Conclusions
We have considered a generalized definition of the statistical complexity based on a monoparametric generalization of the Shannon entropy, namely the Rényi entropy. We have used this generalized statistical complexity C_α in combination with the normalized Rényi entropy H_α to construct a parametric curve C_α versus H_α, taking the Rényi parameter α > 0 as the parameter of the curve. We have shown that stochastic time series yield curves with positive curvature in a neighborhood of their starting points, that chaotic time series yield curves with negative curvature, and that periodic time series are represented by vertical straight lines. In contrast with the q-complexity-entropy curves [22], which are based on the Tsallis entropy, the Rényi complexity-entropy curves never form loops. Thus, Rényi complexity-entropy curves may give complementary information about time series that is not properly taken into account by the q-complexity-entropy curves.

Acknowledgments
M.J. thanks the financial support of CNPq under Grant 150577/2017-6. H.V.R. acknowledges the financial support of CNPq under Grant 440650/2014-3. L.Z. acknowledges Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), Argentina, for the financial support. E.K.L. thanks the financial support of CNPq under Grant No. 303642/2014-9.
Figure 6: (Left panels) Rényi complexity-entropy curves for time series associated with the chaotic intensity pulsations of a laser, the fluctuations of the crude oil price, and the monthly smoothed sunspot index. The embedding dimensions d ∈ {3, 4, 5, 6} were considered for each time series. The + and ◦ markers denote the beginning and the end of each curve. The ▶ markers represent the ordered pairs corresponding to the usual permutation entropy and statistical complexity, obtained when α tends to 1. (Right panels) Representation of the derivative of C_α with respect to H_α as a function of the Rényi parameter α for the time series and embedding dimensions considered in the left panels.

Appendix A. Limit values of H_α and C_α

Theorem.
Given an arbitrary discrete probability distribution p = (p_1, ..., p_n), let r be the number of non-zero components of p. If r = 1, then

(i) H_α(p) = 0 and C_α(p) = 0;

if r > 1 and p_M, p_m ∈ [0, 1] denote the maximum and minimum components of p, respectively, then

(ii) \lim_{α↓0} H_α(p) = \ln r / \ln n ;

(iii) \lim_{α→∞} H_α(p) = -\ln p_M / \ln n ;

(iv) \lim_{α↓0} C_α(p) = \frac{\ln r}{\ln n} \, \frac{\ln(n + r) - \ln 2n}{\ln(n + 1) - \ln 2n} ;

(v) \lim_{α→∞} C_α(p) = \frac{\ln[(n p_M + 1)(n p_m + 1)] - \ln 4 n p_M}{\ln 4n - \ln(n + 1)} \, \frac{\ln p_M}{\ln n} .

Proof. Items (i) and (ii) follow directly from the definitions of H_α and C_α.

(iii) We have p_M^α ≤ \sum_{i=1}^{n} p_i^α ≤ r p_M^α and, consequently, \ln p_M^α ≤ \ln(\sum_{i=1}^{n} p_i^α) ≤ \ln(r p_M^α). Hence, for α > 1,

    \frac{α}{1 - α} \frac{\ln p_M}{\ln n} ≥ \frac{1}{(1 - α) \ln n} \ln \sum_{i=1}^{n} p_i^α ≥ \frac{α}{1 - α} \frac{\ln p_M}{\ln n} + \frac{\ln r}{(1 - α) \ln n} .   (A.1)

Letting α increase without bound, we obtain H_α(p) → -\ln p_M / \ln n.

(iv) We verify immediately that

    \lim_{α↓0} D^*_α = -\frac{1}{2} \ln \frac{n + 1}{2n}   (A.2)

and

    \lim_{α↓0} D_α(p) = -\frac{1}{2} \ln \left( \frac{1}{2} + \frac{r}{2n} \right) .   (A.3)

Hence,

    \lim_{α↓0} C_α(p) = \frac{\ln r}{\ln n} \, \frac{\ln(n + r) - \ln 2n}{\ln(n + 1) - \ln 2n} .   (A.4)

(v) We have

    \frac{D_α(p)}{D^*_α} = \frac{\ln\left[ \sum_{i=1}^{n} p_i \left( \frac{p_i + 1/n}{2 p_i} \right)^{1-α} \sum_{j=1}^{n} \frac{1}{n} \left( \frac{p_j + 1/n}{2/n} \right)^{1-α} \right]}{(α - 1) \ln \frac{4n}{n + 1} + \ln \frac{(n + 1)^{1-α} + n - 1}{n}} .   (A.5)

To find the limit of Eq. (A.5) as α → ∞, we start by obtaining convenient upper and lower bounds of D_α(p)/D^*_α. With this in mind, we note that

    \frac{p_i + 1/n}{p_i} = 1 + \frac{1}{n p_i} ≥ 1 + \frac{1}{n p_M} = \frac{n p_M + 1}{n p_M}   (A.6)

and, consequently, for α > 1,

    \sum_{i=1}^{n} p_i \left( \frac{p_i + 1/n}{2 p_i} \right)^{1-α} ≤ \left( \frac{p_M + 1/n}{2 p_M} \right)^{1-α} .   (A.7)

On the other hand, using the fact that

    \frac{p_i + 1/n}{1/n} = n p_i + 1 ≥ n p_m + 1 ,   (A.8)

we obtain that

    \sum_{i=1}^{n} \frac{1}{n} \left( \frac{p_i + 1/n}{2/n} \right)^{1-α} ≤ \left( \frac{n p_m + 1}{2} \right)^{1-α} .   (A.9)

Since the denominator in Eq. (A.5) is positive for sufficiently large α, using Eqs. (A.7) and (A.9) in Eq. (A.5), we have

    \frac{D_α(p)}{D^*_α} ≤ \frac{(α - 1) \ln \frac{4 n p_M}{(n p_M + 1)(n p_m + 1)}}{(α - 1) \ln \frac{4n}{n + 1} + \ln \frac{(n + 1)^{1-α} + n - 1}{n}}   (A.10)

for sufficiently large α. To obtain a lower bound, we note immediately that

    p_M \left( \frac{p_M + 1/n}{2 p_M} \right)^{1-α} ≤ \sum_{i=1}^{n} p_i \left( \frac{p_i + 1/n}{2 p_i} \right)^{1-α}   (A.11)

and that

    \frac{1}{n} \left( \frac{p_m + 1/n}{2/n} \right)^{1-α} ≤ \sum_{i=1}^{n} \frac{1}{n} \left( \frac{p_i + 1/n}{2/n} \right)^{1-α} .   (A.12)

Hence,

    \frac{D_α(p)}{D^*_α} ≥ \frac{(α - 1) \ln \frac{4 n p_M}{(n p_M + 1)(n p_m + 1)} + \ln \frac{p_M}{n}}{(α - 1) \ln \frac{4n}{n + 1} + \ln \frac{(n + 1)^{1-α} + n - 1}{n}}   (A.13)

for sufficiently large values of α. Then, it follows from Eqs. (A.10) and (A.13) that

    \lim_{α→∞} \frac{D_α(p)}{D^*_α} = \frac{\ln 4 n p_M - \ln[(n p_M + 1)(n p_m + 1)]}{\ln 4n - \ln(n + 1)} .   (A.14)

The product of this result with the one obtained in item (iii) is equal to the limit of C_α(p) as α increases without bound. ∎

References

[1] C. E. Shannon, A mathematical theory of communication, Bell Syst. Tech. J. 27 (1948) 379. doi:10.1002/j.1538-7305.1948.tb01338.x.
[2] S. Kullback, R. A. Leibler, On information and sufficiency, Ann. Math. Statist. 22 (1951) 79. doi:10.1214/aoms/1177729694.
[3] A. N. Kolmogorov, Three approaches to the quantitative definition of information, Probl. Inf. Transm. 1 (1965) 3. doi:10.1080/00207166808803030.
[4] B. B. Mandelbrot, The fractal geometry of nature, Freeman, San Francisco, 1982.
[5] A. M. Lyapunov, The general problem of the stability of motion, Taylor-Francis, London, 1992.
[6] M. Perc, Nonlinear time series analysis of the human electrocardiogram, European Journal of Physics 26 (5) (2005) 757. doi:10.1088/0143-0807/26/5/008.
[7] C. Bandt, B. Pompe, Permutation entropy: a natural complexity measure for time series, Phys. Rev. Lett. 88 (2002) 174102. doi:10.1103/PhysRevLett.88.174102.
[8] C. Bian, C. Qin, Q. D. Y. Ma, Q. Shen, Modified permutation-entropy analysis of heartbeat dynamics, Phys. Rev. E 85 (2012) 021906. doi:10.1103/PhysRevE.85.021906.
[9] H. V. Ribeiro, L. Zunino, R. S. Mendes, E. K. Lenzi, Complexity-entropy causality plane: A useful approach for distinguishing songs, Physica A 391 (7) (2012) 2421–2428. doi:10.1016/j.physa.2011.12.009.
[10] H. V. Ribeiro, L. Zunino, E. K. Lenzi, P. A. Santoro, R. S. Mendes, Complexity-entropy causality plane as a complexity measure for two-dimensional patterns, PLoS ONE 7 (2012) e40689. doi:10.1371/journal.pone.0040689.
[11] A. Aragoneses, N. Rubido, J. Tiana-Alsina, M. C. Torrent, C.
Masoller, Distinguishing signatures of determinism and stochasticity in spiking complex systems, Sci. Rep. 3 (2013) 1778. doi:10.1038/srep01778.
[12] Q. Li, F. Zuntao, Permutation entropy and statistical complexity quantifier of non-stationarity effect in the vertical velocity records, Phys. Rev. E 89 (2014) 012905. doi:10.1103/PhysRevE.89.012905.
[13] P. J. Weck, D. A. Schaffner, M. R. Brown, R. T. Wicks, Permutation entropy and statistical complexity analysis of turbulence in laboratory plasmas and the solar wind, Phys. Rev. E 91 (2015) 023101. doi:10.1103/PhysRevE.91.023101.
[14] Y.-G. Yang, Q.-X. Pan, S.-J. Sun, P. Xu, Novel image encryption based on quantum walks, Sci. Rep. 5 (2015) 7784. doi:10.1038/srep07784.
[15] A. Aragoneses, L. Carpi, N. Tarasov, D. V. Churkin, M. C. Torrent, C. Masoller, S. K. Turitsyn, Unveiling temporal correlations characteristic of a phase transition in the output intensity of a fiber laser, Phys. Rev. Lett. 116 (2016) 033902. doi:10.1103/PhysRevLett.116.033902.
[16] H. Lin, A. Khurram, Y. Hong, Time-delay signatures in multi-transverse mode VCSELs subject to double-cavity polarization-rotated optical feedback, Optics Communications 377 (2016) 128-138. doi:10.1016/j.optcom.2016.05.044.
[17] L. Zunino, H. V. Ribeiro, Discriminating image textures with the multiscale two-dimensional complexity-entropy causality plane, Chaos, Solitons & Fractals 91 (2016) 679-688. doi:10.1016/j.chaos.2016.09.005.
[18] O. A. Rosso, H. A. Larrondo, M. T. Martin, A. Plastino, M. A. Fuentes, Distinguishing noise from chaos, Phys. Rev. Lett. 99 (2007) 154102. doi:10.1103/PhysRevLett.99.154102.
[19] R. López-Ruiz, H. L. Mancini, X. Calbet, A statistical measure of complexity, Phys. Lett. A 209 (1995) 321. doi:10.1016/0375-9601(95)00867-5.
[20] C. Anteneodo, A. R. Plastino, Some features of the López-Ruiz-Mancini-Calbet (LMC) statistical measure of complexity, Phys. Lett. A 223 (5) (1996) 348-354. doi:10.1016/S0375-9601(96)00756-6.
[21] P. W. Lamberti, M. T.
Martin, A. Plastino, O. A. Rosso, Intensive entropic non-triviality measure, Physica A 334 (2004) 119. doi:10.1016/j.physa.2003.11.005.
[22] H. V. Ribeiro, M. Jauregui, L. Zunino, E. K. Lenzi, Characterizing time series via complexity-entropy curves, Phys. Rev. E 95 (2017) 062106. doi:10.1103/PhysRevE.95.062106.
[23] C. Tsallis, Possible generalization of the Boltzmann-Gibbs statistics, J. Stat. Phys. 52 (1988) 479. doi:10.1007/BF01016429.
[24] R. M. Gray, Entropy and Information Theory, Springer Science & Business Media, 2011.
[25] T. M. Cover, J. A. Thomas, Elements of Information Theory, John Wiley & Sons, 2012.
[26] C. Beck, Generalised information and entropy measures in physics, Contemporary Physics 50 (4) (2009) 495-510. doi:10.1080/00107510902823517.
[27] A. Rényi, On measures of entropy and information, in: Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Contributions to the Theory of Statistics, University of California Press, Berkeley, Calif., 1961, pp. 547-561. URL http://projecteuclid.org/euclid.bsmsp/1200512181
[28] N. Kannathal, M. L. Choo, U. R. Acharya, P. Sadasivan, Entropies for detection of epilepsy in EEG, Computer Methods and Programs in Biomedicine 80 (3) (2005) 187-194. doi:10.1016/j.cmpb.2005.06.012.
[29] R. G. Baraniuk, P. Flandrin, A. J. E. M. Janssen, O. J. J. Michel, Measuring time-frequency information content using the Rényi entropies, IEEE Transactions on Information Theory 47 (4) (2001) 1391-1409. doi:10.1109/18.923723.
[30] M. Portesi, A. Plastino, Generalized entropy as a measure of quantum uncertainty, Physica A 225 (3-4) (1996) 412-430. doi:10.1016/0378-4371(95)00475-0.
[31] R. Islam, R. Ma, P. M. Preiss, M. E. Tai, A. Lukin, M. Rispoli, M. Greiner, Measuring entanglement entropy in a quantum many-body system, Nature 528 (7580) (2015) 77-83. doi:10.1038/nature15750.
[32] P. K. Sahoo, G.
Arora, A thresholding method based on two-dimensional Renyi's entropy, Pattern Recognition 37 (6) (2004) 1149-1161. doi:10.1016/j.patcog.2003.10.008.
[33] M. T. Martin, A. Plastino, O. A. Rosso, Generalized statistical complexity measures: geometrical and analytical properties, Physica A 369 (2006) 439. doi:10.1016/j.physa.2005.11.053.
[34] T. van Erven, P. Harremoës, Rényi divergence and Kullback-Leibler divergence, IEEE Transactions on Information Theory 60 (7) (2014) 3797-3820. doi:10.1109/TIT.2014.2320500.
[35] J. Lin, Divergence measures based on the Shannon entropy, IEEE Transactions on Information Theory 37 (1991) 145-151. doi:10.1109/18.61115.
[36] C. Beck, F. Schlögl, Thermodynamics of Chaotic Systems: An Introduction, Cambridge Nonlinear Science Series, Cambridge University Press, 1993. doi:10.1017/CBO9780511524585.
[37] B. B. Mandelbrot, J. W. Van Ness, Fractional Brownian motions, fractional noises and applications, SIAM Rev. 10 (4) (1968) 422. doi:10.1137/1010093.
[38] G. Shevchenko, Fractional Brownian motion in a nutshell, International Journal of Modern Physics: Conference Series 36 (2015) 1560002. doi:10.1142/S2010194515600022.
[39] C. Wijeratne, H. Bessaih, Fractional Brownian motion and an application to fluids, in: Stochastic Equations for Complex Systems, Springer, 2015, pp. 37-52.
[40] J. R. M. Hosking, Modeling persistence in hydrological time series using fractional differencing, Water Resources Research 20 (12) (1984) 1898-1908. doi:10.1029/WR020i012p01898.
[41] C. Bandt, F. Shiha, Order patterns in time series, Journal of Time Series Analysis 28 (5) (2007) 646-665. doi:10.1111/j.1467-9892.2007.00528.x.
[42] L. Zunino, F. Olivares, O. A. Rosso, Permutation min-entropy: An improved quantifier for unveiling subtle temporal correlations, EPL 109 (1) (2015) 10005. doi:10.1209/0295-5075/109/10005.
[43] J. M. Amigó, L. Kocarev, J. Szczepanski, Order patterns and chaos, Phys. Lett. A 355 (1) (2006) 27-31. doi:10.1016/j.physleta.2006.01.093.
[44] J. M.
Amigó, S. Zambrano, M. A. F. Sanjuán, True and false forbidden patterns in deterministic and random dynamics, EPL 79 (2007) 50001. doi:10.1209/0295-5075/79/50001.
[45] J. M. Amigó, S. Zambrano, M. A. F. Sanjuán, Combinatorial detection of determinism in noisy time series, EPL 83 (2008) 60005. doi:10.1209/0295-5075/83/60005.
[46] J. M. Amigó, Permutation Complexity in Dynamical Systems, Springer-Verlag, Berlin, 2010.
[47] M. V. Jakobson, Absolutely continuous invariant measures for one-parameter families of one-dimensional maps, Commun. Math. Phys. 81 (1) (1981) 39-88. doi:10.1007/BF01941800.
[48] U. Hübner, N. B. Abraham, C. O. Weiss, Dimensions and entropies of chaotic intensity pulsations in a single-mode far-infrared NH3 laser, Phys. Rev. A 40 (11) (1989) 6354. doi:10.1103/PhysRevA.40.6354.
[49] Time series A of the Santa Fe time series competition, https://rdrr.io/cran/TSPred/man/SantaFe.A.html, accessed: 2017-01-16.
[50] U.S. Energy Information Administration, , accessed: 2017-01-16.
[51] M. Carbonell, R. Oliver, J. L. Ballester, On the asymmetry of solar activity, Astron. Astrophys. 274 (1993) 497. URL http://adsabs.harvard.edu/abs/1993A%26A...274..497C
[52] M. Paluš, D. Novotná, Sunspot cycle: a driven nonlinear oscillator?, Phys. Rev. Lett. 83 (17) (1999) 3406. doi:10.1103/PhysRevLett.83.3406.
[53] J. Timmer, What can be inferred from surrogate data testing?, Phys. Rev. Lett. 85 (12) (2000) 2647. doi:10.1103/PhysRevLett.85.2647.
[54] M. Paluš, Paluš and Novotná reply, Phys. Rev. Lett. 85 (12) (2000) 2648. doi:10.1103/PhysRevLett.85.2648.
[55] P. D. Mininni, D. O. Gómez, G. B. Mindlin, Stochastic relaxation oscillator model for the solar cycle, Phys. Rev. Lett. 85 (25) (2000) 5476. doi:10.1103/PhysRevLett.85.5476.
[56] P. D. Mininni, D. O. Gómez, Study of stochastic fluctuations in a shell dynamo, Astrophys. J. 573 (1) (2002) 454. doi:10.1086/340495.
[57] M. De Domenico, V.
Latora, Fast detection of nonlinearity and nonstationarity in short and noisy time series, EPL 91 (3) (2010) 30005. doi:10.1209/0295-5075/91/30005.
[58] Sunspot index and long-term solar observations – monthly and smoothed sunspot number (1749-07 to 2016-04),
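The limiting value in Eq. (A.14) can also be checked numerically. The sketch below is ours, not part of the paper: it assumes that $D_\alpha(p)$ is the symmetrized Rényi divergence between $p$ and the uniform distribution $u$, each taken with respect to their equal-weight mixture $(p+u)/2$, and that $D_\alpha^{\ast}$ is the value of $D_\alpha$ at the most concentrated distribution $(1,0,\dots,0)$, consistent with the bounds above; the function names are illustrative.

```python
import math

def renyi_divergence(p, q, alpha):
    """R_alpha(p || q) = ln(sum_i p_i^alpha q_i^(1-alpha)) / (alpha - 1),
    evaluated in the log domain so large alpha does not overflow."""
    terms = [alpha * math.log(pi) + (1.0 - alpha) * math.log(qi)
             for pi, qi in zip(p, q) if pi > 0.0]  # zero-probability terms vanish for alpha > 1
    m = max(terms)
    return (m + math.log(sum(math.exp(t - m) for t in terms))) / (alpha - 1.0)

def jensen_renyi_disequilibrium(p, alpha):
    """D_alpha(p): half the sum of the Rényi divergences of p and of the
    uniform distribution from their equal-weight mixture."""
    n = len(p)
    u = [1.0 / n] * n
    mix = [(pi + ui) / 2.0 for pi, ui in zip(p, u)]
    return 0.5 * (renyi_divergence(p, mix, alpha) + renyi_divergence(u, mix, alpha))

def limit_ratio(p):
    """Closed-form limit of D_alpha(p) / D*_alpha as alpha -> infinity, Eq. (A.14)."""
    n, p_max, p_min = len(p), max(p), min(p)
    return (math.log(4 * n * p_max) - math.log((n * p_max + 1) * (n * p_min + 1))) \
        / (math.log(4 * n) - math.log(n + 1))

p = [0.5, 0.3, 0.2]
delta = [1.0] + [0.0] * (len(p) - 1)   # D*_alpha is D_alpha at the degenerate distribution
alpha = 5000.0
numeric = jensen_renyi_disequilibrium(p, alpha) / jensen_renyi_disequilibrium(delta, alpha)
# numeric and limit_ratio(p) agree to about 1e-3 at this alpha
print(numeric, limit_ratio(p))
```

The two printed values approach each other as alpha grows, with the residual gap of order $1/\alpha$ coming from the $\ln(p_M/n)$-type terms in the bounds (A.10) and (A.13).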