Almost sure rates of mixing for random intermittent maps
Marks Ruziboev
Abstract
We consider a family $\mathcal{F}$ of maps with two branches and a common neutral fixed point $0$ such that the order of tangency at $0$ belongs to some interval $[\alpha_0, \alpha_1] \subset (0,1)$. Maps in $\mathcal{F}$ do not necessarily share a common Markov partition. At each step a member of $\mathcal{F}$ is chosen independently with respect to the uniform distribution on $[\alpha_0, \alpha_1]$. We show that the construction of the random tower in Bahsoun-Bose-Ruziboev [5] with general return time can be carried out for random compositions of such maps. Thus their general results are applicable and give upper bounds for the quenched decay of correlations of the form $n^{1-1/\alpha_0+\delta}$ for any $\delta > 0$.

Dedicated to Abdulla Azamov and Leonid Bunimovich on the occasion of their 70th birthday.

1 Introduction

In recent years there has been remarkable interest in studying statistical properties of random dynamical systems induced by random compositions of different maps (see for example [1]-[6], [10], [13], [14], [17] and references therein). In [4] i.i.d. random compositions of two Liverani-Saussol-Vaienti (LSV) maps¹ were considered and it was shown that the rate of decay of the annealed (averaged over all realisations) correlations is given by the fast dynamics. Recently, general results on quenched decay rates (i.e. decay rates for almost every realisation) for random compositions of non-uniformly expanding maps were obtained in [5]. As an illustration it was shown ibidem that the general results are applicable to the random map induced by compositions of LSV maps with parameters in $[\alpha_0, \alpha_1] \subset (0,1)$ chosen with respect to a suitable distribution $\nu$ on $[\alpha_0, \alpha_1]$. In the current note we fix the uniform distribution on $[\alpha_0, \alpha_1]$ and consider a family of maps with a common neutral fixed point. Our maps do not share a common Markov partition. We show that the construction of the random tower of [5] with general return time can be carried out for random compositions of such maps. Hence the main result of [5] is applicable, and we obtain upper bounds for the quenched decay of correlations of the form $n^{1-1/\alpha_0+\delta}$ for any $\delta > 0$.

The paper is organised as follows. In Section 2 we define the family $\mathcal{F}$ and state the main result of the paper (Theorem 1). In Section 3 we construct a uniformly expanding induced random map and show that the assumptions required in [5] are satisfied, i.e. we check uniform expansion, bounded distortion, decay rates for the tail of the return time and aperiodicity. We also formulate a technical proposition in this section which is used to obtain the tail estimates and is proved in Section 4.

¹ A subclass of the so-called Pomeau-Manneville maps introduced in [18], and popularised by Liverani, Saussol and Vaienti in [15]. Such systems have attracted the attention of both mathematicians and physicists (see [14] for a recent work in this area).

Marks Ruziboev, Department of Mathematical Sciences, Loughborough University, Loughborough, Leicestershire, LE11 3TU, UK. e-mail:

2 The setting and the main result

In this section we define the main object of the current note: the random maps. Fix two real numbers $0 < \alpha_0 < \alpha_1 < 1$. Let $I = [0,1]$ and let $\mathcal{F}$ be a parametrised family of maps $T_\alpha : I \to I$, $\alpha \in [\alpha_0, \alpha_1]$, with the following properties.

(A1) There exists a $C^1$ function $x : [\alpha_0,\alpha_1] \to (0,1)$, $\alpha \mapsto x_\alpha$, such that $T_\alpha : [0, x_\alpha) \to [0,1)$ and $T_\alpha : [x_\alpha, 1] \to [0,1]$ are increasing diffeomorphisms.

(A2) $T'_\alpha(x) > 1$ for every $x > 0$.

(A3) There exist $\varepsilon > 0$ and, for every $\alpha \in [\alpha_0,\alpha_1]$, a constant $c_\alpha > 0$ and a function $f_\alpha$ with $f_\alpha(0) = 0$ such that $T_\alpha(x) = x + c_\alpha x^{1+\alpha}(1 + f_\alpha(x))$ for any $x \in [0,\varepsilon]$.

(A4) Every $T_\alpha$ is $C^3$ on $(0, x_\alpha]$ with negative Schwarzian derivative.

(A5) $(x,\alpha) \mapsto T''_\alpha(x)$ and $(x,\alpha) \mapsto T'_\alpha(x)$ are continuous on $I \times [\alpha_0,\alpha_1]$.

Notice that the elements of $\mathcal{F}$ are parametrised according to the order of tangency near $0$. Now we describe the randomising dynamics. Let $\eta$ be the normalised Lebesgue measure on $[\alpha_0,\alpha_1]$. Let $\Omega = [\alpha_0,\alpha_1]^{\mathbb{Z}}$ and $P = \eta^{\mathbb{Z}}$. Then the shift map $\sigma : \Omega \to \Omega$ preserves $P$, i.e. $\sigma_* P = P$. For $\omega \in \Omega$, $\omega = (\ldots, \omega_{-1}, \omega_0, \omega_1, \ldots)$, let $\alpha(\omega) = \omega_0 \in [\alpha_0,\alpha_1]$. The random map is formed by random compositions of maps $T_{\alpha(\omega)} : I \to I$ from $\mathcal{F}$, where the compositions are defined as

$T^n_\omega(x) = T_{\alpha(\sigma^{n-1}\omega)} \circ \cdots \circ T_{\alpha(\omega)}(x).$

Below we use the shorter notation $T^n_\omega = T_{\omega_{n-1}} \circ \cdots \circ T_{\omega_0}$. We are interested in studying the statistical properties of equivariant families of measures, i.e. families of measures $\{\mu_\omega\}_{\omega\in\Omega}$ such that $(T_\omega)_* \mu_\omega = \mu_{\sigma\omega}$. Let $\mu$ be a probability measure on $I \times \Omega$ such that $\mu(A) = \int_\Omega \mu_\omega(A) \, dP(\omega)$ for $A \subset I \times \Omega$. We say that the system $\{T_\omega, \mu_\omega\}_{\omega\in\Omega}$ (or simply $\{\mu_\omega\}_\omega$) is mixing if for all $\varphi, \psi \in L^2(\mu)$,

$\lim_{n\to\infty} \left| \int_\Omega \int_I \varphi_{\sigma^n\omega} \circ T^n_\omega \cdot \psi_\omega \, d\mu_\omega \, dP - \int_\Omega \int_I \varphi_\omega \, d\mu_\omega \, dP \int_\Omega \int_I \psi_\omega \, d\mu_\omega \, dP \right| = 0.$

Further, future and past correlations are defined as follows. Let $\varphi, \psi : I \to \mathbb{R}$ be two observables on $I$.
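For concreteness, the classical LSV maps $T_\alpha(x) = x(1 + 2^\alpha x^\alpha)$ on $[0, 1/2)$ and $T_\alpha(x) = 2x - 1$ on $[1/2, 1]$ form a natural example of such a family, with $x_\alpha = 1/2$, $c_\alpha = 2^\alpha$ and $f_\alpha \equiv 0$. A minimal sketch of the random composition $T^n_\omega$ with i.i.d. uniform parameters (the function names and parameter choices below are ours, for illustration only):

```python
import random

def lsv(alpha, x):
    # One LSV map: neutral fixed point at 0 (left branch),
    # uniformly expanding right branch onto [0, 1].
    if x < 0.5:
        return x * (1.0 + (2.0 * x) ** alpha)  # = x + 2^alpha * x^(1+alpha)
    return 2.0 * x - 1.0

def random_orbit(x0, n, a0=0.2, a1=0.8, seed=0):
    # T^n_omega: compose n maps whose parameters are drawn
    # i.i.d. from the uniform distribution on [a0, a1].
    rng = random.Random(seed)
    x, orbit = x0, [x0]
    for _ in range(n):
        x = lsv(rng.uniform(a0, a1), x)
        orbit.append(x)
    return orbit

orbit = random_orbit(0.3, 1000)
```

Averaging observables along many such orbits would give Monte Carlo estimates of the correlations defined next.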
Then we define future correlations as

$\mathrm{Cor}^f_n(\varphi,\psi) := \left| \int (\varphi \circ T^n_\omega)\, \psi \, d\mu_\omega - \int \varphi \, d\mu_{\sigma^n\omega} \int \psi \, d\mu_\omega \right|$

and past correlations as

$\mathrm{Cor}^p_n(\varphi,\psi) := \left| \int (\varphi \circ T^n_{\sigma^{-n}\omega})\, \psi \, d\mu_{\sigma^{-n}\omega} - \int \varphi \, d\mu_\omega \int \psi \, d\mu_{\sigma^{-n}\omega} \right|.$

Theorem 1. Let $T_\omega$ be the random map described above. Then for almost every $\omega \in \Omega$ there exists a family of absolutely continuous equivariant measures $\{\mu_\omega\}_\omega$ on $I$ which is mixing. Moreover, for every $\delta > 0$ there exist a full measure subset $\Omega' \subset \Omega$ and a random variable $C_\omega : \Omega \to \mathbb{R}_+$ which is finite on $\Omega'$ such that for any $\varphi \in L^\infty(I)$, $\psi \in C^\eta(I)$ there exists a constant $C_{\varphi,\psi} > 0$ so that

$\mathrm{Cor}^f_n(\varphi,\psi) \le C_\omega C_{\varphi,\psi}\, n^{1-1/\alpha_0+\delta} \quad \text{and} \quad \mathrm{Cor}^p_n(\varphi,\psi) \le C_\omega C_{\varphi,\psi}\, n^{1-1/\alpha_0+\delta}.$

Furthermore, there exist constants $C > 0$, $u' > 0$ and $0 < v' < 1$ such that $P\{C_\omega > n\} \le C e^{-u' n^{v'}}$.

Remark 1. Notice that in the deterministic setting every mapping in the family $\mathcal{F}$ admits an absolutely continuous invariant probability measure, which is polynomially mixing at the rate $n^{1-1/\alpha}$ if $T_\alpha(x) = x + c_\alpha x^{1+\alpha}(1 + f_\alpha(x))$ near $0$ (see [20], [8]). In the random setting the upper bounds we give are arbitrarily close to the sharp decay rate of the fastest mixing system in the family. Since the result holds for almost every $\omega \in \Omega$, and in principle there can be arbitrarily long compositions in $T^n_\omega$ of systems whose mixing rates are slower than that of $T_{\alpha_0}$, it is not expected that the mixing rate of the random system equals the mixing rate of the fastest mixing system in the family $\mathcal{F}$ with $C_\omega$ integrable at the same time.

Remark 2. We also remark that we choose the family $\mathcal{F}$ so that all the maps in it share the common neutral fixed point $0$. If instead we allow different maps to have distinct neutral fixed points, i.e. $T_\alpha(p(\alpha)) = p(\alpha)$ and $T'_\alpha(p(\alpha)) = 1$ only for a set of parameters $\alpha \in [\alpha_0,\alpha_1]$ of zero $\nu$-measure, with the maps expanding elsewhere, then the resulting random map is expanding on average. Hence one can apply spectral techniques as in [7] on the Banach space of quasi-Hölder functions from [12] or [19] and obtain exponential decay rates. Such systems are out of context in our setting, since we are after systems with only polynomial decay of correlations.

To prove the theorem we construct a random induced map (or random Young tower) for $T_\omega$ with the properties described in [5]. Below we briefly recall the definition of the induced map.
Let $m$ denote the Lebesgue measure on $I$ and let $\Lambda \subset I$ be a measurable subset. We say $T_\omega$ admits a random Young tower with base $\Lambda$ if for almost every $\omega \in \Omega$ there exist a countable partition $\{\Lambda_j(\omega)\}_j$ of $\Lambda$ and a return time function $R_\omega : \Lambda \to \mathbb{N}$ that is constant on each $\Lambda_j(\omega)$ such that

(P1) for each $\Lambda_j(\omega)$ the induced map $T^{R_\omega}_\omega : \Lambda_j(\omega) \to \Lambda$ is a diffeomorphism and there exists a constant $\beta > 1$ such that $(T^{R_\omega}_\omega)' > \beta$.

(P2) There exists $D > 0$ such that for every $\Lambda_j(\omega)$ and $x, y \in \Lambda_j(\omega)$,

$\left| \frac{(T^{R_\omega}_\omega)'x}{(T^{R_\omega}_\omega)'y} - 1 \right| \le D \beta^{-s(T^{R_\omega}_\omega(x),\, T^{R_\omega}_\omega(y))},$

where $s(x,y)$ is the smallest $n$ such that $(T^{R_\omega}_\omega)^n x$ and $(T^{R_\omega}_\omega)^n y$ lie in distinct elements of the partition.

(P3) There exists $M > 0$ such that $\sum_n m\{x \in \Lambda \mid R_\omega(x) > n\} \le M$ for all $\omega \in \Omega$. There exist constants $C, u, v > 0$, $a > 1$, $b \ge 0$, a full measure subset $\Omega' \subset \Omega$ and a random variable $n_0 : \Omega' \to \mathbb{N}$ so that

$m\{x \in \Lambda \mid R_\omega(x) > n\} \le C \frac{(\log n)^b}{n^a}$ whenever $n \ge n_0(\omega)$, and $P\{n_0(\omega) > n\} \le C e^{-u n^v}$, (1)

$\int m\{x \in \Lambda \mid R_\omega = n\} \, dP(\omega) \le C \frac{(\log n)^b}{n^{a+1}}.$ (2)

(P4) There are $N \in \mathbb{N}$ and $\{t_i \in \mathbb{Z}_+ \mid i = 1, 2, \ldots, N\}$ with $\mathrm{g.c.d.}\{t_i\} = 1$ and $\varepsilon_i > 0$ such that for all $\omega \in \Omega$ and $i = 1, 2, \ldots, N$ we have $m\{x \in \Lambda \mid R_\omega(x) = t_i\} > \varepsilon_i$.

Under the above assumptions it is proven in [5] that there exists a family of absolutely continuous equivariant measures [5, Theorem 4.1], which is mixing, and the mixing rates admit an upper bound of the form $n^{1+\delta-a}$ for any $\delta > 0$. Hence, to prove Theorem 1 it is sufficient to construct an induced map $T^{R_\omega}_\omega$ with the properties (P1)-(P4), which is carried out in the next section.

3 The induced map

Here we will construct a uniformly expanding full branch induced random map on $\Lambda = (0,1]$ for every $\omega \in \Omega$. Let $X_0(\omega) = 1$, $X_1(\omega) = x_{\alpha(\omega)}$ and

$X_n(\omega) = \left( T_\omega|_{[0,\, x_{\alpha(\omega)})} \right)^{-1} X_{n-1}(\sigma\omega) \quad \text{for } n \ge 2.$

Let $I_n(\omega) = (X_n(\omega), X_{n-1}(\omega)]$. Then by definition $T_\omega(I_n(\omega)) = I_{n-1}(\sigma\omega)$. By induction we have

$I_n(\omega) \xrightarrow{T_\omega} I_{n-1}(\sigma\omega) \xrightarrow{T_{\sigma\omega}} \cdots \to I_1(\sigma^{n-1}\omega) \xrightarrow{T_{\sigma^{n-1}\omega}} \Lambda.$

Hence, every interval $I_n(\omega)$ is first mapped onto $I_1(\sigma^{n-1}\omega)$ and then mapped onto $\Lambda$ by the next iterate. Define a return time $R_\omega : (0,1] \to \mathbb{N}$ by setting $R_\omega|_{(X_n(\omega),\, X_{n-1}(\omega)]} = n$. Then the induced full branch map $T^{R_\omega}_\omega : (0,1] \to (0,1]$ is defined as $T^{R_\omega}_\omega|_{I_n(\omega)} = T^n_\omega$ for $n \ge 1$. By assumptions (A1) and (A2) there exists $\beta > 1$ such that $(T^{R_\omega}_\omega)' > \beta$ for all $\omega \in \Omega$. In fact, we can choose

$\beta = \min_{\alpha \in [\alpha_0,\alpha_1]} \min_{x \in [x_\alpha,\, 1]} |T'_\alpha(x)|.$ (3)

This proves (P1). By (A1) all the maps in $\mathcal{F}$ have two full branches with $x_\alpha < 1$. Hence, the interval where $R_\omega = 1$ has length $1 - x_{\alpha(\omega)} \ge 1 - \max_\alpha x_\alpha > 0$, which implies (P4) with $N = 1$ and $t_1 = 1$.

Proposition 1. 1) For every $\omega \in \Omega$ the sequence $\{X_n(\omega)\}_n$ is decreasing and $\lim_{n\to\infty} X_n(\omega) = 0$. Moreover, there exists a constant $C > 0$ such that for all $\omega \in \Omega$

$C^{-1} n^{-1/\alpha_0} \le X_n(\omega) \le C n^{-1/\alpha_1}.$ (4)

2) There exist $C, u > 0$, $v \in (0,1)$ and a random variable $n_0 : \Omega \to \mathbb{N}$ which is finite for $P$-almost every $\omega \in \Omega$ such that

$P\{\omega \mid n_0(\omega) > n\} \le C e^{-u n^v},$ (5)

$X_n(\omega) \le C n^{-1/\alpha_0} (\log n)^{1/\alpha_0} \quad \text{for all } n \ge n_0(\omega),$ (6)

$\int (X_{n-1}(\omega) - X_n(\omega)) \, dP(\omega) \le C n^{-1-1/\alpha_0} (\log n)^{(1+\alpha_0)/\alpha_0}.$ (7)

Now we will prove (P3). For every $\omega \in \Omega$, by the definition of $R_\omega$ and inequality (4) we have

$m\{R_\omega > n\} = X_n(\omega) \le C n^{-1/\alpha_1}.$

Since $\alpha_1 < 1$ we have $\sum_n n^{-1/\alpha_1} < +\infty$, and hence there exists $M > 0$ such that $\sum_{n\ge 1} m\{R_\omega > n\} \le M$. Inequalities (5) and (6) in Proposition 1 directly imply the inequalities (1) in (P3), and (7) implies inequality (2) in (P3). It remains to show the distortion estimate (P2) for the induced map. Our proof is based on the Koebe principle. Recall that the Schwarzian derivative of a $C^3$ diffeomorphism $g$ is defined as

$Sg(x) = \frac{g'''(x)}{g'(x)} - \frac{3}{2}\left(\frac{g''(x)}{g'(x)}\right)^2.$

It can be easily checked that if $f$ and $g$ are two maps with $Sf \le 0$ and $Sg \le 0$, then $S(g \circ f) = (Sg) \circ f \cdot (f')^2 + Sf \le 0$; in particular, if $Sg < 0$ then $g \circ f$ has negative Schwarzian derivative. We will use this observation in the proof of Lemma 1.

Let $J \subset J'$ be two intervals and let $\tau > 0$. $J'$ is called a $\tau$-scaled neighbourhood of $J$ if both components of $J' \setminus J$ have length at least $\tau |J|$, where $|J|$ denotes the length of $J$. The Koebe principle [16, Chapter IV, Theorem 1.2] states that if $g$ is a diffeomorphism onto its image with $Sg < 0$ and $J \subset J'$ are two intervals such that $g(J')$ contains a $\tau$-scaled neighbourhood of $g(J)$, then there exists $\hat{K}(\tau)$ such that for any $x, y \in J$

$\left| \frac{g'(x)}{g'(y)} - 1 \right| \le \hat{K}(\tau) \frac{|x-y|}{|J|}.$ (8)

By applying the mean value theorem twice, first in $J$ and then in $(x,y) \subset J$, for any $x, y \in J$ we obtain

$\frac{|g(x)-g(y)|}{|g(J)|} = \frac{|g'(v)|}{|g'(u)|} \cdot \frac{|x-y|}{|J|}$

for some $u \in J$, $v \in (x,y)$. Now inequality (8) implies that $|g'(v)|/|g'(u)| \ge (1+\hat{K}(\tau))^{-1}$. Thus

$\left| \frac{g'(x)}{g'(y)} - 1 \right| \le K(\tau) \frac{|g(x)-g(y)|}{|g(J)|},$ (9)

for $K(\tau) = (1+\hat{K}(\tau))\hat{K}(\tau)$.

Recall that by (A4) the left branch of $T_\omega$ has negative Schwarzian derivative for all $\omega \in \Omega$. This fact will be used in the proof of the following lemma.

Lemma 1.
There exists $K > 0$ such that for all $\omega \in \Omega$, $n \in \mathbb{N}$ and $x, y \in I_n(\omega)$,

$\left| \frac{(T^n_\omega)'(x)}{(T^n_\omega)'(y)} - 1 \right| \le K |T^n_\omega(x) - T^n_\omega(y)|.$

Proof. Notice that $M := \max_{\alpha \in [\alpha_0,\alpha_1]} \max_{x \in I} |T''_\alpha(x)| < +\infty$ by (A5). Also recall that $T'_\omega|_{I_1(\omega)} \ge \beta > 1$ for all $T_\omega \in \mathcal{F}$. Thus for $n = 1$ we have

$\left| \frac{(T_\omega)'(x)}{(T_\omega)'(y)} - 1 \right| \le \beta^{-1} |(T_\omega)'(x) - (T_\omega)'(y)| \le M\beta^{-1}|x-y| \le M\beta^{-2} |T_\omega(x) - T_\omega(y)|.$

For $n \ge 2$ let $J = [X_n(\omega), X_{n-1}(\omega)]$ and $J' = [X_{n+1}(\omega), 1]$. We first extend $T_{\omega_{n-2}}, \ldots, T_{\omega_0}$ to $(0, +\infty)$, keeping the Schwarzian derivative non-positive.² Let $g = T_{\omega_{n-2}} \circ \cdots \circ T_{\omega_0}$. Then $g$ has negative Schwarzian derivative. We will show that $g(J')$ contains a $\tau$-scaled neighbourhood of $g(J)$ for some $\tau > 0$ which is independent of $\omega$. Since $g(X_n(\omega)) = X_1(\sigma^{n-1}\omega)$ and $g(X_{n+1}(\omega)) = X_2(\sigma^{n-1}\omega)$, it is sufficient to show that $X_1(\omega) - X_2(\omega)$ is bounded below by a constant independent of $\omega$. By the definition of $X_n$ we have

$|X_1(\omega) - X_2(\omega)| = |T_\omega^{-1}(1) - T_\omega^{-1} \circ T_{\sigma\omega}^{-1}(1)| \ge \beta' |1 - T_{\sigma\omega}^{-1}(1)| \ge \kappa > 0,$

where $\beta' = \min\{ (T_\alpha^{-1})'(x) \mid (x,\alpha) \in [\tilde{X}_2, 1] \times [\alpha_0,\alpha_1] \} > 0$ with $\tilde{X}_2 = \min_\omega X_2(\omega)$, and $\kappa = \beta'(1 - \max_\alpha x_\alpha) > 0$. Moreover, $|g(J)| = 1 - x_{\alpha(\sigma^{n-1}\omega)} \ge 1 - \max_\alpha x_\alpha > 0$, so $g(J')$ contains a $\tau$-scaled neighbourhood of $g(J)$, and (9) gives

$\left| \frac{g'(x)}{g'(y)} - 1 \right| \le K |g(x) - g(y)|$

with $K = K(\tau)/(1 - \max_\alpha x_\alpha)$. Since $T^n_\omega = T_{\sigma^{n-1}\omega} \circ g$, with the last map handled as in the case $n = 1$, this finishes the proof. □

² Such extensions can be constructed easily. For example, for $f \in \mathcal{F}$ it suffices to take $\tilde{f}(x) = a(x - x_\alpha)^3 + b(x - x_\alpha)^2 + c(x - x_\alpha) + d$ for $x > x_\alpha$, where $d$, $c$, $b$ match the Taylor coefficients of $f$ at $x = x_\alpha$ and $a > 0$ is chosen so small that $3ac \le b^2$, which makes the Schwarzian derivative of the cubic non-positive.

Lemma 2.
There exists a constant $C > 0$ independent of $\omega$ such that for all $\omega \in \Omega$, $n \in \mathbb{N}$ and any $x, y \in I_n(\omega)$,

$\left| \log \frac{(T^{R_\omega}_\omega)'(x)}{(T^{R_\omega}_\omega)'(y)} \right| \le C |T^{R_\omega}_\omega(x) - T^{R_\omega}_\omega(y)|.$

Proof.
From now on we suppress the $\omega$ in $R_\omega$, since no confusion arises. Note that $T^R_\omega$ is the composition of the right branch of $T_{\sigma^{R-1}\omega}$ and $g$, i.e. $T^R_\omega(x) = T_{\sigma^{R-1}\omega} \circ g(x)$. Therefore, by the definition of $\beta$ in (3) and by Lemma 1 we have

$\log \left| \frac{(T^R_\omega)'(x)}{(T^R_\omega)'(y)} \right| \le K|T^R_\omega(x) - T^R_\omega(y)| + K|g(x) - g(y)| \le K(1+\beta^{-1}) |T^R_\omega(x) - T^R_\omega(y)|. \qquad \Box$

Now we will prove (P2). Together with the elementary inequality $|x - 1| \le C|\log x|$ (for some $C > 0$, whenever $|\log x|$ is bounded above), Lemma 2 implies that for any $x, y \in I_n(\omega)$ we have

$\left| \frac{(T^R_\omega)'x}{(T^R_\omega)'y} - 1 \right| \le D(K,\beta)\, |T^R_\omega(x) - T^R_\omega(y)| \le D \beta^{-s(T^R_\omega(x),\, T^R_\omega(y))},$

where $D = D(K,\beta)$ is a constant that depends only on $K$ and $\beta$, and the last inequality follows from the observation: if $x, y \in (0,1]$ are such that $s(x,y) = n$, then $|x - y| \le \beta^{-n}$. Indeed, by definition $(T^R_\omega)^i(x)$ and $(T^R_\omega)^i(y)$ belong to the same element of the partition $\{I_k(\omega)\}$ for all $i = 0, \ldots, n-1$. Thus by the mean value theorem

$|x - y| = |[(T^R_\omega)^n]'(\xi)|^{-1}\, |(T^R_\omega)^n(x) - (T^R_\omega)^n(y)| \le \beta^{-n}.$

4 Proof of Proposition 1

We start by proving an auxiliary lemma, which is used in the proof.
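Before that, a numerical aside: for a single deterministic member of the family (an LSV map with parameter $\alpha$, chosen here purely for illustration), the sequence $X_n$ is obtained by iterating the inverse of the left branch, and the asymptotics $X_n \approx \mathrm{const}\cdot n^{-1/\alpha}$ asserted in Proposition 1 can be observed directly (all helper names are ours):

```python
def lsv_left(alpha, x):
    # Left branch of the LSV map: x + 2^alpha * x^(1+alpha) on [0, 1/2).
    return x * (1.0 + (2.0 * x) ** alpha)

def left_inverse(alpha, y):
    # Invert the increasing left branch by bisection on [0, 1/2].
    lo, hi = 0.0, 0.5
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if lsv_left(alpha, mid) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def tail_sequence(alpha, n):
    # X_1 = x_alpha = 1/2 and X_k = (left branch)^{-1}(X_{k-1});
    # in this deterministic case m{R > k} = X_k.
    xs = [0.5]
    for _ in range(n - 1):
        xs.append(left_inverse(alpha, xs[-1]))
    return xs

alpha = 0.5
xs = tail_sequence(alpha, 2000)
# X_n should behave like const * n^(-1/alpha), cf. (4) in Proposition 1:
ratio = xs[-1] / 2000.0 ** (-1.0 / alpha)
```

For $\alpha = 1/2$ the normalised ratio stabilises near a constant (about $2$ for this branch), consistent with the polynomial tail $n^{-1/\alpha}$.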
Lemma 3.
For any $k \in \mathbb{N}$, $c \ge 1$ and $t > 0$ we have

$\mathbb{E}_P\left[ e^{-(c\,\alpha(\sigma^k\omega) - \alpha_0)t} \right] = \frac{e^{\alpha_0 t(1-c)}}{(\alpha_1 - \alpha_0)\, c\, t} \left( 1 - e^{-ct(\alpha_1-\alpha_0)} \right).$

Proof.
Since $\sigma$ preserves $P$ we have

$\mathbb{E}_P\left[ e^{-(c\,\alpha(\sigma^k\omega) - \alpha_0)t} \right] = \mathbb{E}_P\left[ e^{-(c\,\alpha(\omega) - \alpha_0)t} \right] = \frac{1}{\alpha_1 - \alpha_0} \int_{\alpha_0}^{\alpha_1} e^{-(cx - \alpha_0)t} \, dx = \frac{e^{\alpha_0 t(1-c)}}{(\alpha_1-\alpha_0)\, c\, t} \left( 1 - e^{-ct(\alpha_1-\alpha_0)} \right). \qquad \Box$

Proof.
Now we are ready to prove Proposition 1. First we prove item 1). The first two assertions are obvious, since $T'_\alpha(x) > 1$ for every $x > 0$. Since all the maps in $\mathcal{F}$ are uniformly expanding away from $0$, there exists $n_2 \in \mathbb{N}$ independent of $\omega$ such that $X_n(\omega) \in (0, \varepsilon)$ for all $n \ge n_2$. Thus it is sufficient to prove inequality (4) for $n \ge n_2$. We now define a sequence $\{Z_n\}_n$ which bounds $X_n(\omega)$ from below and has the desired asymptotics. Let $K = [0,\varepsilon] \times [\alpha_0,\alpha_1]$ and $C_1 = \max_{(x,\alpha)\in K} c_\alpha(1+f_\alpha(x))$. Set $G(x) = x(1 + C_1 x^{\alpha_0})$. Define $\{Z_n\}_{n \ge n_2}$ as follows: $Z_{n_2} = \min_{\omega\in\Omega} X_{n_2}(\omega)$ and $Z_n = (G|_{[0,\varepsilon]})^{-1}(Z_{n-1})$ for $n > n_2$. Since $G(x) \ge T_{\alpha(\omega)}(x)$ for any $x \in [0,\varepsilon]$ and any $\omega \in \Omega$, one can easily verify by induction that $Z_n \le X_n(\omega)$ for $n \ge n_2$. Finally, note that $Z_n \sim n^{-1/\alpha_0}$ [8]. Defining $C_2 = \min_{(x,\alpha)\in K} c_\alpha(1+f_\alpha(x))$, $G_2(x) = x(1 + C_2 x^{\alpha_1})$, $Z'_{n_2} = \max_{\omega\in\Omega} X_{n_2}(\omega)$ and $Z'_n = (G_2|_{[0,\varepsilon]})^{-1}(Z'_{n-1})$ for $n > n_2$, we obtain a sequence $\{Z'_n\}$ such that $X_n(\omega) \le Z'_n$ and $Z'_n \sim n^{-1/\alpha_1}$. This finishes the proof of item 1).

Item 2) is proved below. Note that by the choice of $n_2$, for any $n \ge n_2$ we have

$X_n(\sigma\omega) = X_{n+1}(\omega)\left[ 1 + c_{\alpha(\omega)} X_{n+1}(\omega)^{\alpha(\omega)} \left( 1 + f_{\alpha(\omega)}(X_{n+1}(\omega)) \right) \right].$ (10)

The latter equality, together with the standard estimate $(1+x)^{-a} \le 1 - ax + a(a+1)x^2$ for $x, a > 0$, implies

$X_{n+1}(\omega)^{-\alpha_0} - X_n(\sigma\omega)^{-\alpha_0} \ge C_3\, X_{n+1}(\omega)^{\alpha(\omega)-\alpha_0} - C_4\, X_{n+1}(\omega)^{2\alpha(\omega)-\alpha_0},$

where $C_3 = \alpha_0 \min_{(x,\alpha)\in K} c_\alpha(1+f_\alpha(x))$ and $C_4 = \alpha_0(\alpha_0+1) \max_{(x,\alpha)\in K} [c_\alpha(1+f_\alpha(x))]^2$. Hence, summing along the orbit,

$\frac{1}{X_n(\omega)^{\alpha_0}} \ge \frac{1}{x_{\alpha(\sigma^{n-1}\omega)}^{\alpha_0}} + C_3 \sum_{k} X_k(\sigma^{n-k}\omega)^{\alpha(\sigma^{n-k}\omega)-\alpha_0} - C_4 \sum_{k} X_k(\sigma^{n-k}\omega)^{2\alpha(\sigma^{n-k}\omega)-\alpha_0}.$

Notice that we can take $C_3$ and $C_4$ independent of $\omega$.
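As a quick numerical sanity check of the closed form in Lemma 3 (an illustration with our own helper names, not part of the proof), it can be compared against a direct Monte Carlo average over $\alpha \sim \mathrm{Uniform}[\alpha_0, \alpha_1]$:

```python
import math
import random

def lemma3_rhs(a0, a1, c, t):
    # Closed form of E[exp(-(c*alpha - a0)*t)] for alpha ~ Uniform[a0, a1].
    return (math.exp(a0 * t * (1.0 - c))
            * (1.0 - math.exp(-c * t * (a1 - a0)))
            / ((a1 - a0) * c * t))

def lemma3_mc(a0, a1, c, t, n=200000, seed=1):
    # Direct Monte Carlo average of the same expectation.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        total += math.exp(-(c * rng.uniform(a0, a1) - a0) * t)
    return total / n

a0, a1 = 0.2, 0.8
exact = lemma3_rhs(a0, a1, c=1.0, t=2.0)
approx = lemma3_mc(a0, a1, c=1.0, t=2.0)
```

With these sample parameters the two values agree to within the Monte Carlo error.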
Therefore, by inequality (4) we have

$\frac{1}{X_n(\omega)^{\alpha_0}} \ge 1 + C_5 \sum_{k=1}^{n} \left(k^{1/\alpha_0}\right)^{\alpha_0 - \alpha(\sigma^{n-k}\omega)} - C_6 \sum_{k=1}^{n} \left(k^{1/\alpha_1}\right)^{\alpha_0 - 2\alpha(\sigma^{n-k}\omega)}.$ (11)

First we will show that the right hand side of the latter inequality on average behaves like $n/\log n$ as $n$ goes to infinity. We set

$a_k := \left(k^{1/\alpha_0}\right)^{\alpha_0 - \alpha(\sigma^{n-k}\omega)}, \qquad b_k := \left(k^{1/\alpha_1}\right)^{\alpha_0 - 2\alpha(\sigma^{n-k}\omega)}, \qquad S_n = \sum_{k=1}^{n} \left( C_5 a_k - C_6 b_k \right).$

Lemma 4.
There exists $C_7 > 0$ such that

$\lim_{n\to\infty} \frac{\log n}{n}\, \mathbb{E}_P(S_n) = C_7.$

Proof.
Applying Lemma 3 to $\mathbb{E}_P(a_k)$ with $c = 1$ and $t = \log(k^{1/\alpha_0}) = (\log k)/\alpha_0$, and using the fact that $\sum_{k\le n} 1/\log k \sim n/\log n$, we obtain

$\sum_{k=2}^{n} \mathbb{E}_P(a_k) = \frac{\alpha_0}{\alpha_1-\alpha_0} \sum_{k=2}^{n} \frac{1}{\log k}\left(1 - k^{-(\alpha_1-\alpha_0)/\alpha_0}\right) = \frac{\alpha_0}{\alpha_1-\alpha_0} \cdot \frac{n}{\log n}\,(1 + o(1)),$

and hence

$\frac{\log n}{n} \sum_{k=2}^{n} \mathbb{E}_P(a_k) = \frac{\alpha_0}{\alpha_1-\alpha_0} + o(1).$ (12)

Similarly, applying Lemma 3 to $\mathbb{E}_P(b_k)$ with $c = 2$ and $t = (\log k)/\alpha_1$, we obtain

$\sum_{k=2}^{n} \mathbb{E}_P(b_k) = \frac{\alpha_1}{2(\alpha_1-\alpha_0)} \sum_{k=2}^{n} \frac{k^{-\alpha_0/\alpha_1}}{\log k}\left(1 - k^{-2(\alpha_1-\alpha_0)/\alpha_1}\right) = O\!\left(\frac{n^{1-\alpha_0/\alpha_1}}{\log n}\right),$

and hence

$\lim_{n\to\infty} \frac{\log n}{n} \sum_{k=2}^{n} \mathbb{E}_P(b_k) = 0.$ (13)

Combining (12) and (13) implies

$\lim_{n\to\infty} \frac{\log n}{n}\, \mathbb{E}_P(S_n) = \lim_{n\to\infty} \frac{\log n}{n} \sum_{k=1}^{n} \mathbb{E}_P(C_5 a_k - C_6 b_k) = C_7,$

where $C_7 = C_5\alpha_0/(\alpha_1-\alpha_0)$. $\Box$

Now we construct a random variable $n_0 : \Omega \to \mathbb{N}$ as in item 2) of Proposition 1. Lemma 4 implies that there exists $N_0$ independent of $\omega$ such that

$\frac{C_7}{2} \le \frac{\log n}{n}\, \mathbb{E}_P(S_n) \le 2C_7 \quad \text{for all } n \ge N_0.$

On the other hand, since the summands of $S_n$ are independent and bounded, by [11, Theorem 1] there exists $C > 0$ such that for any $t > 0$ and $n \in \mathbb{N}$ we have

$P\left\{ \frac{\log n}{n} |S_n - \mathbb{E}_P(S_n)| \ge t \right\} \le e^{-Cnt^2/(\log n)^2}.$ (14)

Thus, by letting $C_8 = C C_7^2/16$ we obtain

$P\left\{ \frac{\log n}{n} S_n < \frac{C_7}{4} \right\} \le P\left\{ \frac{\log n}{n}(S_n - \mathbb{E}_P S_n) < -\frac{C_7}{4} \right\} \le e^{-C_8 n/(\log n)^2}.$ (15)

Define $n_0(\omega) = \inf\{ n \ge N_0 \mid \forall k \ge n:\ (\log k/k)\, S_k \ge C_7/4 \}$. Inequality (15) implies that

$P\{ n_0(\omega) > n \} \le \sum_{k=n}^{\infty} e^{-C_8 k/(\log k)^2} \le C e^{-u n^v}$

for some $C > 0$, $u > 0$, $v \in (0,1)$, which proves inequality (5). For any $n \ge n_0(\omega)$, by (11) we have $X_n(\omega)^{\alpha_0} \le \frac{4}{C_7} \cdot \frac{\log n}{n}$. Hence, for some positive $C$,

$X_n(\omega) \le C \left( \frac{\log n}{n} \right)^{1/\alpha_0}.$

This finishes the proof of (6). It remains to prove (7). Recall that there exists $n_1 \in \mathbb{N}$, which depends only on $\varepsilon$ in (A3), such that (10) holds for all $n \ge n_1$. Since $m\{R_\omega = n\} = X_{n-1}(\omega) - X_n(\omega)$ and, recalling that $\sigma$ preserves $P$, $\int X_{n-1}(\sigma\omega)\,dP(\omega) = \int X_{n-1}(\omega)\,dP(\omega)$, we have, using (10) with $n$ replaced by $n-1$ on the set $\{n_0(\omega) \le n\}$ and (6),

$\int m\{R_\omega = n\} \, dP(\omega) = \int (X_{n-1}(\omega) - X_n(\omega)) \, dP(\omega) = \int (X_{n-1}(\sigma\omega) - X_n(\omega)) \, dP(\omega)$
$= \int_{\{n_0(\omega) > n\}} (X_{n-1}(\sigma\omega) - X_n(\omega)) \, dP(\omega) + \int_{\{n_0(\omega) \le n\}} (X_{n-1}(\sigma\omega) - X_n(\omega)) \, dP(\omega)$
$\le C e^{-un^v} + \int_{\{n_0(\omega) \le n\}} c_{\alpha(\omega)}\, X_n(\omega)^{\alpha(\omega)+1} \left(1 + f_{\alpha(\omega)}(X_n(\omega))\right) dP(\omega)$
$\le C e^{-un^v} + C \int \left( \frac{\log n}{n} \right)^{(\alpha(\omega)+1)/\alpha_0} dP(\omega) \le C \left( \frac{\log n}{n} \right)^{(\alpha_0+1)/\alpha_0}.$

This finishes the proof for all $n \ge n_1$. For $n < n_1$ the assertion follows by increasing the constant $C$ if necessary.

Acknowledgements
This research was supported by The Leverhulme Trust through the research grant RPG-2015-346. The author would like to thank Wael Bahsoun for useful discussions during the preparation of the paper.
References
1. Aimino R., Hu H., Nicol M., Török A., Vaienti S.: Polynomial loss of memory for maps of the interval with a neutral fixed point. Discrete Contin. Dyn. Syst., no. 3, 793–806 (2015)
2. Ayyer A., Liverani C., Stenlund M.: Quenched CLT for random toral automorphisms. Discrete Contin. Dyn. Syst., no. 2, 331–348 (2009)
3. Bahsoun W., Bose C.: Mixing rates and limit theorems for random intermittent maps. Nonlinearity, no. 4, 1417–1433 (2016)
4. Bahsoun W., Bose C., Duan Y.: Decay of correlation for random intermittent maps. Nonlinearity, no. 7, 1543–1554 (2014)
5. Bahsoun W., Bose C., Ruziboev M.: Quenched decay of correlations for slowly mixing systems. Available via https://arxiv.org/abs/1706.04158 Cited 30 Jan 2018
6. Baladi V., Benedicks M., Maume-Deschamps V.: Almost sure rates of mixing for i.i.d. unimodal maps. Ann. Sci. École Norm. Sup. (4), no. 1, 77–126 (2002)
7. Buzzi J.: Exponential decay of correlations for random Lasota-Yorke maps. Comm. Math. Phys., no. 1, 25–54 (1999)
8. Gouëzel S.: Sharp polynomial estimates for the decay of correlations. Israel J. Math., 29–65 (2004)
9. Haydn N., Nicol M., Török A., Vaienti S.: Almost sure invariance principle for sequential and non-stationary dynamical systems. Trans. Amer. Math. Soc., no. 8, 5293–5316 (2017)
10. Haydn N., Rousseau J., Yang F.: Exponential law for random maps on compact manifolds. Available via https://arxiv.org/abs/1705.05869 Cited 16 May 2017
11. Hoeffding W.: Probability inequalities for sums of bounded random variables. J. Amer. Statist. Assoc. 58, 13–30 (1963)
12. Keller G.: Generalized bounded variation and applications to piecewise monotonic transformations. Z. Wahr. Verw. Geb., 461–478 (1985)
13. Kifer Y.: Limit theorems for random transformations and processes in random environments. Trans. Amer. Math. Soc., no. 4, 1481–1518 (1998)
14. Leppänen J., Stenlund M.: Quasistatic dynamics with intermittency. Math. Phys. Anal. Geom., no. 2, Art. 8, 23 pp. (2016)
15. Liverani C., Saussol B., Vaienti S.: A probabilistic approach to intermittency. Ergodic Theory Dynam. Systems, 671–685 (1999)
16. de Melo W., van Strien S.: One-Dimensional Dynamics. Springer-Verlag, Berlin (1993)
17. Nicol M., Török A., Vaienti S.: Central limit theorems for sequential and random intermittent dynamical systems. Ergodic Theory Dynam. Systems. doi: https://doi.org/10.1017/etds.2016.69
18. Pomeau Y., Manneville P.: Intermittent transition to turbulence in dissipative dynamical systems. Comm. Math. Phys., 189–197 (1980)
19. Saussol B.: Absolutely continuous invariant measures for multidimensional expanding maps. Israel J. Math., 223–248 (2000)
20. Young L.-S.: Recurrence times and rates of mixing. Israel J. Math. 110, 153–188 (1999)