BOUNDARY ENTROPY SPECTRA AS FINITE SUBSUMS
HANNA OPPELMAYER
Abstract.
In this paper we provide a concrete construction of Furstenberg entropy values of τ-boundaries of the group $\mathbb{Z}\big[\tfrac{1}{p_1},\dots,\tfrac{1}{p_l}\big] \rtimes \{p_1^{n_1}\cdots p_l^{n_l} : n_i \in \mathbb{Z}\}$ by choosing an appropriate random walk τ. We show that the boundary entropy spectrum can be realized as the subsum set of any given finite sequence of positive numbers.

Contents
1. Introduction
1.1. Organization of the paper and ingredients of the proof of Theorem 1.1
1.2. Acknowledgments
2. Preliminaries
3. Entropy observations
3.1. Absolutely continuous measures provide equal information “on average”
3.2. An entropy formula for certain measures
4. An application
4.1. A construction of absorbing measures to design entropy
4.2. Proof of Theorem 1.1
References
1. Introduction

A random walk on a countably infinite discrete group Γ is given by a probability measure τ on Γ whose support generates the group as a semigroup. In 1963 Furstenberg introduced in [6] so-called boundaries for random walks. These are τ-stationary standard probability spaces (Y, η) on which Γ acts measurably such that for $\tau^{\otimes\mathbb{N}}$-almost every sequence $(\gamma_i)_{i\in\mathbb{N}}\in\Gamma^{\mathbb{N}}$ the image measures $\gamma_1\cdots\gamma_n\eta$ converge on a compact model to a Dirac measure. To quantify these spaces, Furstenberg introduced a notion of entropy defined as
\[ h_\tau(Y,\eta) := \int_\Gamma \int_Y -\log\Big(\frac{d\gamma^{-1}\eta}{d\eta}(y)\Big)\, d\eta(y)\, d\tau(\gamma). \]
The boundary entropy spectrum and the entropy spectrum of (Γ, τ) are defined as
\[ \mathrm{BndEnt}(\Gamma,\tau) := \{\, h_\tau(Y,\eta) : (Y,\eta)\ \text{a }\tau\text{-boundary} \,\} \]
and
\[ \mathrm{Ent}(\Gamma,\tau) := \{\, h_\tau(Y,\eta) : (Y,\eta)\ \text{a }\tau\text{-stationary, ergodic probability }\Gamma\text{-space} \,\}. \]
The latter has been intensively studied; e.g. Lewis Bowen proved in [3] that the free group $F_r$, $r \ge 2$, admits a generating measure υ with a full entropy realization, i.e. $\mathrm{Ent}(F_r,\upsilon) = [0, h(F_r,\upsilon)]$, where
\[ h(\Gamma,\tau) := \sum_{\gamma\in\Gamma} -\log(\tau(\gamma))\,\tau(\gamma) \]
denotes the random walk entropy. The random walk entropy is always a natural upper bound for the entropy spectrum. Nevo showed in [10] that an infinite countable group with property (T) will always have a so-called entropy gap for any generating probability measure on the group, that is, there is ε > 0 such that every ergodic stationary space with entropy smaller than ε has entropy zero. Examples of groups without an entropy gap are for instance finitely generated virtually free groups with generating probability measures which have a finite first moment, shown by Hartman-Tamuz in [7]. The question about the shape of Ent(Γ, τ) is still widely open for many measured groups.

Here we shall investigate the shape of BndEnt(Γ, τ) for a concrete group Γ, where we choose different measures τ to design various boundary entropy spectra. Note that BndEnt(Γ, τ) ⊆ Ent(Γ, τ). In our previous work [2], joint with Michael Björklund and Yair Hartman, we explicitly constructed random walks on a certain class of groups such that their boundary entropy spectra realize any given subsum sets. In particular, we constructed two measures τ_1 and τ_2 on the same group Γ such that $\mathrm{BndEnt}(\Gamma,\tau_1) = [0, h_{\tau_1}(\mathrm{Poi}(\Gamma,\tau_1))]$ while BndEnt(Γ, τ_2) is a Cantor set. The groups considered there were countably infinite direct sums of certain countable groups. In this short note we consider a group which is not a countably infinite direct sum and prove a certain subsum realization for the boundary entropy spectrum using slightly different methods.

Given finitely many distinct primes p_1, …, p_l, we study the group
\[ \Gamma := \mathbb{Z}\Big[\tfrac{1}{p_1},\dots,\tfrac{1}{p_l}\Big] \rtimes \{\, p_1^{n_1}\cdots p_l^{n_l} : n_i\in\mathbb{Z} \,\}. \]

Mathematics Subject Classification. Primary: 37A50; Secondary: 05C81, 58J51.
Key words and phrases. Random walks on groups, boundary entropy spectrum.

Theorem 1.1.
Let β_1, …, β_l be positive real numbers. Then there exists a finitely supported generating probability measure τ on Γ such that
\[ \mathrm{BndEnt}(\Gamma,\tau) = \Big\{\, \sum_{j\in J}\beta_j : J\subseteq\{1,\dots,l\} \,\Big\}. \]

1.1. Organization of the paper and ingredients of the proof of Theorem 1.1
In Section 3 we express the entropy of probability measures which are absolutely continuous w.r.t. σ-finite product measures as sums of averaged information functions of the single measures of the product. Our proof uses a result by Anosov in [1]. Further, in Corollary 4.5 we will see that the boundaries in question meet the requirements for these entropy formulae. This is done by passing to a homogeneous setting of a certain locally compact, totally disconnected measured group by techniques we have developed in our previous work [2] with Michael Björklund and Yair Hartman. We will apply a theorem by Brofferio in [4], stating that the “maximal” boundary, the so-called Poisson boundary Poi(Γ, τ), of a finitely generated subgroup Γ of the affine group $\mathbb{Q}\rtimes\mathbb{Q}^*$ equipped with a finitely supported, generating probability measure τ can be realized as
\[ \mathrm{Poi}(\Gamma,\tau) = \Big( \prod_{\varphi_p(\tau) < 0} \mathbb{Q}_p,\ \nu \Big), \]
where ν is the unique τ-stationary probability measure on that space and φ_p(τ) denotes the p-drift for τ (see Section 4 for a definition), the product running over all primes p including infinity (with $\mathbb{Q}_\infty := \mathbb{R}$). Note that the above product could be empty as well (e.g. if τ is symmetric), in which case we would see the one-point space with the Dirac measure.
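For intuition, the subsum sets that Theorem 1.1 realizes as boundary entropy spectra are easy to enumerate. The following snippet is an illustration of my own (not part of the paper); the function name `subsum_set` and the sample values are hypothetical:

```python
from itertools import combinations

def subsum_set(betas):
    """All subsums sum_{j in J} beta_j over index subsets J, as in Theorem 1.1."""
    return sorted({sum(J) for r in range(len(betas) + 1)
                   for J in combinations(betas, r)})

# For beta = (1, 2, 4) all 2^3 = 8 subset sums are distinct:
print(subsum_set([1, 2, 4]))  # → [0, 1, 2, 3, 4, 5, 6, 7]
```

Note that repeated values collapse, e.g. β = (1, 1) yields only {0, 1, 2}; the theorem makes no disjointness assumption on the subsums.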
Acknowledgments
I am very grateful to my supervisor Michael Björklund for pointing out Anosov's statement in Lemma 3.2 and sharing many ideas which are crucial for the proofs in this paper. Moreover I would like to thank Yair Hartman for supporting me in many interesting discussions.

2. Preliminaries
Let G be a locally compact, second countable group and fix a left Haar measure m_G on G. Throughout, let µ be a probability measure on the Borel σ-algebra of G which is admissible, that is, it is generating in the sense that the semigroup generated by the support of µ is dense in G, and it is spread-out, meaning that some convolution power of µ is absolutely continuous w.r.t. m_G. A standard probability space (X, ν) on which G acts jointly Borel measurably is called a G-space. The action is non-singular if gν and ν have the same null-sets for every g ∈ G. For these kinds of actions Furstenberg [6] defined a notion of entropy
\[ h_\mu(X,\nu) = \int_G\int_X -\log\Big(\frac{dg^{-1}\nu}{d\nu}(x)\Big)\,d\nu(x)\,d\mu(g), \]
provided that the information function
\[ I_\nu : (g,x) \mapsto -\log\Big(\frac{dg^{-1}\nu}{d\nu}(x)\Big) \]
lies in $L^1(G\times X, \mu\otimes\nu)$. For our purposes we may extend the definition of the information function to σ-finite measures on X.

A measure ν on a measurable G-space (X, 𝓑) is called µ-stationary if ν = µ ∗ ν, i.e.
\[ \nu(A) = \int_G \nu(g^{-1}A)\, d\mu(g) \]
for all A ∈ 𝓑. Note that the assumption on (G, µ) to be admissible implies that every µ-stationary measure is in particular non-singular (see e.g. [11, Lemma 1.2]). Moreover, µ-stationary probability measures give rise to measure-preserving transformations on the following product space.

Lemma 2.1 ([5, Proposition 1.3]). Let ν be a µ-stationary probability measure on X. Then $T : G^{\mathbb{N}}\times X \to G^{\mathbb{N}}\times X$ given by $T((g_n)_{n\ge 1}, x) := ((g_n)_{n\ge 2}, g_1 x)$ is a measurable map which preserves the measure $\mu^{\otimes\mathbb{N}}\otimes\nu$.

The above introduced notion of entropy is of particular interest for the following spaces.
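As a quick toy illustration of the stationarity equation ν = µ ∗ ν (a finite example assumed by me, not from the paper): for G = Z/3Z acting on itself by translation, the uniform measure is stationary for any µ.

```python
from fractions import Fraction

# Toy check of stationarity nu = mu * nu, i.e. nu(A) = sum_g mu(g) nu(g^{-1}A),
# for G = Z/3Z acting on X = Z/3Z by translation.  The uniform measure is
# stationary for ANY probability measure mu on G.
n = 3
mu = {0: Fraction(1, 2), 1: Fraction(1, 3), 2: Fraction(1, 6)}
nu = {x: Fraction(1, n) for x in range(n)}

def convolved(mu, nu):
    """(mu * nu)(x) = sum_g mu(g) nu(g^{-1} x) = sum_g mu(g) nu(x - g mod n)."""
    return {x: sum(mu[g] * nu[(x - g) % n] for g in mu) for x in range(n)}

print(convolved(mu, nu) == nu)  # → True
```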
Definition 2.2.
A compact probability space (X, ν) on which G acts continuously with ν being µ-stationary is called a compact µ-boundary if
\[ g_1\cdots g_n\,\nu \to \delta_{x((g_i)_{i\ge 1})} \]
for $\mu^{\otimes\mathbb{N}}$-almost every $(g_i)_{i\ge 1}\in G^{\mathbb{N}}$ in the weak*-topology, with $x((g_i)_{i\ge 1})\in X$. A probability space which is measure-theoretically G-equivariantly isomorphic to a compact µ-boundary is called a µ-boundary. The Poisson boundary is the maximal such boundary in the sense that every other µ-boundary is a G-factor of it. (Its existence is provided by a result of Furstenberg, see e.g. [6].) We will write Poi(G, µ) to denote this space.
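The collapse of image measures to a point mass can be watched in a hypothetical simulation (my own illustration, not from the paper): random compositions of the two contractions x ↦ x/2 and x ↦ (x+1)/2 on [0, 1] squeeze any spread-out sample of points into an interval of length 2^{-n} after n steps.

```python
import random

# Illustrative simulation: compositions of n contractions among
# x -> x/2 and x -> (x+1)/2 map [0,1] into an interval of length 2^{-n},
# so the image of any measure on [0,1] collapses towards a Dirac mass.
random.seed(1)
maps = [lambda x: x / 2, lambda x: (x + 1) / 2]
points = [i / 10 for i in range(11)]   # a spread-out sample of the space
for _ in range(40):                    # forty random contraction steps
    g = random.choice(maps)
    points = [g(x) for x in points]
spread = max(points) - min(points)
print(spread < 2 ** -39)  # → True
```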
3. Entropy observations
3.1. Absolutely continuous measures provide equal information “on average”
Let G be a locally compact, second countable group equipped with an admissible probability measure µ, and let (X, ν) be a G-space.

Proposition 3.1.
Let ν be a µ-stationary probability measure on X and let ξ be a non-singular σ-finite measure on X. If ν ≪ ξ and if I_ν and I_ξ are both in $L^1(\mu\otimes\nu)$, then
\[ \int_G\int_X I_\nu(g,x)\,d\nu(x)\,d\mu(g) = \int_G\int_X I_\xi(g,x)\,d\nu(x)\,d\mu(g). \]

The proof of the above Proposition 3.1 relies on an old observation by Anosov [1] and will be provided at the end of this subsection.
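A finite toy model (assumed by me, not the proposition's setting verbatim) makes the statement concrete: for G = Z/3Z rotating X = Z/3Z, the uniform measure ν is stationary, and the averaged information of ν and of any equivalent measure ξ coincide (both integrals telescope to zero).

```python
import math

# Finite toy check of Proposition 3.1: G = Z/3 acting on X = Z/3 by rotation.
# The uniform measure nu is stationary; xi is an arbitrary equivalent measure.
# Both averaged information integrals come out equal (here: zero).
n = 3
mu = {0: 0.5, 1: 0.3, 2: 0.2}
nu = {x: 1 / n for x in range(n)}
xi = {0: 0.6, 1: 0.3, 2: 0.1}

def info(lam, g, x):
    """I_lambda(g, x) = -log of the density of g^{-1}lambda w.r.t. lambda at x."""
    return -math.log(lam[(x + g) % n] / lam[x])

def averaged_info(lam):
    return sum(mu[g] * sum(nu[x] * info(lam, g, x) for x in range(n)) for g in mu)

print(abs(averaged_info(nu) - averaged_info(xi)) < 1e-12)  # → True
```

The cyclic sum of log-ratios vanishes for every rotation, which is the discrete shadow of the telescoping argument in the proof.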
Lemma 3.2 ([1, Theorem 1]). Let (Z, 𝒵, ζ) be a probability space and T : Z → Z be a measure-preserving transformation on Z, i.e. $\zeta\circ T^{-1}=\zeta$. Given a measurable function f : Z → ℝ such that f − f∘T lies in $L^1(Z,\zeta)$, then
\[ \int_Z (f - f\circ T)(z)\, d\zeta(z) = 0. \]

Proof. Let us denote h := f∘T − f. By Birkhoff's Ergodic Theorem, for ζ-a.e. z ∈ Z,
\[ \mathbb{E}[h\,|\,\mathcal{I}(T)](z) = \lim_{n\to\infty}\frac{1}{n}\sum_{k=0}^{n-1} h(T^k(z)) = \lim_{n\to\infty}\frac{1}{n}\big(f(T^n(z)) - f(z)\big) = \lim_{n\to\infty}\frac{1}{n}\, f(T^n(z)), \]
where 𝓘(T) denotes the σ-algebra of T-invariant sets in 𝒵. Moreover, for every ε > 0,
\[ \zeta\Big(\Big\{ z\in Z : \Big|\frac{f(T^n(z))}{n}\Big| > \epsilon \Big\}\Big) = \zeta\Big(\Big\{ z\in Z : \Big|\frac{f(z)}{n}\Big| > \epsilon \Big\}\Big) \to 0 \]
by T-invariance of ζ. Hence (by using Borel-Cantelli) there exists a subsequence $(n_l)_{l\in\mathbb{N}}$ such that
\[ \frac{f(T^{n_l}(z))}{n_l} \to 0 \quad \text{as } l\to\infty, \]
for ζ-a.e. z ∈ Z. Thus $0 = \mathbb{E}[h\,|\,\mathcal{I}(T)]$ and in particular
\[ 0 = \int_Z h\, d\zeta \]
since $\mathbb{E}[h\,|\,\mathcal{I}(T)]\in L^1(Z,\zeta)$. □

Corollary 3.3.
Let ν be a µ-stationary probability measure on X. For a measurable function u : X → ℝ such that f(g, x) := u(x) − u(gx) belongs to $L^1(G\times X, \mu\otimes\nu)$ we have
\[ \int_G\int_X f(g,x)\,d\nu(x)\,d\mu(g) = 0. \]

Proof. Since ν is µ-stationary, by Lemma 2.1 we can consider the measure-preserving transformation $T : G^{\mathbb{N}}\times X \to G^{\mathbb{N}}\times X$ defined as $T((g_n)_{n=1}^\infty, x) := ((g_n)_{n=2}^\infty, g_1 x)$. Let us set ũ := u ∘ pr_X, where $\mathrm{pr}_X : G^{\mathbb{N}}\times X \to X$ is the projection to X. Then clearly, for g = g_1 we can write
\[ u(x) - u(gx) = \tilde u\big((g,g_2,g_3,\dots),x\big) - \tilde u\big(T((g,g_2,g_3,\dots),x)\big). \]
Now, to apply Lemma 3.2, it is left to show that
\[ \tilde u - \tilde u\circ T \in L^1(G^{\mathbb{N}}\times X,\ \mu^{\otimes\mathbb{N}}\otimes\nu). \]
Indeed, ũ − ũ∘T depends only on the first coordinate of $G^{\mathbb{N}}$ and on X, and by assumption u − u∘g ∈ $L^1(G\times X, \mu\otimes\nu)$, which ends the proof. □
Proof of Proposition 3.1.
First note that u := dν/dξ > 0 ν-almost everywhere. So
\[ I_\nu(g,x) = -\log\Big(\frac{u(gx)}{u(x)}\,\frac{dg^{-1}\xi}{d\xi}(x)\Big) = \log(u(x)) - \log(u(gx)) + I_\xi(g,x) \]
for µ ⊗ ν-a.e. (g, x) ∈ G × X. By Corollary 3.3 we see that
\[ \int_{G\times X} \log(u(x)) - \log(u(gx))\; d\mu\otimes\nu(g,x) = 0, \]
because log∘u − log∘u∘g = I_ν − I_ξ ∈ $L^1(G\times X, \mu\otimes\nu)$ by assumption. Thus we obtain
\[ \int_{G\times X} I_\nu\, d\mu\otimes\nu = \int_{G\times X} I_\xi\, d\mu\otimes\nu. \qquad\square \]

3.2. An entropy formula for certain measures
In the case when the µ-stationary probability measure ν is absolutely continuous w.r.t. a product measure, we can rewrite the entropy as a sum of averaged information functions of the single measures of the product.

Proposition 3.4.
Let $X = \prod_{i=1}^l X_i$ and let G act diagonally on X such that ν is µ-stationary. Given non-singular σ-finite measures ξ_i on X_i such that $\nu \ll \bigotimes_{i=1}^l \xi_i$, then
\[ h_\mu(X,\nu) = \sum_{i=1}^l \int_G\int_X I_{\xi_i}(g,x_i)\, d\nu\big((x_n)_{n=1}^l\big)\, d\mu(g), \]
provided $I_\nu, I_{\xi_i} \in L^1(\mu\otimes\nu)$ for all i = 1, …, l.

In particular, if for every g ∈ G and every i ∈ {1, …, l} there exists a ξ_i-conull set $X_i' \subseteq X_i$ such that $\frac{dg^{-1}\xi_i}{d\xi_i}(x_i) = \Delta_i(g)$ for all $x_i\in X_i'$, then
\[ h_\mu(X,\nu) = \sum_{i=1}^l \int_G -\log(\Delta_i(g))\, d\mu(g). \]

The proof of the above proposition follows directly from Proposition 3.1 and the next basic fact about Radon-Nikodym derivatives on product spaces.
Lemma 3.5.
Let η_i and ζ_i be σ-finite measures on Y and Z, respectively, for i = 1, 2, such that η_1 ≪ η_2 and ζ_1 ≪ ζ_2. Then
\[ \frac{d(\eta_1\otimes\zeta_1)}{d(\eta_2\otimes\zeta_2)}(y,z) = \frac{d\eta_1}{d\eta_2}(y)\,\frac{d\zeta_1}{d\zeta_2}(z). \]

Proof of Proposition 3.4.
By Proposition 3.1 we know that
\[ h_\mu(X,\nu) = \int_G\int_X -\log\Big(\frac{dg^{-1}\bigotimes_{i=1}^l\xi_i}{d\bigotimes_{i=1}^l\xi_i}(x)\Big)\,d\nu(x)\,d\mu(g). \]
Since the action is diagonal we can apply Lemma 3.5 and obtain
\[ h_\mu(X,\nu) = \sum_{i=1}^l \int_G\int_X -\log\Big(\frac{dg^{-1}\xi_i}{d\xi_i}(x_i)\Big)\,d\nu(x)\,d\mu(g). \qquad\square \]

4. An application
In this section we consider the group
\[ \Gamma = \mathbb{Z}\Big[\tfrac{1}{p_1},\dots,\tfrac{1}{p_l}\Big] \rtimes S, \quad\text{where } S := \{\, p_1^{n_1}\cdots p_l^{n_l} : n_i\in\mathbb{Z} \,\}, \]
for given distinct primes p_1, …, p_l. As before we denote
\[ \mathrm{BndEnt}(\Gamma,\tau) := \{\, h_\tau(X,\nu) : (X,\nu)\ \text{a } (\Gamma,\tau)\text{-boundary} \,\}. \]
Let us recall our main goal of this note, namely Theorem 1.1:
Theorem.
For any finite sequence of positive real numbers β_1, …, β_l there exists a finitely supported generating probability measure τ on Γ such that
\[ \mathrm{BndEnt}(\Gamma,\tau) = \Big\{\, \sum_{j\in J}\beta_j : J\subseteq\{1,\dots,l\} \,\Big\}. \]

Following [4], the p-drift of a measure τ on Γ is defined as
\[ \varphi_p(\tau) := \sum_{s\in S} \mathrm{pr}_S\tau(s)\,\log(|s|_p) \]
for a prime p. We say that τ has negative drift if its p-drift is negative for every p ∈ {p_1, …, p_l}.

Theorem 4.1 ([4, Theorem 1]). Let τ be a finitely supported, generating probability measure on $\Gamma = \mathbb{Z}[\tfrac{1}{p_1},\dots,\tfrac{1}{p_l}] \rtimes \{p_1^{n_1}\cdots p_l^{n_l} : n_i\in\mathbb{Z}\}$ with negative drift. Then the Poisson boundary of (Γ, τ) is given by
\[ (\mathbb{Q}_{p_1}\times\dots\times\mathbb{Q}_{p_l},\ \nu), \]
where ν is the unique τ-stationary probability measure on this space.

To prove Theorem 1.1 we will view $\mathbb{Q}_{p_1}\times\dots\times\mathbb{Q}_{p_l}$ as a homogeneous space w.r.t. a completion of Γ. We can diagonally embed
\[ \rho : \Gamma \hookrightarrow G := (\mathbb{Q}_{p_1}\times\dots\times\mathbb{Q}_{p_l}) \rtimes S, \]
where S denotes $\{p_1^{n_1}\cdots p_l^{n_l} : n_i\in\mathbb{Z}\}$. Note that the above embedding has a dense image, since $\mathbb{Z}[\tfrac{1}{p_1},\dots,\tfrac{1}{p_l}]$ is dense in $\mathbb{Q}_{p_i}$ for every p_i ∈ {p_1, …, p_l}. Indeed, by the Theorem of Strong Approximation (e.g. [12, Theorem 5.8]), for every $a_i\in\mathbb{Q}_{p_i}$ and every ε > 0 there exists x ∈ ℚ such that $\|x - a_i\|_{p_i} < \epsilon$ for all i = 1, …, l and $\|x\|_p \le 1$ for every prime p outside {p_1, …, p_l}. The latter property implies that $x \in \mathbb{Z}[\tfrac{1}{p_1},\dots,\tfrac{1}{p_l}]$.

Let us set $X := \mathbb{Q}_{p_1}\times\dots\times\mathbb{Q}_{p_l}$. Then, naturally, G ↷ X continuously, and X ≅ G/Ŝ for Ŝ := {0} ⋊ S becomes a homogeneous space w.r.t. the action of G.

Lemma 4.2.
Let ν be a non-singular probability measure on X. Then every G-factor of (X, ν) is of the form
\[ \Big( \prod_{j\in J}\mathbb{Q}_{p_j},\ \pi_J\nu \Big) \]
for some index subset J ⊆ {1, …, l}, where $\pi_J : X \to \prod_{j\in J}\mathbb{Q}_{p_j}$ denotes the given factor map.
Proof.
Let (Y, πν) be a measurable G-factor of (X, ν) with surjective G-equivariant factor map π. Since X = G/({0} × S), there exists a closed subgroup C of G which contains {0} × S such that (Y, πν) ≅ (G/C, φν) as G-spaces (mod 0), where φ : G/({0} × S) → G/C denotes the canonical factor map, confer e.g. [9, Chapter 4, Section 2, Proposition 2.4 (b)]. Now, if C is properly larger than {0} × S, then there exists (r, s) ∈ C with r ≠ (0, …, 0) in X, i.e. there is some j ∈ {1, …, l} such that the j-th coordinate r_j of r is non-zero in $\mathbb{Q}_{p_j}$. Then, since
\[ (\tilde s r,\, 1) = (0,\tilde s)(r,s)(0,\, s^{-1}\tilde s^{-1}) \in C \]
for any s̃ ∈ S, we obtain that the j-th coordinate in C contains all elements of the group generated by {s̃ r_j : s̃ ∈ S}, which is exactly $r_j\,\mathbb{Z}[\tfrac{1}{p_1},\dots,\tfrac{1}{p_l}]$. Since C is closed and $\mathbb{Z}[\tfrac{1}{p_1},\dots,\tfrac{1}{p_l}]$ is dense in $\mathbb{Q}_{p_j}$, we obtain that the j-th coordinate of C has to be all of $\mathbb{Q}_{p_j}$. The same argument goes through for any non-zero coordinate entry of C, such that in the end we obtain $Y \cong G/\big((\prod_{j\in J^c}\mathbb{Q}_{p_j})\rtimes S\big) \cong \prod_{j\in J}\mathbb{Q}_{p_j}$ as G-spaces for some index set J ⊆ {1, …, l}, such that the measure πν on Y is transformed to π_J ν on $\prod_{j\in J}\mathbb{Q}_{p_j}$. □
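As a computational aside (an illustration of my own, not part of the paper), the p-adic absolute values $|x|_p = p^{-v_p(x)}$ and the p-drift of [4] can be evaluated directly for finitely supported measures; here the S-marginal is encoded by exponent vectors (n_1, …, n_l) with $s = p_1^{n_1}\cdots p_l^{n_l}$, and the function names are hypothetical:

```python
import math
from fractions import Fraction

def v_p(x, p):
    """p-adic valuation of a non-zero rational x."""
    num, den, v = x.numerator, x.denominator, 0
    while num % p == 0:
        num //= p; v += 1
    while den % p == 0:
        den //= p; v -= 1
    return v

def abs_p(x, p):
    """|x|_p = p^{-v_p(x)}."""
    return Fraction(p) ** (-v_p(x, p))

def drift(marginal, primes, i):
    """phi_{p_i} = sum_s pr_S(tau)(s) log|s|_{p_i}; s encoded by exponent vectors."""
    p = primes[i]
    return sum(prob * math.log(abs_p(Fraction(p) ** exps[i], p))
               for exps, prob in marginal.items())

# Example with x = 12/5 = 2^2 * 3 / 5: |x|_2 = 1/4, |x|_3 = 1/3, |x|_5 = 5,
# and the product formula |x|_infty * prod_p |x|_p = 1 holds.
x = Fraction(12, 5)
print(abs_p(x, 2), abs_p(x, 3), abs_p(x, 5))                   # → 1/4 1/3 5
print(abs(x) * abs_p(x, 2) * abs_p(x, 3) * abs_p(x, 5) == 1)   # → True

# An S-marginal favouring positive exponents has negative drift at both primes:
iota = {(1, 1): 0.5, (-1, 0): 0.25, (0, -1): 0.25}
print(drift(iota, (2, 3), 0) < 0, drift(iota, (2, 3), 1) < 0)  # → True True
```

The product formula is also the reason why negative drift at all of p_1, …, p_l forces the drift at infinity to be positive, so no real factor appears in Theorem 4.1.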
G, µ ) we need to requiresome relations for the measures in charge. For this end let us consider the following subgroupsΛ :=
Z ⋊ { } ≤ Γ and K := Z p × . . . × Z p l ⋊ { } . We remark that (Γ , Λ) is a so-called reduced Hecke pair - i.e. has finite Λ-orbits on Γ / Λ - and(
G, K ) its Schlichting completion, see e.g. [8] for details and definitions. Important to noticeis that the coset spaces Γ / Λ and
G/K are Γ-equivariant isomorphic. Such an isomorphismis for instance given by θ : Γ / Λ −→ G/K with γ Λ ρ ( γ ) K , see [8, Proposition 3.9].Given a probability measure τ on Γ we want to construct a probability measure µ τ on G suchthat they coincidence on the coset spaces Γ / Λ and
G/K , in a sense that τ ( γ Λ) = µ τ ( ϑ ( γ Λ)) , ∀ γ Λ ∈ Γ / Λ . (4.1)Given τ ∈ Prob(Γ) we set µ τ ( f ) := X γ Λ ∈ Γ / Λ τ ( γ Λ) Z K f ( ρ ( γ ) k ) dm K ( k ) (4.2)for f : G → R measurable, where m K denotes the Haar measure on K with m K ( K ) = 1. Theabove is well-defined due to left- ρ (Λ)-invariance of m K . Remark that equation 4.1 is indeedfulfilled in this construction.Measures which are Λ-invariant on the coset spaces Γ / Λ provide a theory of relating (
G, µ τ )-stationary spaces to (Γ , τ )-stationary ones. As in our perivious work [2] we may give thesemeasures a name. Definition 4.3.
We call a probability measure τ on Γ Λ-absorbing if
\[ \tau(\lambda\gamma\Lambda) = \tau(\gamma\Lambda), \quad \forall\,\lambda\in\Lambda, \]
for all γ ∈ Γ.

Using these measures in Brofferio's result in Theorem 4.1 we now obtain a homogeneous setting. Let us fix a Haar measure $m_{\mathbb{Q}_{p_i}}$ on $\mathbb{Q}_{p_i}$ and let µ_τ denote the probability measure on G in equation (4.2) corresponding to τ.

Corollary 4.4.
Let τ be a finitely supported, generating probability measure on Γ which is Λ-absorbing and has negative drift. Then
\[ \mathrm{Poi}(G,\mu_\tau) = \mathrm{Poi}(\Gamma,\tau) = \Big( \prod_{i=1}^l \mathbb{Q}_{p_i},\ \nu \Big), \]
where ν is the unique τ-stationary and unique µ_τ-stationary probability measure on $\prod_{i=1}^l\mathbb{Q}_{p_i}$. Moreover $\nu \ll \bigotimes_{i=1}^l m_{\mathbb{Q}_{p_i}}$.

Proof. Let ν be the unique τ-stationary probability measure given by Theorem 4.1. It is not hard to show that every µ_τ-stationary probability measure is τ-stationary (see [2, Theorem 1.6 (P3)]). Thus there can be at most one µ_τ-stationary probability measure on $\mathbb{Q}_{p_1}\times\dots\times\mathbb{Q}_{p_l}$, and it has to coincide with ν. The existence of a µ_τ-stationary probability measure can be shown along the same lines as Brofferio's construction in [4] for τ-stationary probability measures. Note that µ_τ is generating for G since τ is generating for Γ ([2, Theorem 1.6 (P2)]). In particular ν is G-quasi-invariant. Since on homogeneous spaces there is a unique G-quasi-invariant measure class (see e.g. [13, Theorem 5.19]), we obtain that $\nu \ll \bigotimes_{i=1}^l m_{\mathbb{Q}_{p_i}}$. □

In [2, Corollary 4.15] it is shown that every Γ-factor of a G-ergodic space is (Γ-equivariantly) isomorphic to a G-factor; thus from Lemma 4.2 and Corollary 4.4 we can conclude

Corollary 4.5.
Let τ be a Λ-absorbing, finitely supported, generating probability measure on Γ with negative drift. Then all τ-boundaries of Γ are of the form
\[ \Big( \prod_{j\in J}\mathbb{Q}_{p_j},\ \eta_J \Big) \]
for some index set J ⊆ {1, …, l}, where Γ acts on $\prod_{j\in J}\mathbb{Q}_{p_j}$ via the embedding in G and $\eta_J \ll \bigotimes_{j\in J} m_{\mathbb{Q}_{p_j}}$.

The above Corollary 4.4 together with the entropy observation in Proposition 3.4 will yield a proof of Theorem 1.1, provided we construct τ such that it is finitely supported, Λ-absorbing and has negative drift. Let us do this construction for more general semidirect products in the next subsection.

4.1. A construction of absorbing measures to design entropy
Here R shall be a countable ring with 1 and S a multiplicative subgroup of R. Let us consider a Hecke pair (Γ, Λ) of the form
\[ \Gamma = R \rtimes S \quad\text{and}\quad \Lambda = U \rtimes \{e\}, \]
where U is an additive subgroup of R. Note that (Γ, Λ) is a Hecke pair iff [U : sU ∩ U] < ∞ for every s ∈ S.

We will construct a finitely supported, generating and Λ-absorbing probability measure τ on Γ of the following form:
\[ \tau(r,s) = \kappa_s(r)\,\iota(s) \tag{4.3} \]
for probability measures κ_s on R and ι on S. In order to obtain that τ is Λ-absorbing, i.e.
\[ \tau((u,e)(r,s)\Lambda) = \tau((r,s)\Lambda), \quad \forall\, u\in U, \]
we need
\[ \kappa_s(u + r + sU) = \kappa_s(r + sU), \quad \forall\, u\in U, \tag{4.4} \]
for every r ∈ R and s ∈ S (with ι(s) ≠ 0). To this end let V_s ⊆ R be a set of representatives for U/(sU ∩ U) with 0 ∈ V_s, where the cosets are taken w.r.t. addition. By the Hecke assumption V_s is finite for every s ∈ S. Similarly as in [2, Remark 1.15], we may thus construct Λ-absorbing probability measures in the following way.

Lemma 4.6.
Let κ be a probability measure on R. Then
\[ \kappa_s(t) := \frac{1}{|V_s|} \sum_{v_s\in V_s} \kappa(t + v_s), \quad t\in R, \]
is a probability measure on R fulfilling equation (4.4) for every s ∈ S. In particular, if Γ is finitely generated, then there exists a finitely supported, generating, Λ-absorbing probability measure τ on Γ of the form (4.3).

Proof. We need to show that κ_s is U-invariant on every coset r + sU, hence that
\[ \sum_{v_s\in V_s} \kappa(u + r + sU + v_s) = \sum_{v_s\in V_s} \kappa(r + sU + v_s) \quad\text{for every } u\in U. \]
Let W_s be a set of representatives of sU/(sU ∩ U). In particular
\[ sU = \bigsqcup_{w_s\in W_s} (w_s + sU\cap U) \quad\text{and}\quad U = \bigsqcup_{v_s\in V_s} (v_s + sU\cap U), \]
hence
\[ \bigsqcup_{v_s\in V_s} (sU + v_s) = \bigsqcup_{v_s\in V_s}\bigsqcup_{w_s\in W_s} (w_s + sU\cap U + v_s) = \bigsqcup_{w_s\in W_s} (w_s + U). \]
This implies (using u + U = U in the middle step)
\[ \sum_{v_s\in V_s} \kappa(u + r + sU + v_s) = \sum_{w_s\in W_s} \kappa(u + r + U + w_s) = \sum_{w_s\in W_s} \kappa(r + U + w_s) = \sum_{v_s\in V_s} \kappa(r + sU + v_s), \]
which ends the proof of the first statement.

It is left to show that we can choose ι and κ such that τ given by equation (4.3) is finitely supported and generating, whenever Γ is finitely generated. Let E be a finite set of generators of Γ. We shall find finitely supported probability measures ι on S and κ on R such that
\[ \mathrm{supp}(\tau) = \{ (r,s) : r\in\mathrm{supp}(\kappa_s),\ s\in\mathrm{supp}(\iota) \} \supseteq E. \tag{4.5} \]
First, we choose a finitely supported probability measure ι on S such that
\[ \mathrm{supp}(\iota) \supseteq \mathrm{pr}_S(E). \tag{4.6} \]
Such a measure always exists, for instance the uniform measure on pr_S(E). Further, for every s ∈ S we see that supp(κ_s) ⊇ supp(κ) because 0 ∈ V_s by assumption. Thus to obtain (4.5) it suffices to take κ to be a finitely supported probability measure with support containing pr_R(E), for example the uniform probability measure on pr_R(E). □

Now let us go back to the case $R = \mathbb{Z}[\tfrac{1}{p_1},\dots,\tfrac{1}{p_l}]$ and $S = \{p_1^{n_1}\cdots p_l^{n_l} : n_i\in\mathbb{Z}\}$.

Lemma 4.7.
Given β_1, …, β_l > 0, we can find a finitely supported, generating, Λ-absorbing probability measure τ on Γ such that $\varphi_{p_i}(\tau) = -\beta_i$ for every i = 1, …, l.

Proof. We will construct τ as in equation (4.3). Recall that (S, ·) ≅ (ℤ^l, +). We may thus set ι to be a product measure
\[ \iota = \bigotimes_{i=1}^l \iota_i \]
with ι_i being probability measures on ℤ. With τ as in equation (4.3) we thus obtain
\[ \varphi_{p_i}(\tau) = -\sum_{n\in\mathbb{Z}} \iota_i(n)\, n \log(p_i). \]
Hence we shall find finitely supported, generating probability measures ι_i such that
\[ \sum_{n\in\mathbb{Z}} \iota_i(n)\, n = \frac{\beta_i}{\log(p_i)} =: \alpha_i. \tag{4.7} \]
Let σ be a finitely supported, generating probability measure on ℤ with non-zero mean $\mathbb{E}[\sigma] = \sum_{n\in\mathbb{Z}}\sigma(n)\,n$, for example an asymmetric convex combination of δ_1 and δ_{−1}. For every i choose N_i such that $\alpha_i/\mathbb{E}[\sigma] < N_i$. By setting now
\[ \iota_i := \frac{\alpha_i}{\mathbb{E}[\sigma]\, N_i}\, \sigma^{*N_i} + \Big(1 - \frac{\alpha_i}{\mathbb{E}[\sigma]\, N_i}\Big)\, \delta_0, \]
we obtain an instance of a generating, finitely supported probability measure fulfilling equation (4.7), since $\mathbb{E}[\sigma^{*N_i}] = N_i\,\mathbb{E}[\sigma]$.

Recall that, in order for τ(r, s) := κ_s(r) ι(s) to be Λ-absorbing, finitely supported and generating, we have seen in the proof of Lemma 4.6 that the only constraint on ι is that supp(ι) ⊇ pr_S(E) (equation (4.6)) for a finite generating set E of Γ. To guarantee this we can just choose the above numbers N_i so large that $\prod_{i=1}^l \mathrm{supp}(\sigma^{*N_i}) \supseteq \mathrm{pr}_S(E)$, which gives supp(ι) ⊇ pr_S(E) since $\mathrm{supp}(\iota_i) = \mathrm{supp}(\sigma^{*N_i})\cup\{0\}$. □

4.2. Proof of Theorem 1.1
Let Γ be as in Theorem 1.1, and let G, Λ and $X := \mathbb{Q}_{p_1}\times\dots\times\mathbb{Q}_{p_l}$ be as above.

Proof of Theorem 1.1.
We start with the ansatz
\[ \tau(r,s) := \kappa_s(r)\, \bigotimes_{i=1}^l \iota_i(s), \]
where we identify (S, ·) ≅ (ℤ^l, +) to let $\iota = \bigotimes_{i=1}^l \iota_i$ be a product measure on S with probability measures ι_i on ℤ. By Lemmas 4.6 and 4.7 we can choose probability measures ι_i on ℤ and κ_s on $\mathbb{Z}[\tfrac{1}{p_1},\dots,\tfrac{1}{p_l}]$ such that τ becomes a Λ-absorbing, finitely supported and generating probability measure with the property that
\[ \varphi_{p_i}(\tau) = -\beta_i, \quad \forall\, i. \tag{4.8} \]
In particular the drift is negative, and thus by Corollary 4.5 we know that every τ-boundary is of the form
\[ (Y_J, \eta_J) := \Big( \prod_{j\in J}\mathbb{Q}_{p_j},\ \pi_J\nu \Big) \]
for some index set J ⊆ {1, …, l}, with ν given by Theorem 4.1 and $\pi_J\nu \ll \bigotimes_{j\in J} m_{\mathbb{Q}_{p_j}}$. Thus by Proposition 3.4 we obtain
\[ h_\tau(Y_J,\eta_J) = \sum_{j\in J} \int_\Gamma -\log(|s|_{p_j})\, d\tau(r,s) = \sum_{j\in J} -\varphi_{p_j}(\tau), \]
since
\[ \frac{d(r,s)^{-1} m_{\mathbb{Q}_{p_j}}}{d m_{\mathbb{Q}_{p_j}}} = |s|_{p_j}. \]
Thus by equation (4.8) we obtain $h_\tau(Y_J,\eta_J) = \sum_{j\in J}\beta_j$. □

References
1. Dmitry V. Anosov, On an additive functional homology equation connected with an ergodic rotation of the circle. Math. USSR-Izv. 7, no. 6 (1973), 1257-1271.
2. Michael Björklund, Yair Hartman, Hanna Oppelmayer, Random walks on dense subgroups of locally compact groups, preprint, 2020. (arXiv:2006.15705v1)
3. Lewis Bowen, Random walks on random coset spaces with applications to Furstenberg entropy. Invent. Math. 196 (2014), 485-510.
4. Sara Brofferio, Poisson boundary for finitely generated groups of random rational affinities. Journal of Mathematical Sciences 156 (2009), 1-10.
5. Alex Furman, Random walks on groups and random transformations. Handbook of dynamical systems, Vol. 1A, 931-1014, North-Holland, Amsterdam, 2002.
6. Hillel Furstenberg, Noncommuting random products. Trans. Amer. Math. Soc. 108 (1963), 377-428.
7. Yair Hartman, Omer Tamuz, Furstenberg entropy realizations for virtually free groups and lamplighter groups. JAMA 126 (2015), 227-257.
8. Steven Kaliszewski, Magnus B. Landstad, John Quigg, Hecke C*-algebras, Schlichting completions and Morita equivalence. Proceedings of the Edinburgh Mathematical Society 51(3) (2008), 657-695.
9. Gregori A. Margulis, Discrete Subgroups of Semisimple Lie Groups. Springer-Verlag, Berlin-Heidelberg-New York, 1991.
10. Amos Nevo, The spectral theory of amenable actions and invariants of discrete groups. Geometriae Dedicata 100 (2003), no. 1, 187-218.
11. Amos Nevo, Robert J. Zimmer, Rigidity of Furstenberg entropy for semisimple Lie group actions. Annales scientifiques de l'É.N.S., 4th series, 33(3) (2000), 321-343.
12. Dinakar Ramakrishnan, Robert J. Valenza, Fourier analysis on number fields. Graduate Texts in Mathematics 186, Springer-Verlag, New York, 1999.
13. Veeravalli S. Varadarajan, Geometry of quantum theory. Second edition. Springer-Verlag, New York, 1985.
Department of Mathematical Sciences, Chalmers University of Technology, Gothenburg, Sweden
E-mail address: