Markovianity and the Thompson Monoid F^+
CLAUS KÖSTLER, ARUNDHATHI KRISHNAN, AND STEPHEN J. WILLS
Abstract.
We introduce a new distributional invariance principle, called 'partial spreadability', which emerges from the representation theory of the Thompson monoid F^+ in the framework of W*-algebraic probability spaces. We show that partial spreadability implies Markovianity of noncommutative stationary processes (in the sense of B. Kümmerer and as they are considered by U. Haagerup and M. Musat within the context of factorisable Markov maps). Conversely, we show that a large class of noncommutative stationary Markov processes provides representations of the Thompson monoid F^+. In the particular case of a classical probability space, our approach anticipates the availability of a de Finetti theorem for recurrent stationary Markov chains with values in a standard Borel space, generalizing an early result of P. Diaconis and D. Freedman.

1. Introduction
Distributional symmetries and invariance principles provide deep structural results on stochastic processes. In the early 1930s de Finetti characterized an exchangeable infinite sequence of {0, 1}-valued random variables as a mixture of i.i.d. random variables. This result was extended to random variables with values in a compact Hausdorff space by Hewitt and Savage, and further to values in a standard Borel space by Aldous. Furthermore, Ryll-Nardzewski [RN57] established that the apparently weaker distributional symmetry of spreadability (also known as 'contractibility' in the literature) is equivalent to exchangeability for infinite sequences. A de Finetti theorem for Markov chains was finally found by Diaconis and Freedman in the framework of recurrent infinite sequences of I-valued random variables, where I is a countable set [DF80]. Recently it was realized by Evans, Gohm and Köstler that actually not exchangeability but spreadability is the fundamental distributional symmetry from the viewpoint of algebraic topology and homological algebra [EGK17]. A new equivalent characterization of spreadability was obtained in [EGK17, Theorem 1.2], which connects this distributional invariance principle to the representation theory of the so-called partial shift monoid

S^+ = ⟨ h_0, h_1, h_2, . . . | h_k h_ℓ = h_{ℓ+1} h_k for 0 ≤ k ≤ ℓ < ∞ ⟩^+.

Roughly phrasing, S^+ algebraically encodes stochastic independence in classical probability such that its related geometry can be transferred to the framework of noncommutative probability. As this monoid is a quotient of the Thompson monoid

F^+ = ⟨ g_0, g_1, g_2, . . .
| g_k g_ℓ = g_{ℓ+1} g_k for 0 ≤ k < ℓ < ∞ ⟩^+,

it is intriguing to ask if its representation theory is connected to a novel distributional invariance principle which characterizes a larger class of (noncommutative) random objects than those characterized by spreadability.

The goal of the present paper is to introduce 'partial spreadability' as a new distributional invariance principle as it emerges from the representation theory of the Thompson monoid F^+, and to present first pioneering results on the following surprising discovery: the monoid F^+ algebraically encodes Markovianity in classical probability. Of course, the underlying approach is again such that it can be transferred to an operator algebraic framework of noncommutative probability.

Our main results are obtained in an operator algebraic setting of noncommutative probability theory, where the attribute 'noncommutative' is understood in the sense of 'not-necessarily-commutative', and thus applies also to classical probability theory. To be more precise, we consider pairs (M, ψ) as (noncommutative) probability spaces which consist of a von Neumann algebra M and a faithful normal state ψ on M. Such pairs are also known as W*-algebraic probability spaces in the literature. A noncommutative random variable ι from the probability space (A, ϕ) into the probability space (M, ψ) is given by an injective *-homomorphism ι: A → M such that ψ ∘ ι = ϕ, and written as ι: (A, ϕ) → (M, ψ).

To improve the accessibility of our main results, we sketch next how one arrives at such noncommutative notions of probability spaces and random variables when starting from a traditional probabilistic setting. Let (Ω, Σ, µ) be a standard probability space. Then L := L^∞(Ω, Σ, µ), the Lebesgue space of essentially bounded C-valued measurable functions, is a commutative von Neumann algebra and, with f ∈ L, the Lebesgue integral tr_µ(f) := ∫_Ω f dµ is a faithful normal tracial state on L.
In other words, tr_µ(f) is the expectation of the essentially bounded C-valued random variable f ∈ L, such that tr_µ(f* f) = tr_µ(|f|^2) = 0 implies f = 0 (in the Lebesgue sense). The pair (L, tr_µ) is the standard example of a noncommutative probability space coming from classical probability theory. Conversely, a noncommutative probability space (M, ψ) with a commutative von Neumann algebra M can be seen to be isomorphic to this standard example, provided M has a separable predual.

Now let (Ω_0, Σ_0) be a standard Borel space and consider an (Ω_0, Σ_0)-valued random variable ξ on (Ω, Σ, µ). Denote by µ_0 := µ ∘ ξ^{-1} the pushforward measure of µ and by tr_{µ_0} := ∫_{Ω_0} · dµ_0 the induced (tracial) state on L_0 := L^∞(Ω_0, Σ_0, µ_0). Then ι(f) := f ∘ ξ defines an injective *-homomorphism from L_0 = L^∞(Ω_0, Σ_0, µ_0) into L = L^∞(Ω, Σ, µ) such that tr_µ ∘ ι = tr_{µ_0}. Altogether we have arrived at the algebraization of the random variable ξ to the noncommutative random variable ι: (L_0, tr_{µ_0}) → (L, tr_µ). Conversely, if ι: (A, ϕ) → (M, ψ) is an injective *-homomorphism, where M (and thus A) is a commutative von Neumann algebra with separable predual, then there exist two standard probability spaces (Ω, Σ, µ) and (Ω_0, Σ_0, µ_0), and an (Ω_0, Σ_0)-valued random variable ξ on (Ω, Σ, µ) with µ ∘ ξ^{-1} = µ_0, such that ι is of the above form f ↦ f ∘ ξ, up to isomorphisms between the involved standard probability spaces.

This algebraization procedure, roughly phrasing, puts the emphasis on the σ-algebra generated by a random variable and less on the random variable itself. Effectively, an (unbounded) random variable ξ may be replaced by an (essentially bounded) random variable f ∘ ξ = ι(f), as long as f ∈ L_0 is chosen such that ξ and f ∘ ξ generate the same σ-algebra.
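The algebraization just described can be made concrete on a finite probability space. The following Python sketch is our own illustration (the names mu, xi, iota, f are ad hoc, not from the paper): it builds the pushforward measure and checks that ι(f) := f ∘ ξ is a multiplicative, state-preserving embedding, i.e. tr_µ ∘ ι = tr_{µ_0}.

```python
from fractions import Fraction

# Our own finite sketch of the algebraization (all names ad hoc):
# iota(f) = f . xi is a state-preserving *-homomorphism.

mu = {"w1": Fraction(1, 4), "w2": Fraction(1, 4),      # probability space
      "w3": Fraction(1, 3), "w4": Fraction(1, 6)}      # (Omega, mu)
xi = {"w1": 0, "w2": 1, "w3": 1, "w4": 0}              # xi: Omega -> {0, 1}

# Pushforward measure mu0 = mu . xi^{-1} on Omega_0 = {0, 1}.
mu0 = {}
for w, p in mu.items():
    mu0[xi[w]] = mu0.get(xi[w], Fraction(0)) + p

def tr(measure, f):
    """The state tr_mu: expectation of f w.r.t. a finite probability measure."""
    return sum(p * f(x) for x, p in measure.items())

def iota(f):
    return lambda w: f(xi[w])                # iota(f) = f . xi

f = lambda x: 3 * x + 1
g = lambda x: x * x

assert tr(mu, iota(f)) == tr(mu0, f)         # tr_mu . iota = tr_mu0
assert all(iota(lambda t: f(t) * g(t))(w) == iota(f)(w) * iota(g)(w)
           for w in mu)                      # iota is multiplicative
```

All checks use exact rational arithmetic, so the state-preservation identity holds on the nose rather than up to floating-point error.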
Of course, this observation for a single random variable f ∘ ξ extends immediately to multivariate settings, as the family of functions {f_i}_{i ∈ I} ⊂ L_0 yields the family of (bounded) random variables {ι(f_i) = f_i ∘ ξ}_{i ∈ I}.

After this expository excursion to an algebraic reformulation of classical probability, we return now to the general noncommutative setting. For the sake of clarity of our new approach, let us first recall the following definition of spreadability as a distributional invariance principle, which was identified in [Kö10, EGK17] to be equivalent to its more traditional formulation.

Definition 1.0.1 ([Kö10, EGK17]). A sequence of random variables ι ≡ (ι_n)_{n ≥ 0}: (A, ϕ) → (M, ψ) is spreadable if there exists a representation ρ: S^+ → End(M, ψ) such that the following localization and stationarity properties are satisfied:

ι_0 = ρ(h_n) ι_0 for all n ≥ 1; (L)
ι_n = ρ(h_0^n) ι_0 if n ≥ 0. (S)

More generally, ι is said to be spreadable if there exists a spreadable sequence ι̃ such that ι distr= ι̃.

Replacing the role of the partial shift monoid S^+ by the Thompson monoid F^+, we are now in the position to introduce a natural generalization of spreadability as a new distributional invariance principle.

Definition 1.0.2. A sequence of random variables ι ≡ (ι_n)_{n ≥ 0}: (A, ϕ) → (M, ψ) is partially spreadable if there exists a representation ρ: F^+ → End(M, ψ) such that the following localization and stationarity properties are satisfied:

ι_0 = ρ(g_n) ι_0 for all n ≥ 1; (L)
ι_n = ρ(g_0^n) ι_0 if n ≥ 0. (S)
More generally, ι is said to be partially spreadable if there exists a partially spreadable sequence ι̃ such that ι distr= ι̃.

Clearly spreadability implies partial spreadability. Of course, the crucial question is whether partial spreadability allows one to develop results of de Finetti type similar to those available for spreadability, especially in the general framework of noncommutative probability. As detailed in [Kö10], conditional independence in classical probability is generalized to a geometric notion which we call here 'conditional CS-independence' and which is intimately related to Popa's notion of commuting squares in subfactor theory. This geometric viewpoint on a very general notion of noncommutative independence is justified by the following noncommutative extended de Finetti theorem.

Theorem 1.0.3 ([Kö10]). Let ι ≡ (ι_n)_{n ≥ 0}: (A, ϕ) → (M, ψ) be a sequence of (identically distributed) random variables and consider the following conditions:
(a) ι is spreadable;
(b) ι is stationary and conditionally CS-independent;
(c) ι is identically distributed and conditionally CS-independent.
Then one has the implications (a) ⇒ (b) ⇒ (c). Moreover there are counterexamples to all converse implications.
We refer the interested reader to the introduction of [Kö10] to learn more on why one should not expect an equivalence of these three statements in the general framework of noncommutative probability, in contrast to the situation of classical probability or free probability.

Replacing spreadability by partial spreadability, we have succeeded in establishing the following main result of de Finetti type, which may be regarded as a noncommutative generalization of Diaconis and Freedman's classical result for recurrent Markov chains with an at most countable state space [DF80].
Theorem 1.0.4.
Let ι ≡ (ι_n)_{n ≥ 0}: (A, ϕ) → (M, ψ) be a sequence of (identically distributed) random variables and consider the following conditions:
(a) ι is partially spreadable;
(b) ι is stationary and conditionally Markovian;
(c) ι is identically distributed and conditionally Markovian.
Then one has the implications (a) ⇒ (b) ⇒ (c).

Clearly, the converse implication from (c) to (b) fails for the obvious reason that an identically distributed sequence may not be stationary. Presently we conjecture that the conditions (a) and (b) are equivalent in the framework of classical probability (where the von Neumann algebra M is isomorphic to L). This converse implication is established in Theorem 4.2.6, which shows that a stationary Markov sequence is partially spreadable in the present algebraic framework of classical probability. This extends of course to mixtures of such stationary Markov sequences, a result which we have omitted in the present version of this paper. Let us remark here that one should not expect that (b) implies (a) in the general noncommutative setting, for reasons similar to those behind the corresponding failure in Theorem 1.0.3.

Of course, the elephant in the room created by the present formulation of Theorem 1.0.4 is the underlying notion of 'conditional Markovianity', which we have so far left unspecified and which is cast in Definition 2.5.7. A further specification of 'conditional Markovianity' is beyond the scope of the present paper, as it requires additional fixed point characterization results, similar to those available in the context of spreadability or, in more general terms, as needed for concrete versions of the Krein-Milman theorem in the context of distributional symmetries and invariance principles.

We are left to outline the content of this paper.
Section 2 introduces definitions, notation and some background results on the Thompson monoid F^+ (Subsection 2.1), the partial shift monoid S^+ (Subsection 2.2), noncommutative probability spaces and Markov maps (Subsection 2.3), noncommutative random variables and distributional invariance principles (Subsection 2.4), noncommutative independence and Markovianity (Subsection 2.5), and finally noncommutative stationary processes (Subsection 2.6).

Section 3 investigates representations of the Thompson monoid F^+ in the endomorphisms of noncommutative probability spaces. Subsection 3.1 introduces the generating property of representations of F^+ in Definition 3.1.1. It is shown in Subsection 3.2 that certain fixed point algebras of the represented generators of F^+ provide triangular towers of commuting squares. Markovianity is obtained from this as a particular property, see Corollary 3.2.5. A main result of Subsection 3.3 is Theorem 3.3.3, which establishes that certain noncommutative stationary processes with partial spreadability have a Markovian filtration. Finally, the proof of the main result, Theorem 1.0.4, is completed in Subsection 3.4.

Section 4 is about certain constructions of representations of the Thompson monoid F^+. Here we focus on tensor product constructions in Subsection 4.1. Constructions in the setting of classical probability are the subject of Subsection 4.2. Finally, we turn our attention in Subsection 4.3 to constructions in the general framework of operator algebras. We introduce monoids and investigate their representation theory, to adapt and refine Kümmerer's approach to noncommutative Markov processes as certain perturbations of noncommutative Bernoulli shifts.

2. Preliminaries
2.1. Thompson group F and its monoid F^+. The Thompson group F, originally introduced by Richard Thompson in 1965 as a certain group of piecewise linear homeomorphisms on the interval [0, 1], has the infinite presentation

F = ⟨ g_0, g_1, g_2, . . . | g_k g_ℓ = g_{ℓ+1} g_k for 0 ≤ k < ℓ < ∞ ⟩. (2.1.1)

As the defining relations of this presentation involve no inverse generators, one can associate to it the monoid

F^+ = ⟨ g_0, g_1, g_2, . . . | g_k g_ℓ = g_{ℓ+1} g_k for 0 ≤ k < ℓ < ∞ ⟩^+, (2.1.2)

henceforth referred to as the Thompson monoid F^+. We remark that, alternatively, the generators of this monoid can be obtained as morphisms (in the inductive limit) of the category of finite binary forests, see for example [Be04, Jo18].

An element e ≠ g ∈ F^+ has the unique normal form

g = g_k^{a_k} g_{k-1}^{a_{k-1}} · · · g_1^{a_1} g_0^{a_0}, (2.1.3)

where a_0, a_1, . . . , a_{k-1}, a_k ∈ N_0 with a_k > 0, k ∈ N_0 (see [DT18, Proposition 2.2], for example).

Definition 2.1.1.
Let m, n ∈ N_0 with m ≤ n be fixed. The (m, n)-partial shift sh_{m,n} is the endomorphism on F^+ defined by

sh_{m,n}(g_k) = g_m if k = 0, and sh_{m,n}(g_k) = g_{n+k} if k ≥ 1.

We remark that the map sh_{m,n} preserves all defining relations of F^+ and is thus well-defined as an endomorphism.

Lemma 2.1.2.
The endomorphisms sh_{m,n} on F^+ are injective for all m, n ∈ N_0 with m ≤ n.

Proof.
Let g ∈ F^+ have the (unique) normal form as stated in (2.1.3). Then

sh_{m,n}(g) = g_{n+k}^{a_k} g_{n+k-1}^{a_{k-1}} · · · g_{n+1}^{a_1} g_m^{a_0}

is again in normal form. The injectivity of the map sh_{m,n} is concluded from the uniqueness of the normal form. □

2.2. The partial shift monoid S^+. Stipulating additional relations to those for the generators in (2.1.2), one obtains the monoid

S^+ = ⟨ h_0, h_1, h_2, . . . | h_k h_ℓ = h_{ℓ+1} h_k for 0 ≤ k ≤ ℓ < ∞ ⟩^+ (2.2.1)

as a quotient of the Thompson monoid F^+. The monoid S^+ is not a submonoid of the group with infinite presentation ⟨ h_0, h_1, h_2, . . . | h_k h_ℓ = h_{ℓ+1} h_k for 0 ≤ k ≤ ℓ < ∞ ⟩, as the latter is isomorphic to the additive group Z. Actually the generators h_k of the monoid S^+ satisfy relations as they are familiar for coface maps in simplicial cohomology. As discussed in [EGK17], these generators arise as morphisms in the direct limit of the semi-cosimplicial category ∆_S. This becomes evident when considering the following representation of S^+, which in particular motivates addressing S^+ as the partial shift monoid and its generators h_k as partial shifts.

Lemma 2.2.1 (Partial shifts). The maps (θ_k)_{k ≥ 0}: N_0 → N_0, defined by

θ_k(n) = n + 1 if k ≤ n, and θ_k(n) = n if k > n,

satisfy the relations θ_k θ_ℓ = θ_{ℓ+1} θ_k for 0 ≤ k ≤ ℓ < ∞.

2.3. Noncommutative probability spaces and Markov maps.
Throughout, a (noncommutative) probability space (M, ϕ) consists of a von Neumann algebra M and a faithful normal state ϕ on M. If M is abelian and acts on a separable Hilbert space, then (M, ϕ) is isomorphic to (L^∞(Ω, Σ, µ), ∫_Ω · dµ) for some standard probability space (Ω, Σ, µ). A Markov map T: (M, ϕ) → (N, ψ) between two probability spaces is a linear map from M to N satisfying:
(i) T is completely positive,
(ii) T is unital,
(iii) ψ ∘ T = ϕ,
(iv) T ∘ σ^ϕ_t = σ^ψ_t ∘ T for all t ∈ R.
Here σ^ϕ and σ^ψ denote the modular automorphism groups of (M, ϕ) and (N, ψ), respectively. If (M, ϕ) = (N, ψ), we say that T is a Markov map on (M, ϕ). Conditions (i) to (iii) imply that a Markov map is automatically normal. Condition (iv) is equivalent to the existence of a (unique) Markov map T*: (N, ψ) → (M, ϕ) such that

ϕ(T*(y) x) = ψ(y T(x)) (x ∈ M, y ∈ N).

The Markov map T* is called the adjoint of T, and T is called self-adjoint if T = T*. We note that condition (iv) is automatically satisfied whenever ϕ and ψ are tracial, in particular for abelian von Neumann algebras M and N. Remark 2.3.1.
Usually a Markov operator T on a von Neumann algebra M is defined to be a unital normal completely positive linear map from M to itself. Thus the above notion of a Markov map T on (M, ϕ) is more restrictive, as the existence of a faithful normal state ϕ with the stationarity condition ϕ ∘ T = ϕ is required. In particular, as recurrence for Markov operators in noncommutative probability is defined via support properties of stationary normal states (see [GK12]), every non-zero orthogonal projection p ∈ M is positive recurrent w.r.t. a Markov map T on (M, ϕ), as the faithful state ϕ has the support projection 1_M ∈ M.

2.4. Noncommutative random variables and distributional invariance principles.
Let (A, ϕ) and (M, ψ) be two probability spaces. A (noncommutative) random variable ι is an injective *-homomorphism ι: A → M satisfying two additional properties:
(i) ϕ = ψ ∘ ι;
(ii) ι(A) is ψ-conditioned.
A random variable will also be addressed as the mapping ι: (A, ϕ) → (M, ψ). If ι̃: (A, ϕ) → (M̃, ψ̃) is another random variable, then ι and ι̃ have the same moment sequence and thus are identically distributed. Given the (identically distributed) sequence of random variables

ι ≡ (ι_n)_{n ∈ N_0}: (A, ϕ) → (M, ψ),

the family A_• ≡ (A_I)_{I ⊂ N_0}, with von Neumann subalgebras A_I := ⋁_{i ∈ I} ι_i(A), is called the canonical filtration (generated by ι). The sequence ι is said to be minimal if A_{N_0} = M. A sequence ι can always be turned into a minimal sequence by restriction. If

ι̃ ≡ (ι̃_n)_{n ∈ N_0}: (A, ϕ) → (M̃, ψ̃)

is another sequence of random variables, then ι and ι̃ are said to have the same distribution, in symbols (ι_0, ι_1, ι_2, . . .) distr= (ι̃_0, ι̃_1, ι̃_2, . . .) or just ι distr= ι̃, if

ψ(ι_{k_1}(a_1) ι_{k_2}(a_2) · · · ι_{k_n}(a_n)) = ψ̃(ι̃_{k_1}(a_1) ι̃_{k_2}(a_2) · · · ι̃_{k_n}(a_n))

for all k_1, k_2, . . . , k_n ∈ N_0, a_1, a_2, . . . , a_n ∈ A and n ∈ N.

Definition 2.4.1.
The sequence of random variables ι ≡ (ι_n)_{n ∈ N_0}: (A, ϕ) → (M, ψ) is said to be
(i) stationary if (ι_0, ι_1, ι_2, . . .) distr= (ι_n, ι_{n+1}, ι_{n+2}, . . .) for all n ∈ N_0;
(ii) spreadable if (ι_0, ι_1, ι_2, . . .) distr= (ι_{n_0}, ι_{n_1}, ι_{n_2}, . . .) for any increasing subsequence (n_0, n_1, n_2, . . .) of (0, 1, 2, . . .);
(iii) exchangeable if (ι_0, ι_1, ι_2, . . .) distr= (ι_{σ(0)}, ι_{σ(1)}, ι_{σ(2)}, . . .) for all permutations σ ∈ S_∞.
Here S_∞ denotes the group of all finite permutations of the set N_0, such that the Coxeter generator σ_k ∈ S_∞ is the transposition (k − 1, k).

It is elementary to verify that one has the following hierarchy:

exchangeability ⇒ spreadability ⇒ stationarity.

These three distributional invariance principles can be equivalently reformulated.
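The hierarchy can be tested on the simplest classical example: a mixture of i.i.d. coin flips, the prototype of a de Finetti-type object. The following Python sketch is our own illustration (the mixture is chosen ad hoc, not taken from the paper): it computes the exact joint law of a trajectory of length 8 and checks invariance under permutations, under passage to increasing subsequences, and under shifts of the index set.

```python
from fractions import Fraction
from itertools import product

# Our own finite illustration: a mixture of i.i.d. coin flips is
# exchangeable, hence spreadable, hence stationary.

# Pick a coin with weights 1/3, 2/3; then flip the chosen coin i.i.d.
mixture = [(Fraction(1, 3), Fraction(1, 4)), (Fraction(2, 3), Fraction(3, 5))]

# Exact joint law of the trajectory (X_0, ..., X_7).
law = {}
for path in product([0, 1], repeat=8):
    p = Fraction(0)
    for weight, bias in mixture:
        q = weight
        for x in path:
            q *= bias if x == 1 else 1 - bias
        p += q
    law[path] = p

def marginal(indices, values):
    """P(X_{indices[0]} = values[0], ...), summing out all other coordinates."""
    return sum(p for path, p in law.items()
               if all(path[i] == v for i, v in zip(indices, values)))

for w in product([0, 1], repeat=3):
    base = marginal((0, 1, 2), w)
    assert marginal((2, 0, 1), w) == base    # exchangeable: permuted indices
    assert marginal((1, 4, 7), w) == base    # spreadable: increasing subsequence
    assert marginal((5, 6, 7), w) == base    # stationary: shifted indices
```

Because the law is a genuine mixture rather than a single product measure, the coordinates are not independent; the invariances above are exactly the content of the classical de Finetti picture recalled in the introduction.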
Proposition 2.4.2.
Suppose ι ≡ (ι_n)_{n ∈ N_0}: (A, ϕ) → (M, ψ) is a minimal sequence of random variables.
(i) The sequence ι is stationary if and only if there exists α ∈ End(M, ψ) such that ι_n = α^n ι_0 for all n ∈ N_0.
(ii) The sequence ι is spreadable if and only if there exists a representation ρ: S^+ → End(M, ψ) such that ι_0 = ρ(h_k) ι_0 for all k ≥ 1 and ι_n = ρ(h_0^n) ι_0 for all n ∈ N_0.
(iii) The sequence ι is exchangeable if and only if there exists a representation ρ_perm: S_∞ → Aut(M, ψ) such that ι_0 = ρ_perm(σ_k) ι_0 for k ≥ 2 and ι_n = ρ_perm(σ_n σ_{n−1} · · · σ_1) ι_0 for all n ∈ N.

Proof. For (i) see [Kö10]. For (ii) see [Kö10, EGK17]. For (iii) see [GK09a]. □
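The representations in (ii) are modelled on the partial shifts θ_k of Lemma 2.2.1, whose relations can be checked by a brute-force computation. The following Python sketch is our own illustration: it verifies the defining relations of S^+ on an initial segment of N_0. Note that the case k = ℓ is exactly the additional relation which collapses the Thompson monoid F^+ to its quotient S^+.

```python
# Our own brute-force check of the relations in Lemma 2.2.1 for the
# partial shifts theta_k on N_0.

def theta(k):
    """theta_k(n) = n + 1 if k <= n, and n if k > n."""
    return lambda n: n + 1 if k <= n else n

for k in range(12):
    for l in range(k, 12):            # 0 <= k <= l; the case k == l is the
        for n in range(60):           # extra relation that S+ adds to F+
            assert theta(k)(theta(l)(n)) == theta(l + 1)(theta(k)(n))
```

The check is of course only on a finite range; the relations hold on all of N_0 by the three-case argument (n < k, k ≤ n < ℓ, ℓ ≤ n).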
The equivalent formulation of spreadability in (ii) and the simple observation that S^+ is a quotient of the Thompson monoid F^+ catalyzed our introduction of partial spreadability in Definition 1.0.2 as a novel distributional invariance principle. This implies the extended hierarchy:

exchangeability ⇒ spreadability ⇒ partial spreadability ⇒ stationarity.

2.5. Noncommutative independence and Markovianity.
Out of Kümmerer's investigations on the structure of noncommutative Markov dilations emerged the insight that Popa's geometric notion of commuting squares provides a rich framework for noncommutative independence. This notion manifests itself also in the noncommutative extended de Finetti theorem, Theorem 1.0.3. After having introduced commuting squares of von Neumann algebras and some of their properties, as they are well known in subfactor theory, we reinterpret these geometric objects from the viewpoint of noncommutative probability theory, to define (conditional) CS-independence.
Proposition 2.5.1.
Let M_0, M_1, M_2 be ϕ-conditioned von Neumann subalgebras of the probability space (M, ϕ) such that M_0 ⊂ (M_1 ∩ M_2). Then the following are equivalent:
(i) E_{M_0}(xy) = E_{M_0}(x) E_{M_0}(y) for all x ∈ M_1 and y ∈ M_2;
(ii) E_{M_1} E_{M_2} = E_{M_0};
(iii) E_{M_1}(M_2) = M_0;
(iv) E_{M_1} E_{M_2} = E_{M_2} E_{M_1} and M_1 ∩ M_2 = M_0.
In particular, it holds that M_0 = M_1 ∩ M_2 if one and thus all of these four assertions are satisfied.

Proof. The tracial case for ϕ is proved in [GHJ89, Prop. 4.2.1]. The non-tracial case follows from this, after some minor modifications of the arguments therein. □

Definition 2.5.2.
The inclusions

M_1 ⊂ M
 ∪     ∪
M_0 ⊂ M_2

as given in Proposition 2.5.1 are said to form a commuting square (of von Neumann algebras) if one (and thus all) of the equivalent conditions (i) to (iv) in Proposition 2.5.1 are satisfied.

Notation 2.5.3.
We write I < J for two subsets I, J ⊂ N_0 if i < j for all i ∈ I and j ∈ J. The cardinality of I is denoted by |I|. For N ∈ N_0, we denote by I + N the shifted set {i + N | i ∈ I}. Finally, I(N_0) denotes the set of all 'intervals' of N_0, i.e. sets of the form [m, n] := {m, m + 1, . . . , n} or [m, ∞) := {m, m + 1, . . .} for 0 ≤ m ≤ n < ∞.

Definition 2.5.4.
Let (M, ψ) be a probability space with three ψ-conditioned von Neumann subalgebras M_0, M_1 and M_2. Then M_1 and M_2 are said to be CS-independent over M_0, or conditionally CS-independent, if the inclusions

M_0 ∨ M_1 ⊂ M
    ∪       ∪
   M_0  ⊂ M_0 ∨ M_2

form a commuting square.

The inclusion M_0 ⊂ (M_1 ∩ M_2) is not assumed in this definition, and its failure occurs frequently in the context of distributional invariance principles, see for example [Kö10, Example 4.6].

Definition 2.5.5.
Let N be a von Neumann subalgebra of (M, ψ). A family of von Neumann subalgebras {A_n}_{n ∈ N_0} of (M, ψ) is called
(i) order CS-independent over N if ⋁_{i ∈ I} A_i and ⋁_{j ∈ J} A_j are CS-independent over N for any I, J ⊂ N_0 with I < J or J < I;
(ii) full CS-independent over N if ⋁_{i ∈ I} A_i and ⋁_{j ∈ J} A_j are CS-independent over N for any I, J ⊂ N_0 with I ∩ J = ∅.
The sequence of random variables ι ≡ (ι_n)_{n ∈ N_0}: (A, ϕ) → (M, ψ) with canonical filtration (A_I)_{I ⊂ N_0} is called
(i) order CS-independent over N if A_I and A_J are CS-independent over N for any I, J ∈ I(N_0) with I ∩ J = ∅;
(ii) full CS-independent over N if A_I and A_J are CS-independent over N for any I, J ⊂ N_0 with I ∩ J = ∅.
Here N is some von Neumann subalgebra of (M, ψ).

A family of ψ-conditioned von Neumann subalgebras M_• ≡ (M_I)_{I ∈ I(N_0)} of the probability space (M, ψ) is called a filtration (of (M, ψ)) if

I ⊂ J ⇒ M_I ⊂ M_J. (Isotony)

The isotony property ensures that inclusions are valid as they are assumed for commuting squares. To be more precise, it holds that

M_I ⊂ M
 ∪    ∪
M_K ⊂ M_J

for I, J, K ∈ I(N_0) with K ⊂ (I ∩ J).
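A standard concrete instance of CS-independence is tensor independence. The following Python sketch is our own finite-dimensional illustration (the choices are ad hoc, not from the paper): for M = M_2(C) ⊗ M_2(C) with the normalized trace, the trace-preserving conditional expectations E1, E2 onto the two tensor factors and E0 onto the scalars satisfy the commuting square conditions of Proposition 2.5.1.

```python
import numpy as np

# Our own finite-dimensional illustration: the tensor square
# M = M_2(C) (x) M_2(C) with normalized trace; E1, E2 are the
# trace-preserving conditional expectations onto the tensor factors.

rng = np.random.default_rng(0)
d = 2
I2 = np.eye(d)

def tr_state(X):
    return np.trace(X) / X.shape[0]          # the faithful normal trace state

def E1(X):                                   # onto M_2(C) (x) 1
    X4 = X.reshape(d, d, d, d)
    return np.kron(np.einsum('ikjk->ij', X4) / d, I2)

def E2(X):                                   # onto 1 (x) M_2(C)
    X4 = X.reshape(d, d, d, d)
    return np.kron(I2, np.einsum('kikj->ij', X4) / d)

def E0(X):                                   # onto the scalars C.1
    return tr_state(X) * np.eye(d * d)

X = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
# Conditions (ii) and (iv) of Proposition 2.5.1:
assert np.allclose(E1(E2(X)), E0(X))
assert np.allclose(E2(E1(X)), E1(E2(X)))
# Factorization (i): E0(xy) = E0(x) E0(y) for x in M_1, y in M_2.
a, b = rng.standard_normal((d, d)), rng.standard_normal((d, d))
x, y = np.kron(a, I2), np.kron(I2, b)
assert np.allclose(E0(x @ y), E0(x) @ E0(y))
```

The partial traces are implemented by contracting one tensor leg with `einsum`; dividing by d makes each expectation unital and trace-preserving.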
Definition 2.5.6.
A filtration M_• ≡ (M_I)_{I ∈ I(N_0)} of the probability space (M, ϕ) is said to be Markovian if

E_{M_{[0,n]}} E_{M_{[n,∞)}} = E_{M_{[n,n]}} for all n ≥ 0. (M)

Here E_{M_I} denotes the ϕ-preserving normal conditional expectation from M onto M_I.

The isotony property ensures the inclusion M_{[n,n]} ⊂ (M_{[0,n]} ∩ M_{[n,∞)}). Thus property (M) is equivalent to the statement that the inclusions

M_{[0,n]} ⊂ M
    ∪       ∪
M_{[n,n]} ⊂ M_{[n,∞)}

form a commuting square for each n ∈ N_0. So Markovianity has many equivalent formulations, see Proposition 2.5.1.

Definition 2.5.7.
A sequence of random variables ι ≡ (ι_n)_{n ∈ N_0}: (A, ϕ) → (M, ϕ) with canonical filtration A_• ≡ (A_I := ⋁_{n ∈ I} ι_n(A))_{I ∈ I(N_0)} is said to be M_•-Markovian or conditionally Markovian if there exists a Markovian filtration M_• ≡ (M_I)_{I ∈ I(N_0)} of the probability space (M, ϕ) such that A_I ⊂ M_I for all I ∈ I(N_0) (adaptedness). If the canonical filtration A_• is Markovian, then the sequence ι is also just called a Markov sequence.

Similar to the necessity of conditional independence for distributional invariance principles, it is too narrow to formulate Markovianity only with respect to the canonical filtration of a sequence of random variables. Roughly phrasing, a crucial feature of 'mixtures of Markov chains' is that their canonical filtration is not Markovian.
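In the classical situation, property (M) reduces to the familiar statement that the future is conditionally independent of the past, given the present. The following Python sketch is our own illustration (the chain is chosen ad hoc, not taken from the paper): it verifies, exactly and for a two-state stationary Markov chain, both the invariance of the initial distribution and the classical Markov property.

```python
from fractions import Fraction

# Our own classical illustration: for a stationary two-state Markov chain,
# conditioning the future on past and present agrees with conditioning
# on the present alone -- the classical shadow of property (M).

P = [[Fraction(1, 2), Fraction(1, 2)],   # transition matrix
     [Fraction(1, 4), Fraction(3, 4)]]
pi = [Fraction(1, 3), Fraction(2, 3)]    # invariant initial distribution

# Stationarity: pi P = pi (the classical analogue of psi . T = psi).
assert all(sum(pi[i] * P[i][j] for i in range(2)) == pi[j] for j in range(2))

# Exact joint law of (X_0, X_1, X_2) started in pi.
joint = {(i, j, k): pi[i] * P[i][j] * P[j][k]
         for i in range(2) for j in range(2) for k in range(2)}

for j in range(2):
    for k in range(2):
        p_present = P[j][k]              # P(X_2 = k | X_1 = j)
        for i in range(2):
            # P(X_2 = k | X_0 = i, X_1 = j)
            p_past = joint[(i, j, k)] / sum(joint[(i, j, m)] for m in range(2))
            assert p_past == p_present   # Markov property
```

Working with exact rationals makes the conditional probabilities agree identically, not merely up to rounding.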
Remark 2.5.8.
This wide notion of 'conditional Markovianity' in Definition 2.5.7 should be understood with some care, as it permits trivial statements. For example, given the probability space (M, ψ), the trivial filtration M_• ≡ (M_I = M)_{I ∈ I(N_0)} is Markovian and any sequence of random variables is adapted to it. Consequently any sequence of random variables is conditionally Markovian with respect to this trivial Markovian filtration.

2.6. Noncommutative stationary processes.
We recall some notions and conventions for unilateral noncommutative stationary processes, as they have emerged out of Kümmerer's investigations on W*-algebraic dilations [Kü85], and as they are underlying the approach in [Kö10, GK09a].
Definition 2.6.1. A (unilateral) stationary process (M, ψ, α, M_0) consists of a probability space (M, ψ), a ψ-conditioned subalgebra M_0 ⊂ M, and an endomorphism α ∈ End(M, ψ). The sequence

(ι_n)_{n ≥ 0}: (M_0, ψ_0) → (M, ψ), ι_n := α^n|_{M_0},

is called the sequence of random variables associated to (M, ψ, α, M_0).

Definition 2.6.2.
A stationary process (M, ψ, α, M_0) is said to have property A if its associated sequence of random variables (ι_n)_{n ≥ 0}: (M_0, ψ_0) → (M, ψ) has property A. Here ψ_0 denotes the restriction of ψ from M to M_0. For example, (M, ψ, α, M_0) is minimal if (ι_n)_{n ≥ 0} is minimal.

Definition 2.6.3.
The (not necessarily minimal) stationary process (M, ψ, α, A_0) is called a (unilateral noncommutative) stationary Markov process if its canonical filtration (A_I)_{I ∈ I(N_0)} is Markovian. If this process is minimal, then the endomorphism α is also called a Markov shift with generator A_0.

Definition 2.6.4.
The minimal stationary process (M, ψ, β, B_0) is called a (unilateral noncommutative) (full/ordered) Bernoulli shift with generator B_0 if M^β ⊂ B_0 and {β^n(B_0)}_{n ∈ N_0} is full/order CS-independent over the fixed point algebra M^β.

3. Markovianity from representations of the Thompson monoid F^+

The noncommutative de Finetti theorem, Theorem 1.0.3, rests on the fact that representations of the partial shift monoid S^+ on noncommutative probability spaces provide rich structures of commuting squares, in particular as they are underlying the notion of noncommutative Bernoulli shifts in Definition 2.6.4. Here we investigate commuting square structures as they emerge from representations of the Thompson monoid F^+ on noncommutative probability spaces. Our investigations reveal that certain commuting squares, as available in triangular towers of inclusions, already encode Markovianity.

Let us fix some notation, as it will be used throughout this section. We assume that the probability space (M, ψ) is equipped with the representation ρ: F^+ → End(M, ψ). For brevity of notation, especially in proofs, the represented generators of F^+ are also denoted by

α_n := ρ(g_n) ∈ End(M, ψ),

with fixed point algebras M^{α_n} := {x ∈ M | α_n(x) = x}, for 0 ≤ n < ∞. Furthermore the intersections of fixed point algebras

M_n := ⋂_{k ≥ n+1} M^{α_k}

give the tower of von Neumann subalgebras

M_0 ⊂ M_1 ⊂ M_2 ⊂ . . . ⊂ M_∞ := ⋁_{n ≥ 0} M_n ⊂ M.

From the viewpoint of (noncommutative) probability theory, this tower provides a filtration of the noncommutative probability space (M, ψ). In particular, we will see in Subsection 3.2 that the inclusions

     M_m    ⊂ M
      ∪       ∪
α_0^m(M_0) ⊂ α_0^m(M_∞)

form commuting squares which encode Markovianity.
Consequently the canonical filtration of a stationary process (M, ψ, α_0, A_0) will be seen to be adapted to a Markovian filtration whenever the ψ-conditioned von Neumann subalgebra A_0 is well-localized, to be more precise: contained in the intersection of fixed point algebras M_0. It is worthwhile to emphasize that, depending on the choice of the generator A_0, the canonical filtration of this stationary process may not be Markovian. Subsection 3.3 investigates in detail conditions under which the canonical filtration of a stationary process (M, ψ, α_0, A_0) is Markovian. Finally, Subsection 3.4 provides the proof of Theorem 1.0.4, a noncommutative de Finetti theorem as appropriate for noncommutative stationary Markov processes.

3.1. Representations with a generating property.
An immediate consequence of the relations between generators of the Thompson monoid F^+ is the adaptedness of the endomorphism α_0 to the tower of (intersected) fixed point algebras:

α_0(M_n) ⊂ M_{n+1} for all n ∈ N_0.

Thus, generalizing terminology from classical probability, the random variables

ι_0 := Id|_{M_0}:  M_0 → M_0 ⊂ M
ι_1 := α_0|_{M_0}:  M_0 → M_1 ⊂ M
ι_2 := α_0^2|_{M_0}: M_0 → M_2 ⊂ M
· · ·
ι_n := α_0^n|_{M_0}: M_0 → M_n ⊂ M

are adapted to the filtration M_0 ⊂ M_1 ⊂ M_2 ⊂ . . ., and α_0 is the time evolution of the stationary process (M, ψ, α_0, M_0). We refer the reader to [Go04, Chapter 3] for more information on the general philosophy of adapted endomorphisms and to [GK09a, Appendix A] or [EGK17] on how adaptedness is of relevance within the context of distributional symmetries and invariance principles.

Clearly, at most the von Neumann subalgebra M_∞ can be generated by this sequence of random variables (ι_n)_{n ≥ 0}. An immediate question is if a representation of the Thompson monoid F^+ restricts to the von Neumann subalgebra M_∞.

Definition 3.1.1.
The representation ρ: F^+ → End(M, ψ) is said to have the generating property if M_∞ = M.

As shown in Proposition 3.1.5 below, this generating property entails that each intersected fixed point algebra M_n = ⋂_{k > n} M^{α_k} equals the single fixed point algebra M^{α_{n+1}}. Thus the generating property tremendously simplifies the form of the tower M_0 ⊂ M_1 ⊂ . . ., and our next result shows that this can always be achieved by restriction.

Proposition 3.1.2.
The representation ρ: F^+ → End(M, ψ) restricts to the generating representation ρ_gen: F^+ → End(M_∞, ψ_∞) such that α_n(M_∞) ⊂ M_∞ and E_{M_∞} E_{M^{α_n}} = E_{M^{α_n}} E_{M_∞} for all n ∈ N_0. Here ψ_∞ denotes the restriction of the state ψ to M_∞.

Proof. We show that α_i(M_n) ⊂ M_{n+1} for all i, n ≥ 0. Let x ∈ M_n. If i ≥ n + 1 then α_i(x) = x is immediate from the definition of M_n. If i < n + 1 then, using the relations for the generators of the Thompson monoid, α_i(x) = α_i α_{k+1}(x) = α_{k+2} α_i(x) for any k ≥ n, thus α_i(x) ∈ M_{n+1}. Consequently α_i maps ⋃_{n ≥ 0} M_n into itself for any i ∈ N_0. Now a standard approximation argument shows that M_∞ is invariant under α_i for any i ∈ N_0. Consequently the representation ρ restricts to M_∞ and, of course, this restriction ρ_gen has the generating property.

Since M_∞ is globally invariant under the modular automorphism group of (M, ψ), there exists the (unique) ψ-preserving normal conditional expectation E_{M_∞} from M onto M_∞. In particular, ρ_gen(g_i) = α_i ↾ M_∞ commutes with the modular automorphism group of (M_∞, ψ_∞), which ensures ρ_gen(g_i) ∈ End(M_∞, ψ_∞). Finally, that E_{M_∞} and E_{M^{α_i}} commute is concluded from E_{M_∞} α_i E_{M_∞} = α_i E_{M_∞}, which implies E_{M^{α_i}} E_{M_∞} = E_{M_∞} E_{M^{α_i}} by routine arguments, and an application of the mean ergodic theorem,

E_{M^{α_i}} = lim_{N→∞} (1/N) Σ_{n=0}^{N−1} α_i^n,

where the limit is taken in the pointwise strong operator topology. □

Lemma 3.1.3.
With the notations as above, M_k = M^{α_{k+1}} ∩ M_∞ for all k ∈ N.

Proof. Let Q_n denote the ψ-preserving normal conditional expectation from M onto M^{α_n}. By the definition of M_k and M_∞, it is clear that M_k ⊆ M^{α_{k+1}} ∩ M_∞. In order to show the reverse inclusion, it suffices to show that

Q_n Q_k ↾ M_∞ = Q_k ↾ M_∞,   0 ≤ k < n < ∞.

We claim that, for 0 < k < n,

Q_n Q_k ↾ M_∞ = Q_k ↾ M_∞  ⟺  Q_k Q_n Q_k ↾ M_∞ = Q_k ↾ M_∞.

Indeed this equivalence is immediate from

ψ((Q_n Q_k − Q_k)(y*) (Q_n Q_k − Q_k)(x)) = ψ(y* (Q_k Q_n − Q_k)(Q_n Q_k − Q_k)(x)) = ψ(y* (Q_k − Q_k Q_n Q_k)(x))

for all x, y ∈ M_∞. We are left to prove Q_k Q_n Q_k ↾ M_∞ = Q_k ↾ M_∞ for k < n. For this purpose we express the conditional expectations Q_k and Q_n as mean ergodic limits in the pointwise strong operator topology and calculate

Q_k Q_n Q_k ↾ M_∞ = lim_{M→∞} lim_{N→∞} (1/MN) Σ_{i=1}^{M} Σ_{j=1}^{N} α_k^i α_n^j Q_k ↾ M_∞
 = lim_{M→∞} lim_{N→∞} (1/MN) Σ_{i=1}^{M} Σ_{j=1}^{N} α_{n+i}^j α_k^i Q_k ↾ M_∞
 = lim_{M→∞} lim_{N→∞} (1/MN) Σ_{i=1}^{M} Σ_{j=1}^{N} α_{n+i}^j Q_k ↾ M_∞
 = lim_{M→∞} (1/M) Σ_{i=1}^{M} Q_{n+i} Q_k ↾ M_∞ = Q_k ↾ M_∞.

Here the last equality follows because for x ∈ ∪_{n≥0} M^{α_n} (a weak*-total subset of M_∞), also Q_k(x) ∈ M_∞ and so it holds that Q_{n+i} Q_k(x) = Q_k(x) for i sufficiently large, thus

lim_{M→∞} (1/M) Σ_{i=1}^{M} Q_{n+i} = Id

in the pointwise strong operator topology. □

Corollary 3.1.4.
The following set of inclusions forms a commuting square for every n ∈ N:

M^{α_{n+1}} ⊂ M
     ∪        ∪
    M_n   ⊂  M_∞

Proof. Let Q_{n+1} and E_{M_∞} be the ψ-preserving normal conditional expectations from M onto M^{α_{n+1}} and M_∞, respectively. For n ∈ N, by Proposition 3.1.2, Q_{n+1} E_{M_∞} = E_{M_∞} Q_{n+1}, and by Lemma 3.1.3, M_n = M^{α_{n+1}} ∩ M_∞. By (iv) of Proposition 2.5.1, we get a commuting square. □

Proposition 3.1.5.
If the representation ρ : F^+ → End(M, ψ) has the generating property, then the following equality holds for all n ∈ N: M_n = M^{α_{n+1}}. In other words, one has the tower of fixed point algebras

M^{ρ(F^+)} ⊂ M^{ρ(g_1)} ⊂ M^{ρ(g_2)} ⊂ M^{ρ(g_3)} ⊂ . . . ⊂ M = ∨_{n≥1} M^{ρ(g_n)}.

Proof.
If the representation ρ is generating, then M_∞ = M. Hence M_n = M^{α_{n+1}} for all n ∈ N as a consequence of Lemma 3.1.3. □

Remark 3.1.6.
Suppose that the representation ρ : F^+ → End(M, ψ) satisfies the additional relations ρ(g_n)ρ(g_n) = ρ(g_{n+1})ρ(g_n) for all n ∈ N, as is the case for representations of the partial shift monoid S^+. Then the inclusions M^{ρ(g_n)} ⊂ M^{ρ(g_{n+1})}, and consequently M_n = M^{ρ(g_{n+1})}, are immediate without stipulating the generating property of the representation, since x = ρ(g_n)(x) implies

x = ρ(g_n)(x) = ρ(g_{n+1})ρ(g_n)(x) = ρ(g_{n+1})(x)

for all x ∈ M and n ∈ N.

3.2. Commuting squares and Markovianity from shifted fixed point algebras.
The following intertwining properties will be crucial for obtaining Markov filtrations from representations of the Thompson monoid F^+.

Proposition 3.2.1.
Suppose ρ : F^+ → End(M, ψ) is a (not necessarily generating) representation of F^+. Then, with α_n = ρ(g_n), the following equality holds:

α_k Q_n = Q_{n+1} α_k   for all 0 ≤ k < n < ∞.

Here Q_n denotes the ψ-preserving normal conditional expectation from M onto the fixed point algebra M^{α_n} of the represented generator α_n ∈ End(M, ψ).

Proof.
An application of the mean ergodic theorem and the relations between the generators of the Thompson monoid F^+ yield that, for k < n,

α_k Q_n = lim_{N→∞} (1/N) Σ_{i=0}^{N−1} α_k α_n^i = lim_{N→∞} (1/N) Σ_{i=0}^{N−1} α_{n+1}^i α_k = Q_{n+1} α_k.

Here the limits are taken in the pointwise strong operator topology. □
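For later use, note that this intertwining relation iterates to powers of the represented generators; the following short sketch records the form in which it is applied in the proofs of Theorem 3.2.2 and Theorem 3.3.8:

```latex
% Iterating Proposition 3.2.1: for 0 \le k < n and j \ge 1,
\alpha_k^{\,j}\, Q_n
  \;=\; \alpha_k^{\,j-1}\,(\alpha_k Q_n)
  \;=\; \alpha_k^{\,j-1}\, Q_{n+1}\,\alpha_k
  \;=\; \cdots
  \;=\; Q_{n+j}\,\alpha_k^{\,j},
% since each application of \alpha_k raises the index of Q by one.
% In particular, \alpha_0^{\,k}\, Q_{n+1} = Q_{n+1+k}\,\alpha_0^{\,k}.
```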
Theorem 3.2.2.
Suppose ρ : F^+ → End(M, ψ) is a generating representation with α_k := ρ(g_k) for all k ∈ N. Then each cell in the following triangular tower is a commuting square:

M_0 ⊂ M_1 ⊂ M_2 ⊂ M_3 ⊂ M_4 ⊂ · · · ⊂ M_∞ = M
 ∪     ∪     ∪     ∪     ∪              ∪
α_0(M_0) ⊂ α_0(M_1) ⊂ α_0(M_2) ⊂ α_0(M_3) ⊂ · · · ⊂ α_0(M_∞)
    ∪          ∪          ∪                     ∪
α_0^2(M_0) ⊂ α_0^2(M_1) ⊂ α_0^2(M_2) ⊂ · · · ⊂ α_0^2(M_∞)
    ∪           ∪           ∪
   ...         ...         ...

In particular, M_{n+1} ∩ α_0(M_{n+1}) = α_0(M_n) for all n ≥ 0.

Proof. Let 0 ≤ m < n < ∞ and k ≥ 1. We verify first all inclusions as they appear in the diagram

M_{m+k} ⊂ M_{n+k}
    ∪         ∪
α_0^k(M_m) ⊂ α_0^k(M_n).        (3.2.1)

Indeed, the definition of M_n ensures the claimed horizontal inclusions in this diagram. The vertical inclusions in the diagram follow from the intertwining properties α_0^k Q_{n+1} = Q_{n+1+k} α_0^k (see Proposition 3.2.1). For n = ∞, all inclusions are easily concluded by routine approximation arguments.

We show next that the above diagram is a commuting square. Indeed, as ρ is generating, M_n = M^{α_{n+1}} for all n ∈ N. Hence for any x ∈ M_n,

E_{M_{m+k}} α_0^k(x) = Q_{m+k+1} α_0^k(x) = α_0^k Q_{m+1}(x) = α_0^k E_{M_m}(x).

This ensures that E_{M_{m+k}}(α_0^k(M_n)) = α_0^k(M_m). Thus the above inclusions form a commuting square by Proposition 2.5.1 and, in particular, it holds that α_0^k(M_m) = M_{m+k} ∩ α_0^k(M_n).

Finally, the commuting square properties of more general cells in the triangular tower of inclusions are deduced from those in (3.2.1), since commuting square properties are preserved when acting with the endomorphism α_0 on all four corners of the diagram. □

Corollary 3.2.3.
Suppose ρ : F^+ → End(M, ψ) is a generating representation with α_k := ρ(g_k) for all k ∈ N. Let 0 ≤ m ≤ n < ∞ be fixed. Then each cell in the following triangular tower is a commuting square:

M_m ⊂ M_{n+1} ⊂ M_{n+2} ⊂ M_{n+3} ⊂ M_{n+4} ⊂ · · · ⊂ M_∞ = M
 ∪       ∪         ∪         ∪                       ∪
α_m(M_m) ⊂ α_m(M_{n+1}) ⊂ α_m(M_{n+2}) ⊂ α_m(M_{n+3}) ⊂ · · · ⊂ α_m(M_∞)
    ∪           ∪             ∪                          ∪
α_m^2(M_m) ⊂ α_m^2(M_{n+1}) ⊂ α_m^2(M_{n+2}) ⊂ · · · ⊂ α_m^2(M_∞)
    ∪            ∪
   ...          ...          ...

In particular, M_{n+1} ∩ α_m(M_{n+1}) = α_m(M_m) and M_{n+k+1} ∩ α_m(M_{n+k+1}) = α_m(M_{n+k}) for all k ≥ 1.

Proof.
Consider the representation ρ_{m,n} := ρ ∘ sh_{m,n} : F^+ → End(M, ψ), where sh_{m,n} denotes the (m,n)-partial shift as introduced in Definition 2.1.1. We observe that ρ_{m,n}(g_0) = ρ(g_m) and ρ_{m,n}(g_k) = ρ(g_{n+k}) for all k ≥ 1. In particular this ensures that ρ_{m,n} inherits the generating property from the representation ρ. Thus Theorem 3.2.2 applies to ρ_{m,n} and all claimed properties are immediate since M_m = M^{ρ(g_{m+1})} = M^{ρ_{m,n}(g_0)} and M_{m+k} = M^{ρ(g_{m+k+1})} = M^{ρ_{m,n}(g_k)} for k ≥ 1. □

The triangular tower of α_0-shifted fixed point algebras (as given in Theorem 3.2.2) can also be addressed through a filtration indexed by 'intervals'. This reveals that Markovianity (as introduced in Definition 2.5.6) corresponds to specific commuting squares in the triangular tower.

Corollary 3.2.4.
Suppose ρ : F^+ → End(M, ψ) is a generating representation. The family of von Neumann subalgebras M^ρ_• ≡ {M^ρ_I}_{I∈I(N)} of (M, ψ), with

M^ρ_{[0,n]} := M_n,   M^ρ_{[m,m+n]} := ρ(g_0)^m(M_n),   M^ρ_{[m,∞)} := ρ(g_0)^m(M_∞),

defines a Markovian filtration.

Proof. First we check the isotony property to verify that this family of subalgebras forms a filtration. Suppose [m, m+n] ⊂ [k, k+ℓ]; we will show that M^ρ_{[m,m+n]} ⊂ M^ρ_{[k,k+ℓ]}, that is, α_0^m(M_n) ⊂ α_0^k(M_ℓ). As [m, m+n] ⊂ [k, k+ℓ], we must have m ≥ k and n ≤ ℓ. Hence for x ∈ M_n, we can write α_0^m(x) = α_0^k α_0^{m−k}(x), so it suffices to show that α_0^{m−k}(x) ∈ M_ℓ. Let p ≥ ℓ+1 ≥ (m−k)+n+1; then α_p α_0^{m−k}(x) = α_0^{m−k} α_{p−(m−k)}(x) = α_0^{m−k}(x), as p−(m−k) ≥ n+1.

Let P^ρ_I denote the ψ-preserving normal conditional expectation from M onto M^ρ_I. This filtration is Markovian if P^ρ_{[0,m]} P^ρ_{[m,n]} = P^ρ_{[m,m]} for 0 ≤ m ≤ n, which is implied by the definition of M^ρ_I and the following cell of inclusions, which is a commuting square as a consequence of Theorem 3.2.2:

   M_m  ⊂  M
    ∪       ∪
α_0^m(M_0) ⊂ α_0^m(M_∞).   □

Corollary 3.2.5.
Suppose ρ : F^+ → End(M, ψ) is a generating representation and consider the (m,n)-shifted representation ρ_{m,n} := ρ ∘ sh_{m,n} for some fixed 0 ≤ m ≤ n < ∞. Then the family of von Neumann subalgebras M^{ρ_{m,n}}_• ≡ {M^{ρ_{m,n}}_I}_{I∈I(N)} of (M, ψ), with

M^{ρ_{m,n}}_{[0,ℓ]} := M_{m+ℓ},   M^{ρ_{m,n}}_{[k,k+ℓ]} := ρ(g_m)^k(M_{n+ℓ}),   M^{ρ_{m,n}}_{[k,∞)} := ρ(g_m)^k(M_∞),

defines a Markovian filtration.

Proof. The case m = n = 0 corresponds to Corollary 3.2.4. Its proof directly transfers to the general case 0 ≤ m ≤ n, after relabeling the involved objects and morphisms according to the (m,n)-shifted representation. □

3.3. Commuting squares and Markovianity from (noncommutative) stationary processes.
Given the representation ρ : F^+ → End(M, ψ), with represented generators α_n := ρ(g_n) for n ∈ N, and intersected fixed point algebras

M_n := ∩_{k≥n+1} M^{α_k},

let A ⊂ M_0 be a von Neumann subalgebra of (M, ψ). Then (M, ψ, α_0, A) is a (unilateral noncommutative) stationary process with generating algebra A. Its canonical filtration is denoted by A_• ≡ (A_I)_{I∈I(N)}, where

A_I := ∨_{i∈I} α_0^i(A),

and an 'interval' I ∈ I(N) is written as [m,n] := {i ∈ N | m ≤ i ≤ n} or [m,∞) := {i ∈ N | m ≤ i}. Furthermore P_I will denote the ψ-preserving normal conditional expectation from M onto A_I. Note that the endomorphism α_0 acts covariantly on the filtration, i.e. α_0(A_I) = A_{I+1} for all I ∈ I(N), where I+1 := {i+1 | i ∈ I}.

We record a simple, but important, observation obtained from the relations of F^+ on stationary processes, to which we will frequently appeal. Recall that M^ρ_• denotes the Markov filtration from Corollary 3.2.4.

Proposition 3.3.1.
Let (M, ψ, α_0, A) be the (unilateral noncommutative) stationary process with A ⊂ M_0 as above. Then it holds that A_{[0,n]} ⊂ M_n for all n ∈ N. If ρ is generating, then the canonical filtration A_• is M^ρ_•-Markovian.

Proof. As A ⊂ M_0, it holds that α_n(x) = x for any x ∈ A and n ≥ 1. Thus, using the defining relations of F^+, we get for 0 ≤ k ≤ n and n+1 ≤ l,

α_l α_0^k(x) = α_0^k α_{l−k}(x) = α_0^k(x).

Hence A_{[0,n]} ⊂ M_n = M^ρ_{[0,n]} for all n ∈ N. As the endomorphism α_0 acts covariantly, we conclude that A_{[m,n]} ⊂ α_0^m(M_{n−m}) = M^ρ_{[m,n]} for 0 ≤ m ≤ n. Furthermore it holds that

A_{[m,∞)} = ∨_{n≥m} A_{[m,n]} ⊂ ∨_{n≥m} M^ρ_{[m,n]} = M^ρ_{[m,∞)}.

This establishes that the canonical filtration A_• is adapted to the filtration M^ρ_•, which is Markovian as ρ is generating, as considered in Corollary 3.2.4. □

We next observe that the generating property of the representation ρ can be concluded from the minimality of a stationary process.

Proposition 3.3.2.
Suppose the representation ρ : F^+ → End(M, ψ) and A ⊂ M_0 are given. If the stationary process (M, ψ, α_0, A) is minimal, then ρ is generating.

Proof. For the stationary process (M, ψ, α_0, A), recall that A_{[0,∞)} = ∨_{i∈N} α_0^i(A) and that minimality means A_{[0,∞)} = M. By Proposition 3.3.1, A_{[0,n]} ⊂ M_n for all n ∈ N. Thus

M = ∨_{n≥0} A_{[0,n]} ⊂ ∨_{n≥0} M_n = M_∞.

We conclude from this that the representation ρ has the generating property, i.e. M_∞ = M. □

In the following results, it is not assumed that the stationary process is minimal or that the representation ρ is generating, unless explicitly mentioned.

Theorem 3.3.3.
Suppose ρ : F^+ → End(M, ψ) is a representation. Let α_n := ρ(g_n) as before, and let A ⊂ M_0 and A_{[0,∞)} := ∨_{n∈N} α_0^n(A) be von Neumann subalgebras of (M, ψ) such that the inclusions

M^{α_1} ⊂ M
   ∪       ∪
   A  ⊂ A_{[0,∞)}

form a commuting square. Then the family of von Neumann subalgebras A_• ≡ {A_I}_{I∈I(N)}, with A_I := ∨_{i∈I} α_0^i(A), is a Markovian filtration and (M, ψ, α_0, A) is a stationary Markov process.

Proof. Note that the commuting square condition implies Q_1 P_{[0,∞)} = P_{[0,0]}. From Proposition 3.3.1,

A_{[0,n]} ⊂ M_n ⊂ M^{α_{n+1}}   for all n ∈ N.

Hence we get

P_{[0,n]} α_0^n P_{[0,∞)}
 = P_{[0,n]} Q_{n+1} α_0^n P_{[0,∞)}     (since A_{[0,n]} ⊂ M^{α_{n+1}})
 = P_{[0,n]} α_0^n Q_1 P_{[0,∞)}         (by the intertwining property)
 = P_{[0,n]} α_0^n P_{[0,0]} P_{[0,∞)}   (by the commuting square condition)
 = α_0^n P_{[0,0]} P_{[0,∞)}             (as A_{[n,n]} ⊂ A_{[0,n]})
 = P_{[n,n]} α_0^n P_{[0,0]} P_{[0,∞)}   (since A_{[n,n]} = α_0^n(A))
 = P_{[n,n]} α_0^n Q_1 P_{[0,∞)}         (by the commuting square condition)
 = P_{[n,n]} Q_{n+1} α_0^n P_{[0,∞)}     (by the intertwining property)
 = P_{[n,n]} α_0^n P_{[0,∞)}             (since A_{[n,n]} ⊂ M^{α_{n+1}}).

Altogether we have shown that P_{[0,n]} P_{[n,∞)} = P_{[n,n]}, which is the required Markovianity for the filtration (A_I)_{I∈I(N)}. □

Corollary 3.3.4.
Suppose ρ : F^+ → End(M, ψ) is a representation with α_0 = ρ(g_0). Then the quadruple (M, ψ, α_0, M_0) is a stationary Markov process.

Proof. We know from Corollary 3.1.4 that M_0 ⊆ M^{α_1} and that the following is a commuting square:

M^{α_1} ⊂ M
   ∪       ∪
  M_0  ⊂  M_∞.

As M_{[0,n]} ⊂ M_n for all n ∈ N, it is easily verified that M_{[0,∞)} ⊂ M_∞. Let P_0 := P_{[0,0]} be the ψ-preserving conditional expectation from M onto M_0. Then from the commuting square above, we have E_{M_∞} Q_1 = P_0. This in turn gives P_{[0,∞)} Q_1 = P_{[0,∞)} E_{M_∞} Q_1 = P_{[0,∞)} P_0 = P_0. Hence we get that M_0 is a von Neumann subalgebra of M_{[0,∞)} such that

M^{α_1} ⊂ M
   ∪       ∪
  M_0 ⊂ M_{[0,∞)}

forms a commuting square. By Theorem 3.3.3, (M, ψ, α_0, M_0) is a stationary Markov process. □

Corollary 3.3.5.
Suppose ρ : F^+ → End(M, ψ) is a representation with α_m = ρ(g_m) for m ∈ N. Then the quadruple (M, ψ, α_m, M_n) is a stationary Markov process for any 0 ≤ m ≤ n < ∞.

Proof. Consider the representation ρ_{m,n} := ρ ∘ sh_{m,n} : F^+ → End(M, ψ), where sh_{m,n} denotes the (m,n)-partial shift as introduced in Definition 2.1.1. We observe that ρ_{m,n}(g_0) = ρ(g_m) and ρ_{m,n}(g_k) = ρ(g_{n+k}) for all k ≥ 1. In particular we get

∩_{k≥1} M^{ρ_{m,n}(g_k)} = ∩_{k≥1} M^{ρ(g_{k+n})} = ∩_{k≥n+1} M^{ρ(g_k)} = M_n.

Thus Corollary 3.3.4 applies to the (m,n)-shifted representation ρ_{m,n} and its application completes the proof. □

Corollary 3.3.6.
Suppose ρ : F^+ → End(M, ψ) is a generating representation. Then the quadruple (M, ψ, α_m, M^{α_{n+1}}) is a stationary Markov process for any 0 ≤ m ≤ n < ∞.

Proof. If the representation ρ is generating, then M^{α_{n+1}} = M_n. Hence the result follows by Corollary 3.3.5. □

Remark 3.3.7.
The commuting square assumption in Theorem 3.3.3 may not be satisfied for a noncommutative stationary process (M, ψ, α_0, A) if one only demands that the generator A is a ψ-conditioned von Neumann subalgebra of the fixed point algebra M^{α_1}. Consequently the canonical filtration of a noncommutative stationary process may not be Markovian, but it is always adapted to the Markovian filtration which is given by the canonical filtration of the noncommutative stationary Markov process (M, ψ, α_0, M^{α_1}).

Theorem 3.3.8.
Let the probability space (M, ψ) be equipped with the representation ρ : F^+ → End(M, ψ) and the filtration A_• ≡ (A_I)_{I∈I(N)}, where A_I := ∨_{i∈I} ρ(g_0)^i(A) for some von Neumann subalgebra A of M_0. Further suppose the inclusions

M^{ρ(g_{m+1})} ⊂ M
      ∪          ∪
  A_{[0,m]} ⊂ A_{[0,∞)}

form a commuting square for all m ≥ 0. Then each cell in the following triangular tower of inclusions is a commuting square:

A_{[0,0]} ⊂ A_{[0,1]} ⊂ A_{[0,2]} ⊂ A_{[0,3]} ⊂ A_{[0,4]} ⊂ · · · ⊂ A_{[0,∞)}
     ∪          ∪          ∪          ∪                      ∪
  A_{[1,1]} ⊂ A_{[1,2]} ⊂ A_{[1,3]} ⊂ A_{[1,4]} ⊂ · · · ⊂ A_{[1,∞)}
       ∪          ∪          ∪                      ∪
     A_{[2,2]} ⊂ A_{[2,3]} ⊂ A_{[2,4]} ⊂ · · · ⊂ A_{[2,∞)}
          ∪          ∪
         ...        ...        ...

In particular, (A_I)_{I∈I(N)} is a Markov filtration.

Proof. All claimed inclusions in the triangular tower are clear from the definition of A_{[m,n]}. We recall from Proposition 3.3.1 that α_0^k(A) ⊆ M^{α_{n+1}} for 0 ≤ k ≤ n. Hence A_{[m,n]} ⊂ M^{α_{n+1}} for all 0 ≤ m ≤ n. Next we show that, for 0 ≤ k and 1 ≤ m, the cell of inclusions

α_0^k(A_{[0,m]}) ⊂ α_0^k(A_{[0,m+1]})
        ∪                 ∪
α_0^{k+1}(A_{[0,m−1]}) ⊂ α_0^{k+1}(A_{[0,m]})

forms a commuting square. So, as P_I denotes the normal ψ-preserving conditional expectation from M onto A_I, we need to show P_{[k,m+k]} P_{[k+1,m+k+1]} = P_{[k+1,m+k]} or, equivalently,

P_{[k,m+k]} α_0^{k+1} P_{[0,m]} = α_0^{k+1} P_{[0,m−1]}.

Indeed, we calculate

P_{[k,m+k]} α_0^{k+1} P_{[0,m]} = P_{[k,m+k]} Q_{m+k+1} α_0^{k+1} P_{[0,m]}
 = P_{[k,m+k]} α_0^{k+1} Q_m P_{[0,m]}
 = P_{[k,m+k]} α_0^{k+1} Q_m P_{[0,∞)} P_{[0,m]}
 = P_{[k,m+k]} α_0^{k+1} P_{[0,m−1]} P_{[0,m]}
 = P_{[k,m+k]} α_0^{k+1} P_{[0,m−1]}
 = α_0^{k+1} P_{[0,m−1]}.

Here we have used that P_{[k,m+k]} = P_{[k,m+k]} Q_{m+k+1}, the intertwining properties of α_0 and the commuting square assumption Q_m P_{[0,∞)} = P_{[0,m−1]}.

Since α_0^k(A_{[m,n]}) = A_{[m+k,n+k]} is evident from the definition of the filtration, we have verified that each cell of inclusions in the triangular tower forms a commuting square.
□

More generally, we may consider a probability space which is equipped both with a filtration and a representation of the Thompson monoid, and formulate compatibility conditions between the filtration and the representation such that one obtains rich commuting square structures.
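For orientation, the Markov property of a filtration established above can be checked concretely in the simplest commutative situation. The following self-contained sketch (our own finite toy example, not taken from the paper) verifies, by exact path enumeration for a two-state Markov chain with transition matrix P, that the conditional expectation E[f(X_{m+k}) | X_0, . . . , X_m] depends on the present state X_m only and is given by the k-th power of the transition operator — the classical counterpart of the identity P_{[0,m]} P_{[m,∞)} = P_{[m,m]}:

```python
# Classical toy check of the Markov property of a filtration:
# E[f(X_{m+k}) | X_0, ..., X_m] = (T^k f)(X_m), independently of the past.
import itertools

P = [[0.9, 0.1],
     [0.4, 0.6]]          # transition matrix (the transition operator T)
pi = [0.8, 0.2]           # initial distribution, positive everywhere
f = [1.0, -2.0]           # observable evaluated at time m + k
m, k = 2, 3

def T_apply(vec):
    """One application of T: (T f)(s) = sum_t P[s][t] f(t)."""
    return [sum(P[s][t] * vec[t] for t in range(2)) for s in range(2)]

Tk_f = f
for _ in range(k):        # T^k f by repeated application
    Tk_f = T_apply(Tk_f)

# E[f(X_{m+k}) | X_0 = s_0, ..., X_m = s_m] by exact path enumeration.
for prefix in itertools.product(range(2), repeat=m + 1):
    num = den = 0.0
    for suffix in itertools.product(range(2), repeat=k):
        path = prefix + suffix
        prob = pi[path[0]]
        for a, b in zip(path, path[1:]):
            prob *= P[a][b]
        num += prob * f[path[-1]]
        den += prob
    # Markov property: the result depends only on the present state s_m.
    assert abs(num / den - Tk_f[prefix[-1]]) < 1e-12, prefix

print("E[f(X_{m+k}) | X_0..X_m] == (T^k f)(X_m) for every past")
```

The point of the commuting square formulation in the noncommutative setting is precisely that this cancellation of the past survives when conditional expectations onto the subalgebras A_I replace the classical E[ · | F_I].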
Corollary 3.3.9.
Suppose the probability space (M, ψ) is equipped with a filtration (N_I)_{I∈I(N)} and a representation ρ : F^+ → End(M, ψ) such that

(i) ρ(g_0)(N_I) = N_{I+1} for all I ∈ I(N) (compatibility),
(ii) N_{[0,m]} ⊂ M^{ρ(g_{m+1})} for all m ∈ N (adaptedness),
(iii) the inclusions

M^{ρ(g_{m+1})} ⊂ M
      ∪          ∪
  N_{[0,m]} ⊂ N_{[0,∞)}

form a commuting square for all m ∈ N.

Then each cell in the following triangular tower of inclusions is a commuting square:

N_{[0,0]} ⊂ N_{[0,1]} ⊂ N_{[0,2]} ⊂ N_{[0,3]} ⊂ · · · ⊂ N_{[0,∞)}
     ∪          ∪          ∪                     ∪
ρ(g_0)(N_{[0,0]}) ⊂ ρ(g_0)(N_{[0,1]}) ⊂ ρ(g_0)(N_{[0,2]}) ⊂ · · · ⊂ ρ(g_0)(N_{[0,∞)})
        ∪                 ∪                          ∪
ρ(g_0)^2(N_{[0,0]}) ⊂ ρ(g_0)^2(N_{[0,1]}) ⊂ · · · ⊂ ρ(g_0)^2(N_{[0,∞)})
        ∪
       ...               ...

In particular, (N_I)_{I∈I(N)} is a Markov filtration.

Proof. Let P_I be the normal ψ-preserving conditional expectation onto N_I. Let α_n = ρ(g_n) and let Q_n be the normal ψ-preserving conditional expectation onto M^{α_n}, as before. We observe that N_0 = N_{[0,0]} ⊂ M^{α_1} by the given adaptedness. Adaptedness also gives us N_{[m,n]} ⊂ N_{[0,n]} ⊂ M^{α_{n+1}} for 0 ≤ m ≤ n. Thus P_{[k,m+k]} = P_{[k,m+k]} Q_{m+k+1} as before. The rest of the proof follows just as in Theorem 3.3.8. □

3.4. A noncommutative version of the de Finetti theorem.
Most results of the previous two subsections can be reformulated in terms of sequences of random variables associated to stationary processes (see Definition 2.6.1).
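Before passing to sequences of random variables, it may be instructive to see the defining relations of F^+ at work in a toy model (our own illustration, anticipating the tensor product constructions of Subsection 4.1): if each generator g_n acts on a list of formal tensor slots by splitting one slot into two, the relations g_k g_ℓ = g_{ℓ+1} g_k for k < ℓ hold on the nose. The symbols C1, C2, D1, D2 below are purely formal "legs" of hypothetical maps C and D:

```python
# Toy symbolic model of the Thompson monoid relations g_k g_l = g_{l+1} g_k
# (k < l): an elementary tensor a (x) x0 (x) x1 (x) ... is a list of formal
# strings; alpha_0 splits the A-slot via C, and alpha_n (n >= 1) splits the
# C-factor x_{n-1} via D. This is an index-level sanity check only.

def alpha(n, tensor):
    """Apply alpha_n to a tensor given as the list [a, x0, x1, ...]."""
    t = list(tensor)
    if n == 0:
        # alpha_0: replace a by the two legs of C(a)
        return [f"C1({t[0]})", f"C2({t[0]})"] + t[1:]
    # alpha_n (n >= 1): replace x_{n-1} (list slot n) by the two legs of D
    return t[:n] + [f"D1({t[n]})", f"D2({t[n]})"] + t[n + 1:]

def compose(k, l, tensor):
    """(alpha_k alpha_l)(tensor): apply alpha_l first, then alpha_k."""
    return alpha(k, alpha(l, tensor))

tensor = ["a", "x0", "x1", "x2", "x3", "x4"]

for k in range(4):
    for l in range(k + 1, 5):
        assert compose(k, l, tensor) == compose(l + 1, k, tensor), (k, l)

print("alpha_k alpha_l == alpha_{l+1} alpha_k verified for all 0 <= k < l <= 4")
```

Note that for k = ℓ the two sides differ in this model, matching the fact that the additional relations g_n g_n = g_{n+1} g_n single out the partial shift monoid S^+ (Remark 3.1.6).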
Proposition 3.4.1.
Given the representation ρ : F^+ → End(M, ψ), let A be some fixed ψ-conditioned von Neumann subalgebra of M_0 = ∩_{k>0} M^{ρ(g_k)} and let ϕ := ψ|_A. Then the sequence of random variables

(ι_n)_{n≥0} : (A, ϕ) → (M, ψ),   ι_n := ρ(g_0)^n |_A

(associated to the stationary process (M, ψ, ρ(g_0), A)) is partially spreadable. Furthermore this stationary process and its associated sequence of random variables have the same canonical filtration

A_• ≡ (A_I := ∨_{i∈I} ρ(g_0)^i(A))_{I∈I(N)},

which is adapted to the Markovian filtration

M_• ≡ (M_I := ∨_{i∈I} ρ(g_0)^i(M_0))_{I∈I(N)}.

Proof.
This is immediate from Definition 1.0.2, where we introduced partial spreadability as a distributional symmetry. Clearly the canonical filtration of the stationary process and that of its associated sequence of random variables coincide. The inclusion A ⊂ M_0 ensures that A_• is adapted to M_•. The Markovianity of M_• is inferred from Corollary 3.3.4. □

As already mentioned in Remark 3.3.7, the canonical filtration A_• of a stationary process may not necessarily be Markovian, but there always exists a Markovian filtration M_• to which the canonical filtration is adapted. Note also that Markovianity of the canonical filtration A_• may not imply that it equals the fixed point filtration M_•.

We are ready for the proof of a noncommutative version of de Finetti's theorem, as formulated in Theorem 1.0.4, and repeat its formulation for the convenience of the reader.

Theorem.
Let ι ≡ (ι_n)_{n≥0} : (A, ϕ) → (M, ψ) be a sequence of (identically distributed) random variables and consider the following conditions:

(a) ι is partially spreadable;
(b) ι is stationary and conditionally Markovian;
(c) ι is identically distributed and conditionally Markovian.

Then one has the following implications: (a) ⟹ (b) ⟹ (c).

Proof (of Theorem 1.0.4). (a) ⟹ (b): The stationarity of ι follows from ψ ∘ ρ(g_0) = ψ. Proposition 3.4.1 ensures that the canonical filtration A_• is adapted to the Markov filtration M_•. Thus the sequence ι is M_•-Markovian, or conditionally Markovian, according to Definition 2.5.7.

(b) ⟹ (c): Stationary sequences are identically distributed. □

4. Constructions of representations of the Thompson monoid F^+

This section is about how to construct representations of the Thompson monoid F^+ as they naturally arise in noncommutative probability theory. It will be seen that such constructions are intimately related with the construction of stationary Markov processes. In particular, this will establish that a large class of stationary Markov sequences is partially spreadable.

4.1. Tensor product constructions.
Let (A, ϕ) and (C, χ) be probability spaces. Taking the infinite von Neumann algebraic tensor product with respect to an infinite tensor product state,

(M, ψ) := (A ⊗ C^{⊗N}, ϕ ⊗ χ^{⊗N})

is a probability space which can be equipped with a representation of the semigroup of partial shifts S^+ and of the Thompson monoid F^+. For n ∈ N, let β_n denote the partial shift which acts on the weak*-total set of finite elementary tensors in M as

β_n(a ⊗ x_0 ⊗ · · · ⊗ x_{n−1} ⊗ x_n ⊗ x_{n+1} ⊗ · · ·) := a ⊗ x_0 ⊗ · · · ⊗ x_{n−1} ⊗ 1_C ⊗ x_n ⊗ x_{n+1} ⊗ · · · .

Proposition 4.1.1.
The maps h_n ↦ β_n =: ̺(h_n), with n ∈ N, extend multiplicatively to a representation ̺ : S^+ → End(M, ψ) which has the generating property.

Proof. Each β_n extends to a unital injective *-homomorphism on M, denoted by the same symbol, such that ψ ∘ β_n = ψ. As the modular automorphism group of (M, ψ) equals the von Neumann algebraic tensor product of the modular automorphism groups of its tensor factors, i.e. σ_t^ψ = σ_t^ϕ ⊗ (σ_t^χ)^{⊗N}, it is easily verified that β_n σ_t^ψ = σ_t^ψ β_n for all t ∈ R. Thus β_n ∈ End(M, ψ) for all n ∈ N. For 0 ≤ k ≤ ℓ < ∞, the relations β_k β_ℓ = β_{ℓ+1} β_k can be directly checked on elementary tensors. Consequently the endomorphisms β_0, β_1, . . . satisfy the relations of the monoid generators h_0, h_1, . . . ∈ S^+. Finally, the generating property of the representation ̺ is inferred from A ⊗ C^{⊗n} ⊗ 1_C^{⊗N} ⊂ M^{β_n}, which ensures that the unital *-algebra ∪_{n∈N} M^{β_n} is weak*-dense in M. □

Let ǫ : F^+ → S^+ be the monoid epimorphism with ǫ(g_n) = h_n for all n ∈ N. Then

F^+ ∋ g ↦ ̺ ∘ ǫ(g) ∈ End(M, ψ)

defines a representation of the Thompson monoid F^+ which also has the generating property. More general representations of F^+ can be constructed as follows.

Given the two random variables C : (A, ϕ) → (A ⊗ C, ϕ ⊗ χ) and D : (C, χ) → (C ⊗ C, χ ⊗ χ), let α_n denote the C-linear extension of the map defined on a weak*-total subset of M by

α_n(a ⊗ x_0 ⊗ x_1 ⊗ · · ·) :=
   C(a) ⊗ x_0 ⊗ x_1 ⊗ · · ·                     if n = 0,
   a ⊗ D(x_0) ⊗ x_1 ⊗ · · ·                     if n = 1,
   a ⊗ x_0 ⊗ · · · ⊗ D(x_{n−1}) ⊗ x_n ⊗ · · ·   if n > 1.        (4.1.1)

Proposition 4.1.2.
The maps g_n ↦ α_n =: ρ(g_n), with n ∈ N, extend multiplicatively to a representation ρ : F^+ → End(M, ψ) which has the generating property.

Proof. For 0 ≤ k < ℓ < ∞, the relations α_k α_ℓ = α_{ℓ+1} α_k are verified in a straightforward computation on finite elementary tensors of the form x = a ⊗ x_0 ⊗ · · · ⊗ x_n ⊗ 1_C^{⊗N}. □

The representation ρ of F^+ may be considered as a perturbation of the representation ̺ of S^+ by locally acting operators C and D on the infinite tensor product factors of M. To be more precise, the choice D(x) = 1_C ⊗ x yields

α_n = (β_{n−1} β_n^*) β_n (= β_{n−1})   for n ≥ 1,      α_0 = (α_0 β_0^*) β_0.

Theorem 4.1.3.
Let ̺ : S^+ → End(M, ψ) be the representation as introduced in Proposition 4.1.1. Then (M, ψ, β_0, M^{β_1}) is a (noncommutative) Bernoulli shift with generator M^{β_1} = A ⊗ C ⊗ 1_C^{⊗N}.

Proof. Let B_I := ∨_{i∈I} β_0^i(M^{β_1}) for I ∈ I(N) and note that B_{[0,0]} = M^{β_1}. It is straightforward to check that M^{β_1} = A ⊗ C ⊗ 1_C^{⊗N} and, more generally,

B_{[m,n]} = A ⊗ 1_C^{⊗m} ⊗ C^{⊗(n−m+1)} ⊗ 1_C^{⊗N}   for 0 ≤ m ≤ n.

Since B_{[0,∞)} = M, the stationary process (M, ψ, β_0, M^{β_1}) is minimal. We are left to show that this minimal stationary process is actually a noncommutative Bernoulli shift (in the sense of Definition 2.6.4). Clearly, M^{β_0} ⊂ M^{β_1}, as M^{β_0} = A ⊗ 1_C^{⊗N}. We are left to verify the factorization Q(xy) = Q(x)Q(y) for any x ∈ B_I, y ∈ B_J whenever I ∩ J = ∅. Here Q is the ψ-preserving normal conditional expectation from M onto M^{β_0}. As the conditional expectation Q is of tensor type, i.e.

Q(a ⊗ x_0 ⊗ x_1 ⊗ · · · ⊗ x_N ⊗ 1_C^{⊗N}) = χ(x_0)χ(x_1) · · · χ(x_N) (a ⊗ 1_C^{⊗N}),

the required factorization easily follows. □

Theorem 4.1.4.
Let ρ : F^+ → End(M, ψ) be a representation as introduced in Proposition 4.1.2. Then (M, ψ, α_0, M^{α_1}) is a stationary Markov process with generator M^{α_1}. Moreover (M, ψ, α_0, A_0) is a stationary Markov process with generator A_0 := A ⊗ 1_C^{⊗N} ⊂ M^{α_1}.

These Markov processes may not be minimal. Note also that A_0 may be strictly included in M^{α_1}, as the latter depends on the choice of the operator D. For example, strict inclusion occurs for the choice D(x) = x ⊗ 1_C, but equality occurs for the choice D(x) = 1_C ⊗ x in (4.1.1).

Proof.
The Markovianity of the stationary process (M, ψ, α_0, M^{α_1}) follows from Corollary 3.3.6, if we can verify the generating property of the representation ρ : F^+ → End(M, ψ). Indeed, that ρ is generating is inferred from A ⊗ C^{⊗(n−1)} ⊗ 1_C^{⊗N} ⊆ M^{α_n}.

We are left to show that the canonical filtration of the stationary process (M, ψ, α_0, A_0) is Markovian. We note that the definition of the endomorphism α_0 is independent of the choice of the operator D in (4.1.1). Moreover, the inclusion A_0 := A ⊗ 1_C^{⊗N} ⊂ M^{α_1} is valid for any choice of the operator D. Now Corollary 3.3.6 can again be applied to ensure Markovianity if there exists some D : C → C ⊗ C such that A_0 = M^{α_1}. It is immediately verified that this equality occurs for the choice D(x) = 1_C ⊗ x. □

Remark 4.1.5.
We remind the reader that the generating property of ρ and the relations of F^+ guarantee that the fixed point algebras M^{α_n} form a tower of inclusions, even though we may not know explicitly what these fixed point algebras are. In particular, we get M^{α_1} ⊂ M^{α_2}. It is not obvious how to see this directly (without using the relations of F^+) for a general operator D, as used in (4.1.1) for the definition of the α_n's. However, the choice D(x) = 1_C ⊗ x yields α_n = β_{n−1} for all n ≥ 1. We infer from this that

M^{α_1} = M^{β_0} = A ⊗ 1_C^{⊗N} ⊂ A ⊗ C ⊗ 1_C^{⊗N} = M^{β_1} = M^{α_2}.

Similarly, choosing D(x) = x ⊗ 1_C, we get α_n = β_n for all n ≥ 0; and the inclusion M^{α_1} ⊂ M^{α_2} is clear from the inclusions of the fixed point algebras of the β_n's. Finally, we note that if D is one of the above special random variables, then Markovianity can be proved without appealing to the Thompson monoid F^+; see [Go04, Section 2.1].

Above we directly constructed some representations of the Thompson monoid F^+ on infinite tensor products of noncommutative probability spaces and invoked some of our general results about such representations from Section 3 to obtain noncommutative stationary Markov processes. We present next a converse result which starts with a certain class of noncommutative stationary Markov processes (in the sense of Kümmerer). Subsequently we will make use of the following result.

Proposition 4.1.6.
Let (M, ψ, α, M_0) be a stationary Markov process. Let T = E_{M_0} α i_0 be the associated transition operator, where E_{M_0} is the unique normal conditional expectation from M onto M_0 and i_0 is the embedding of M_0 into M. Then the following diagram commutes for all n ∈ N:

(M_0, ψ ↾ M_0) ───T^n──→ (M_0, ψ ↾ M_0)
      │ i_0                    ↑ E_{M_0}
      ↓                        │
   (M, ψ) ────────α^n───────→ (M, ψ)

In other words, we get a dilation of all orders.

Proof.
This result is known in the theory of bilateral stationary Markov processes (see [Kü85, Proposition 2.2]). We provide the proof here as it is needed in our theory of unilateral noncommutative stationary processes. Let M_I := ∨_{n∈I} α^n(i_0(M_0)), let P_I be the unique normal ψ-preserving conditional expectation onto M_I, and observe that P_{[0,0]} = E_{M_0}. Then the relation

P_{I+k} ∘ α^k = α^k ∘ P_I,   k ∈ N, I ⊂ N,

can be seen using the adjoint α^* of the (injective) endomorphism α. In particular, we get

P_{[k−1,k−1]} α^k i_0 = P_{[k−1,k−1]} α^{k−1} α i_0 = α^{k−1} P_{[0,0]} α i_0 = α^{k−1} T,   k ∈ N.   (4.1.2)

Now we prove the dilation property by induction. We know that E_{M_0} α^n i_0 = T^n is true for n = 0, 1. Suppose E_{M_0} α^n i_0 = T^n for some n ∈ N. Then

E_{M_0} α^{n+1} i_0 = P_{[0,0]} α^{n+1} i_0
 = P_{[0,0]} P_{[0,n]} α^{n+1} i_0    (as M_{[0,0]} ⊂ M_{[0,n]})
 = P_{[0,0]} P_{[n,n]} α^{n+1} i_0    (by Markovianity)
 = P_{[0,0]} α^n i_0 T                (by Equation (4.1.2))
 = T^n T = T^{n+1}                    (by the induction hypothesis).   □

We show next that if a Markov operator has a tensor dilation (in the terminology of Kümmerer [Kü85]), then this Markov operator can be obtained as the compression of a represented generator of the Thompson monoid F^+. Recall that T^* denotes the adjoint of a Markov operator T; see Subsection 2.3.

Theorem 4.1.7.
Let (A ⊗ C, ϕ ⊗ χ, T, A ⊗ 1_C) be a stationary Markov process and let i : A ⊗ 1_C → A ⊗ C denote the canonical embedding. Then there exist a probability space (M, ψ), a generating representation ρ : F^+ → End(M, ψ) and an embedding j : (A ⊗ 1_C, ϕ ⊗ χ) → (M, ψ) such that

(i) j(A ⊗ 1_C) = M^{ρ(g_1)},
(ii) i^* T^n i = j^* ρ(g_0)^n j ↾ A ⊗ 1_C for all n ∈ N.

Proof. We take

(M, ψ) := (A ⊗ C^{⊗N}, ϕ ⊗ χ^{⊗N})

and construct a representation of the Thompson monoid F^+ as obtained in Proposition 4.1.2. Define the representation ρ : F^+ → End(M, ψ) by ρ(g_n) := α_n as in (4.1.1), with C : A → A ⊗ C and D : C → C ⊗ C given by C(a) = T(a ⊗ 1_C) and D(x) = 1_C ⊗ x. Then it is easy to check that M^{ρ(g_1)} = j(A ⊗ 1_C). Hence by Theorem 4.1.4, it follows that (M, ψ, ρ(g_0), M^{ρ(g_1)}) is a noncommutative stationary Markov process.

We are given that (A ⊗ C, ϕ ⊗ χ, T, A ⊗ 1_C) is a stationary Markov process, hence by Proposition 4.1.6, taking S = i^* T i as the corresponding transition operator, we get

S^n = i^* T^n i,   n ∈ N.   (4.1.3)

It is easy to check that, for all a ∈ A,

j^* ρ(g_0) j (a ⊗ 1_C) = j^*(T(a ⊗ 1_C) ⊗ 1_C ⊗ 1_C ⊗ · · ·) = i^* T(a ⊗ 1_C) = S(a ⊗ 1_C).

Hence j^* ρ(g_0) j ↾ A ⊗ 1_C = S. Consequently the stationary Markov process (M, ψ, ρ(g_0), M^{ρ(g_1)}) also has the transition operator S. We once again appeal to Proposition 4.1.6 to get

S^n = j^* ρ(g_0)^n j ↾ A ⊗ 1_C,   n ∈ N.   (4.1.4)

Combining (4.1.3) and (4.1.4) completes the proof of the theorem. □

Suppose (A ⊗ C, ϕ ⊗ χ, T, A ⊗ 1_C) is a stationary Markov process. Then it is unknown, in the generality of the present noncommutative setting, if there exists a representation ρ : F^+ → End(A ⊗ C, ϕ ⊗ χ) such that the Markov shift T equals the represented generator ρ(g_0). Thus it is unknown if the canonically associated stationary Markov sequence of random variables ι ≡ (ι_n)_{n∈N} : (A, ϕ) → (A ⊗ C, ϕ ⊗ χ) is partially spreadable. We will see in Subsection 4.2 that this canonically associated sequence ι is partially spreadable in our algebraic framework for classical probability, see Theorem 4.2.6. Furthermore we will investigate in Subsection 4.3 an operator algebraic setting which allows one to deduce that a noncommutative stationary Markov sequence is partially spreadable.

4.2. Constructions in classical probability.
The tensor product constructions from Subsection 4.1 apply of course to commutative von Neumann algebras (with separable predual), as they are of relevance in classical probability theory: a commutative von Neumann algebra with separable predual is isomorphic to the algebra of essentially bounded functions on some standard probability space. Most of the following constructions and results on an algebraic reformulation of Markov processes are well known. Nevertheless we provide them for the convenience of the reader, in particular to discuss further the connection between Markovianity and the Thompson monoid $F^+$ in classical probability. The main result of this subsection is Theorem 4.2.6, which provides an algebraic reformulation of the fact that a (recurrent) stationary Markov sequence of classical random variables induces a representation of the Thompson monoid $F^+$ such that the Markov shift of this process is given by one of the represented generators of the monoid. We recall that, in classical probability, a stationary Markov process is already completely determined by its transition operator, up to equivalence in distribution. This folklore result can be reformulated in the present operator algebraic setting, as done next.

Proposition 4.2.1.
Let $\iota\equiv(\iota_n)_{n\in\mathbb{N}_0}\colon(\mathcal{A},\varphi)\to(\mathcal{M},\psi)$ and $\tilde\iota\equiv(\tilde\iota_n)_{n\in\mathbb{N}_0}\colon(\mathcal{A},\varphi)\to(\tilde{\mathcal{M}},\tilde\psi)$ be two stationary Markov sequences with Markov operators $R\in\operatorname{Mor}(\mathcal{A},\varphi)$ and $\tilde R\in\operatorname{Mor}(\mathcal{A},\varphi)$, respectively. If $\mathcal{M}$ and $\tilde{\mathcal{M}}$ are commutative von Neumann algebras, then the following are equivalent:
(a) $\iota\overset{\mathrm{distr}}{=}\tilde\iota$;
(b) $R=\tilde R$.

Proof. (a) $\Rightarrow$ (b): We conclude from the Markov property and from $\iota\overset{\mathrm{distr}}{=}\tilde\iota$ that
$$\varphi\bigl(aR(b)\bigr)=\psi\bigl(\iota_0(a)\iota_1(b)\bigr)=\tilde\psi\bigl(\tilde\iota_0(a)\tilde\iota_1(b)\bigr)=\varphi\bigl(a\tilde R(b)\bigr)$$
for all $a,b\in\mathcal{A}$. But this implies $R=\tilde R$ by routine arguments.
(b) $\Rightarrow$ (a): We need to show that $R=\tilde R$ implies
$$\psi\bigl(\iota_{k_1}(a_1)\cdots\iota_{k_n}(a_n)\bigr)=\tilde\psi\bigl(\tilde\iota_{k_1}(a_1)\cdots\tilde\iota_{k_n}(a_n)\bigr)$$
for any $a_1,\dots,a_n\in\mathcal{A}$, $k_1,\dots,k_n\in\mathbb{N}_0$ and $n\in\mathbb{N}$. Since $\mathcal{M}$ and $\tilde{\mathcal{M}}$ are commutative von Neumann algebras, and since random variables are (injective) *-homomorphisms, we can assume $0\le k_1<k_2<\dots<k_n$ without loss of generality. We use again the Markov property and $R=\tilde R$ to calculate
$$\psi\bigl(\iota_{k_1}(a_1)\cdots\iota_{k_n}(a_n)\bigr)=\varphi\Bigl(a_1R^{k_2-k_1}\bigl(a_2\cdots R^{k_n-k_{n-1}}(a_n)\cdots\bigr)\Bigr)=\varphi\Bigl(a_1\tilde R^{k_2-k_1}\bigl(a_2\cdots \tilde R^{k_n-k_{n-1}}(a_n)\cdots\bigr)\Bigr)=\tilde\psi\bigl(\tilde\iota_{k_1}(a_1)\cdots\tilde\iota_{k_n}(a_n)\bigr). \qquad\square$$

Notation 4.2.2.
Throughout this subsection $\nu$ denotes the Lebesgue measure on the unit interval $[0,1]\subset\mathbb{R}$. Furthermore, the (commutative) probability space $(L,\operatorname{tr}_\nu)$ is given by $L:=L^\infty([0,1],\nu)$ and $\operatorname{tr}_\nu:=\int_{[0,1]}\,\cdot\,\mathrm{d}\nu$.

Theorem 4.2.3 ([Kü86, Subsection 4.4]). Let $R$ be a Markov operator on $(\mathcal{A},\varphi)$, where $\mathcal{A}$ is a commutative von Neumann algebra with separable predual. Then there exists $T\in\operatorname{Aut}(\mathcal{A}\otimes L,\varphi\otimes\operatorname{tr}_\nu)$ such that, for all $n\in\mathbb{N}_0$,
$$R^n = P\,T^n j.$$
Here $j\colon(\mathcal{A},\varphi)\to(\mathcal{A}\otimes L,\varphi\otimes\operatorname{tr}_\nu)$ denotes the canonical embedding $j(a)=a\otimes 1_L$, and $P=j^*$, such that $E:=j\circ P$ is the $\varphi\otimes\operatorname{tr}_\nu$-preserving normal conditional expectation from $\mathcal{A}\otimes L$ onto $\mathcal{A}\otimes 1_L$.

A proof of this folklore result can be found in [Kü86]. As an automorphism is in particular an endomorphism, we infer from Theorem 4.2.3 immediately the existence of a unilateral stationary Markov process (as introduced in Definition 2.6.3).
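In the finite-state classical case, the identity $R^n = P\,T^n j$ of Theorem 4.2.3 has a concrete probabilistic shadow that can be tested numerically. The sketch below is illustrative only and unilateral: it uses an endomorphism built from iterated random functions, with finite uniform noise standing in for $([0,1],\nu)$, rather than the automorphism of the theorem; the map `phi` and the induced transition matrix are our choices, not from the paper. Compressing the $n$-fold dynamics back to the time-zero algebra recovers $R^n$.

```python
import functools
import itertools
import numpy as np

# Iterated-random-function model of a two-state Markov chain: the next
# state is phi(state, noise), with i.i.d. noise uniform on {0,...,5}.
# Both phi and the induced transition matrix R are illustrative choices.
def phi(s, u):
    return (0 if u < 4 else 1) if s == 0 else (0 if u < 2 else 1)

R = np.array([[4/6, 2/6],
              [2/6, 4/6]])        # R[s, t] = P(phi(s, U) = t)

f = np.array([1.0, 5.0])          # a function on the state space

for n in range(4):
    Rn_f = np.linalg.matrix_power(R, n) @ f
    for s in (0, 1):
        # compress the n-fold dynamics to time zero: average f over all
        # noise words of length n (the analogue of P T^n j applied to f)
        avg = np.mean([f[functools.reduce(phi, u, s)]
                       for u in itertools.product(range(6), repeat=n)])
        assert abs(avg - Rn_f[s]) < 1e-12
print("compression of the n-fold dynamics equals R^n for n = 0, 1, 2, 3")
```

The averaging over noise words plays the role of the conditional expectation $E = j \circ P$; the genuine bilateral, automorphic dilation of the theorem requires the full infinite product space.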
Corollary 4.2.4. $(\mathcal{A}\otimes L,\varphi\otimes\operatorname{tr}_\nu,T,\mathcal{A}\otimes 1_L)$ is a stationary Markov process.

As already seen in the context of tensor product constructions, each Markov operator in the present algebraic framework of classical probability can be obtained as the compression of a represented generator of the Thompson monoid $F^+$.

Theorem 4.2.5.
There exist a probability space $(\mathcal{M},\psi)$, a generating representation $\rho\colon F^+\to\operatorname{End}(\mathcal{M},\psi)$ and an embedding $j\colon(\mathcal{A}\otimes L,\varphi\otimes\operatorname{tr}_\nu)\to(\mathcal{M},\psi)$ such that
(i) $j(\mathcal{A}\otimes L)=\mathcal{M}^{\rho(g_1)}$,
(ii) $T^n\restriction_{\mathcal{A}\otimes 1_L}=j^*\rho(g_0)^n j\restriction_{\mathcal{A}\otimes 1_L}$ for all $n\in\mathbb{N}_0$.
Here $j^*$ denotes the adjoint of the injective *-homomorphism $j$; it is easily seen to be equal to the conditional expectation onto $\mathcal{A}\otimes L$.

Proof.
The proof follows from Corollary 4.2.4 and Theorem 4.1.7. $\square$
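The generators $\rho(g_n)$ built from the maps $C$ and $D$ (Proposition 4.1.2, as used in Theorems 4.1.7 and 4.2.5) have a simple combinatorial shadow. In the sketch below, which is illustrative and not the von Neumann algebraic construction, an elementary tensor $a\otimes x_1\otimes x_2\otimes\cdots$ becomes a finite tuple, $C$ becomes an arbitrary map sending the leading entry to a pair of entries, and $D(x)=1\otimes x$ becomes insertion of a unit symbol; the Thompson relation $g_kg_\ell=g_{\ell+1}g_k$ for $k<\ell$ then holds for any choice of $C$.

```python
import itertools

ONE = 1  # stands for the unit 1_L

def C(a):
    # models C(a) = T(a (x) 1): one entry becomes two (illustrative choice)
    return (a + 1, 2 * a)

def rho(n, s):
    """Action of the generator g_n on a tuple s = (a, x1, x2, ...)."""
    if n == 0:
        return C(s[0]) + s[1:]       # C(a) occupies the first two slots
    return s[:n] + (ONE,) + s[n:]    # D(x_n) = 1 (x) x_n: insert the unit

# check g_k g_l = g_{l+1} g_k for 0 <= k < l (rightmost generator acts first)
for k, l in itertools.combinations(range(4), 2):
    for s in itertools.product(range(3), repeat=6):
        assert rho(k, rho(l, s)) == rho(l + 1, rho(k, s)), (k, l, s)
print("Thompson monoid relations verified on all test tuples")
```

The point of the check is that the relation is forced purely by the positions at which $C$ and $D$ act, which is why the operator algebraic construction works for any transition operator $T$.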
Together with our general results from Section 3, our next result strongly hints at the availability of a de Finetti theorem for (recurrent) stationary Markov chains with values in a standard Borel space.
Theorem 4.2.6.
A stationary Markov sequence $\iota\equiv(\iota_n)_{n\in\mathbb{N}_0}\colon(\mathcal{A},\varphi)\to(\mathcal{M},\psi)$ is partially spreadable.

Proof. We will show that there exists a sequence $\tilde\iota\equiv(\tilde\iota_n)_{n\in\mathbb{N}_0}\colon(\mathcal{A},\varphi)\to(\tilde{\mathcal{M}},\tilde\psi)$ which has the same distribution as the stationary Markov sequence $\iota$ and which satisfies, for all $n\in\mathbb{N}$,
$$\tilde\rho(g_n)\tilde\iota_0=\tilde\iota_0 \quad\text{and}\quad \tilde\rho(g_0)^n\tilde\iota_0=\tilde\iota_n$$
for some representation $\tilde\rho\colon F^+\to\operatorname{End}(\tilde{\mathcal{M}},\tilde\psi)$.

Since $\iota$ is stationary and Markovian, there exists a Markov operator $R$ on $(\mathcal{A},\varphi)$ (i.e. $R\in\operatorname{Mor}(\mathcal{A},\varphi)$) such that $\varphi(aR(b))=\psi\bigl(\iota_0(a)\iota_1(b)\bigr)$. Now Theorem 4.2.3 implies that there exists an automorphism $T\in\operatorname{Aut}(\mathcal{A}\otimes L,\varphi\otimes\operatorname{tr}_\nu)$ such that
$$\varphi(aR(b))=\varphi\otimes\operatorname{tr}_\nu\bigl((a\otimes 1_L)\,T(b\otimes 1_L)\bigr).$$
Let the random variable $C\colon(\mathcal{A},\varphi)\to(\mathcal{A}\otimes L,\varphi\otimes\operatorname{tr}_\nu)$ be defined by $C(a):=T(a\otimes 1_L)$, and let the random variable $D\colon(L,\operatorname{tr}_\nu)\to(L\otimes L,\operatorname{tr}_\nu\otimes\operatorname{tr}_\nu)$ be defined by $D(x)=1_L\otimes x$. Then, as shown in Proposition 4.1.2,
$$\tilde\rho(g_n)(a\otimes x_1\otimes x_2\otimes\cdots):=\begin{cases} C(a)\otimes x_1\otimes x_2\otimes\cdots & \text{if } n=0,\\ a\otimes D(x_1)\otimes x_2\otimes\cdots & \text{if } n=1,\\ a\otimes x_1\otimes\cdots\otimes x_{n-1}\otimes D(x_n)\otimes x_{n+1}\otimes\cdots & \text{if } n>1,\end{cases}$$
defines a representation $\tilde\rho\colon F^+\to\operatorname{End}(\tilde{\mathcal{M}},\tilde\psi)$, where $(\tilde{\mathcal{M}},\tilde\psi)=\bigl(\mathcal{A}\otimes L^{\otimes\mathbb{N}},\varphi\otimes\operatorname{tr}_\nu^{\otimes\mathbb{N}}\bigr)$. We infer from Theorem 4.1.4 that $\bigl(\tilde{\mathcal{M}},\tilde\psi,\tilde\rho(g_0),\mathcal{A}\otimes 1_{L^{\otimes\mathbb{N}}}\bigr)$ is a stationary Markov process such that
$$\varphi(aR(b))=\tilde\psi\bigl((a\otimes 1_{L^{\otimes\mathbb{N}}})\,\tilde\rho(g_0)(b\otimes 1_{L^{\otimes\mathbb{N}}})\bigr).$$
Consequently, the sequence of random variables $\tilde\iota\equiv(\tilde\iota_n)_{n\in\mathbb{N}_0}\colon(\mathcal{A},\varphi)\to(\tilde{\mathcal{M}},\tilde\psi)$, defined by
$$\tilde\iota_0(a):=a\otimes 1_{L^{\otimes\mathbb{N}}} \quad\text{and}\quad \tilde\iota_n(a):=\tilde\rho(g_0)^n\,\tilde\iota_0(a) \text{ for } n>0,$$
is Markovian and partially spreadable, both by construction. Furthermore the sequences $\tilde\iota$ and $\iota$ have the same distribution, as they are stationary Markov sequences with the same Markov operator $R$; see Proposition 4.2.1. $\square$

4.3. Constructions in the framework of operator algebras.
Kümmerer's approach to an operator algebraic theory of stationary Markov processes is based on the concept of a coupling representation (see [Kü93] for example). Here we adapt and refine this approach so that it provides a rich operator algebraic framework for the construction of representations of the Thompson monoid $F^+$. Our investigations are motivated by the elementary observation that the relations of the Thompson monoid $F^+$ are robust under certain 'perturbations', which we introduce and formalize next.

Definition 4.3.1.
The extended monoid $EF^+$ is presented by the set of generators $\{g_n, c_n \mid n\in\mathbb{N}_0\}$ subject to the relations
$$g_kg_\ell=g_{\ell+1}g_k,\qquad c_kc_{\ell+1}=c_{\ell+1}c_k,\qquad c_kg_{\ell+1}=g_{\ell+1}c_k,\qquad g_kc_\ell=c_{\ell+1}g_k \qquad (0\le k<\ell<\infty).$$
Evidently the first set of generators $\{g_n\}_{n\in\mathbb{N}_0}$ satisfies the relations of the Thompson monoid $F^+$.

Proposition 4.3.2.
The submonoid $QF^+:=\langle c_ng_n\mid n\in\mathbb{N}_0\rangle^+\subset EF^+$ is a quotient of the monoid $F^+$.

Proof. An elementary computation, based on all defining relations of the monoid $EF^+$, shows that the elements of the set $\{\tilde g_n:=c_ng_n\mid n\in\mathbb{N}_0\}$ satisfy the relations of the Thompson monoid $F^+$: for $0\le k<\ell<\infty$,
$$\tilde g_k\tilde g_\ell=c_kg_kc_\ell g_\ell=c_kc_{\ell+1}g_kg_\ell=c_{\ell+1}c_kg_kg_\ell=c_{\ell+1}c_kg_{\ell+1}g_k=c_{\ell+1}g_{\ell+1}c_kg_k=\tilde g_{\ell+1}\tilde g_k. \qquad\square$$

Definition 4.3.3.
The extended monoid $ES^+$ is presented by the set of generators $\{h_n,c_n\mid n\in\mathbb{N}_0\}$ subject to the relations
$$h_kh_\ell=h_{\ell+1}h_k,\qquad c_kc_{\ell+1}=c_{\ell+1}c_k,\qquad c_kh_{\ell+1}=h_{\ell+1}c_k,\qquad h_kc_\ell=c_{\ell+1}h_k,$$
and $h_kh_k=h_{k+1}h_k$, for every $0\le k<\ell<\infty$.
Clearly the monoid $ES^+$ is a quotient of the monoid $EF^+$, due to the additional set of relations for the $h_k$'s. The extended monoid $ES^+$ algebraically encodes certain local perturbations of the partial shift monoid $S^+$ which, roughly speaking, correspond to the perturbation of Bernoulli shifts yielding Markov shifts in classical probability.

Proposition 4.3.4.
The submonoid $QS^+:=\langle c_nh_n\mid n\in\mathbb{N}_0\rangle^+\subset ES^+$ is a quotient of the monoid $F^+$.

Proof. An elementary computation, analogous to the proof of Proposition 4.3.2, shows that the elements of the set $\{\tilde g_n:=c_nh_n\mid n\in\mathbb{N}_0\}$ satisfy the relations of the Thompson monoid $F^+$. $\square$

Remark 4.3.5.
Actually the relations in Definition 4.3.1 have been identified by some reverse engineering: each $c_k$ should provide a suitable 'local perturbation' of $g_k$ such that $(c_kg_k)(c_\ell g_\ell)=(c_{\ell+1}g_{\ell+1})(c_kg_k)$ for $0\le k<\ell<\infty$. An alternative 'perturbation' is given by the extended monoid $FF^+$, which is defined to be presented by the generators $\{c_n,g_n\}_{n\in\mathbb{N}_0}$ subject to the relations
$$g_kg_\ell=g_{\ell+1}g_k,\qquad c_kc_\ell=c_{\ell+1}c_k,\qquad c_kg_{\ell+1}=g_{\ell+1}c_k,\qquad g_kc_\ell=c_\ell g_k \qquad (0\le k<\ell<\infty).$$
Here both the $c_k$'s and the $g_k$'s satisfy the relations of the Thompson monoid $F^+$, and the last two sets of relations can be combined into a single set of commutativity relations: $c_kg_\ell=g_\ell c_k$ whenever $k\notin\{\ell-1,\ell\}$.

Remark 4.3.6.
The results on semi-cosimplicial structures obtained in [EGK17] make it tempting to investigate also a more restrictive perturbed version of the partial shift monoid $S^+$. So let the extended semi-cosimplicial monoid $ES^+_r$ be presented by the set of generators $\{h_n,d_n\mid n\in\mathbb{N}_0\}$ subject to the relations
$$h_kh_\ell=h_{\ell+1}h_k,\qquad d_kd_{\ell+1}=d_{\ell+1}d_k,\qquad d_kh_{\ell+1}=h_{\ell+1}d_k,\qquad h_kd_\ell=d_{\ell+1}h_k \qquad (0\le k\le\ell<\infty).$$
These relations ensure that the submonoid $QS^+_r:=\langle d_nh_n\mid n\in\mathbb{N}_0\rangle^+\subset ES^+_r$ is a quotient of the monoid $S^+$. In comparison to the extended monoid $EF^+$, the additional relations are more restrictive for possible extensions of the monoid $S^+$. Roughly speaking, these additional relations encode algebraically the perturbative difference between Markovianity and stochastic independence in classical probability. We conjecture that $QS^+_r$ and $S^+$ are isomorphic as monoids.

Similarly as was discovered for the monoid $F^+$ in Section 3, the representation theory of these extended monoids in the endomorphisms of a noncommutative probability space goes along with very rich structures of commuting squares. Here we restrict ourselves to a single result, mainly with the intention to illustrate how Bernoulli shifts and, as their perturbations, Markov shifts can be simultaneously obtained from the representation theory of the extended monoid $ES^+$.

Recall from Definition 2.6.2 that a noncommutative stationary process $(\mathcal{M},\psi,\beta,\mathcal{M}_0)$ is spreadable if the canonically associated sequence of random variables $(\lambda_n)_{n\ge 0}\colon(\mathcal{M}_0,\psi_0)\to(\mathcal{M},\psi)$ is spreadable, where $\lambda_n:=\beta^n\restriction_{\mathcal{M}_0}$ and $\psi_0=\psi\restriction_{\mathcal{M}_0}$.

Theorem 4.3.7.
Suppose $(\mathcal{M},\psi)$ is equipped with a representation $\rho\colon ES^+\to\operatorname{End}(\mathcal{M},\psi)$. Let $\mathcal{B}_0:=\mathcal{M}^{\rho(h_1)}$ and $(\mathcal{B}_\infty,\psi_\infty):=\bigl(\bigvee_{n\in\mathbb{N}_0}\rho(h_0)^n(\mathcal{B}_0),\psi\restriction_{\mathcal{B}_\infty}\bigr)$. Further let $\mathcal{A}_0:=\bigcap_{k\ge 1}\mathcal{M}^{\rho(c_kh_k)}$.
(i) The restricted represented generator $\beta:=\rho(h_0)\restriction_{\mathcal{B}_\infty}$ defines the spreadable Bernoulli shift $(\mathcal{B}_\infty,\psi_\infty,\beta,\mathcal{B}_0)$.
(ii) The represented generator $\alpha:=\rho(c_0h_0)$ defines the (not necessarily minimal) stationary Markov process $(\mathcal{M},\psi,\alpha,\mathcal{A}_0)$.
If $\mathcal{M}=\mathcal{B}_\infty$ and $\mathcal{A}_0=\mathcal{B}_0$, then the stationary Markov process $(\mathcal{M},\psi,\alpha,\mathcal{A}_0)$ has the coupling representation $(\mathcal{M},\psi,\gamma\beta,\mathcal{A}_0)$, with coupling $\gamma:=\rho(c_0)$.

Proof. (i) Let $\rho_B(h_n):=\beta_n:=\rho(h_n)$ for $h_n\in S^+$, $n\in\mathbb{N}_0$. Then $\beta_0=\rho(h_0)$ and $\rho_B$ gives a representation of the monoid $S^+$ in $\operatorname{End}(\mathcal{M},\psi)$. Hence $(\mathcal{B}_\infty,\psi_\infty,\beta,\mathcal{B}_0)$ is spreadable (compare Definition 2.6.2). Also observe that $\mathcal{M}^{\beta_0}=\mathcal{M}^{\rho_B(h_0)}\subset\mathcal{M}^{\rho_B(h_1)}=\mathcal{B}_0$, due to the relations of $S^+$. Now the fact that it is a Bernoulli shift follows from Theorem 8.2 in [Kö10].
(ii) Let $\rho_M(g_n):=\alpha_n:=\rho(c_nh_n)$ for $g_n\in F^+$, $n\in\mathbb{N}_0$. Then $\alpha=\alpha_0$ and $\rho_M$ gives a representation of the monoid $F^+$ in $\operatorname{End}(\mathcal{M},\psi)$. Also observe that $\mathcal{M}^{\rho_M}:=\bigcap_{k\ge 1}\mathcal{M}^{\rho_M(g_k)}=\bigcap_{k\ge 1}\mathcal{M}^{\rho(c_kh_k)}=\mathcal{A}_0$; hence by Corollary 3.3.4, $(\mathcal{M},\psi,\alpha,\mathcal{A}_0)$ is a stationary Markov process. $\square$

The significance of this result is that it indicates a promising strategy for constructing a representation of the Thompson monoid $F^+$ from a large class of noncommutative stationary Markov processes. The starting point is the construction of a spreadable noncommutative Bernoulli shift, which is known to be in bijective correspondence with equivalence classes of spreadable sequences of noncommutative random variables (see [Kö10, EGK17]). In other words, the construction of a spreadable Bernoulli shift amounts to the construction of a representation of the partial shift monoid $S^+$.
But as this monoid is a quotient of the Thompson monoid $F^+$, spreadable Bernoulli shifts correspond to a particular class of representations of the Thompson monoid $F^+$. Suitable perturbations of this particular class will provide certain Markov shifts and wider classes of representations of the Thompson monoid $F^+$.

Proposition 4.3.8.
Let $(\mathcal{M},\psi,\beta,\mathcal{M}_0)$ be a spreadable noncommutative Bernoulli shift. Then there exists a generating representation $\rho_\beta\colon S^+\to\operatorname{End}(\mathcal{M},\psi)$ such that $\beta=\rho_\beta(h_0)$ and $\mathcal{M}_0\subset\mathcal{M}^{\rho_\beta(h_k)}$ for all $k\ge 1$.

Proof. If $(\mathcal{M},\psi,\beta,\mathcal{M}_0)$ is spreadable, then, as it is a minimal noncommutative Bernoulli shift, there exists a representation $\rho_\beta\colon S^+\to\operatorname{End}(\mathcal{M},\psi)$ such that for $\lambda_n:=\beta^n\restriction_{\mathcal{M}_0}$ we get $\lambda_n=\rho_\beta(h_0)^n\lambda_0$ for all $n\in\mathbb{N}_0$ and $\rho_\beta(h_k)\lambda_0=\lambda_0$ for all $k\ge 1$. Moreover, $\rho_\beta$ has the generating property by construction. $\square$

Definition 4.3.9.
The representation $\rho_\beta\colon S^+\to\operatorname{End}(\mathcal{M},\psi)$ (as introduced in Proposition 4.3.8) is said to be associated to the spreadable Bernoulli shift $(\mathcal{M},\psi,\beta,\mathcal{M}_0)$.

Corollary 4.3.10.
A spreadable Bernoulli shift $(\mathcal{M},\psi,\beta,\mathcal{M}_0)$ is partially spreadable.

Proof.
As in Proposition 4.3.8, let $\rho_\beta$ be the representation associated to the spreadable noncommutative Bernoulli shift. Denote by $\epsilon\colon F^+\to S^+$ the canonical epimorphism which maps the generator $g_k\in F^+$ to the generator $h_k\in S^+$ for all $k\in\mathbb{N}_0$. Then $\rho:=\rho_\beta\circ\epsilon$ defines a representation of $F^+$ such that the canonically associated sequence of random variables $(\lambda_n)_{n\ge 0}$ (as used in the proof of Proposition 4.3.8) is partially spreadable. $\square$

A large class of noncommutative Markov shifts can be obtained as certain perturbations of noncommutative Bernoulli shifts, as developed and investigated by Kümmerer in [Kü86, Kü85]. We refine the notion of a coupling representation so that it applies to spreadable noncommutative Bernoulli shifts.
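The coupling relations formalized in Definition 4.3.11, like the relations of $EF^+$ from Definition 4.3.1, can be emulated combinatorially; the following is an illustrative sketch, not the operator algebraic setting. Model the partial shift generator as insertion of a unit symbol at position $n$ of a tuple, and the coupling as application of a fixed map $f$ to the entry at position $n$; all defining relations of $EF^+$ then hold, and hence so does the Thompson relation for the perturbed generators $\tilde g_n = c_ng_n$ of Proposition 4.3.2.

```python
import itertools

ONE = 0

def f(x):          # any fixed map works; this choice is illustrative
    return x + 7

def g(n, s):       # generator g_n: insert the unit symbol at position n
    return s[:n] + (ONE,) + s[n:]

def c(n, s):       # coupling c_n: perturb the entry at position n
    return s[:n] + (f(s[n]),) + s[n + 1:]

def gt(n, s):      # perturbed generator: tilde-g_n = c_n g_n
    return c(n, g(n, s))

for k, l in itertools.combinations(range(4), 2):   # 0 <= k < l
    for s in itertools.product(range(3), repeat=7):
        # defining relations of EF+ (rightmost letter acts first)
        assert g(k, g(l, s)) == g(l + 1, g(k, s))
        assert c(k, c(l + 1, s)) == c(l + 1, c(k, s))
        assert c(k, g(l + 1, s)) == g(l + 1, c(k, s))
        assert g(k, c(l, s)) == c(l + 1, g(k, s))
        # consequently the perturbed generators satisfy the Thompson relation
        assert gt(k, gt(l, s)) == gt(l + 1, gt(k, s))
print("EF+ relations and the quotient relation of Proposition 4.3.2 verified")
```

The index shifts in the relations record exactly how an insertion at a lower position displaces a local perturbation at a higher position, which is the combinatorial content of the 'robustness under perturbation' observed above.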
Definition 4.3.11.
A sequence $(\gamma_n)_{n\ge 0}\subset\operatorname{End}(\mathcal{M},\psi)$ is called a coupling (sequence) to a spreadable Bernoulli shift $(\mathcal{M},\psi,\beta,\mathcal{M}_0)$ with associated representation $\rho\colon S^+\to\operatorname{End}(\mathcal{M},\psi)$ if, for all $0\le k<\ell<\infty$,
$$\rho(h_k)\gamma_\ell=\gamma_{\ell+1}\rho(h_k),\qquad \gamma_k\rho(h_{\ell+1})=\rho(h_{\ell+1})\gamma_k,\qquad \gamma_k\gamma_{\ell+1}=\gamma_{\ell+1}\gamma_k.$$

Proposition 4.3.12.
Let $(\gamma_n)_{n\ge 0}$ be a coupling sequence to the spreadable Bernoulli shift $(\mathcal{M},\psi,\beta,\mathcal{M}_0)$ with associated representation $\rho_\beta\colon S^+\to\operatorname{End}(\mathcal{M},\psi)$. Then a representation $\rho\colon F^+\to\operatorname{End}(\mathcal{M},\psi)$ is defined by the multiplicative extension of
$$F^+\ni g_k\mapsto\gamma_k\,\rho_\beta\epsilon(g_k)\in\operatorname{End}(\mathcal{M},\psi)\qquad(0\le k<\infty).$$
Here $\epsilon$ denotes the canonical epimorphism from $F^+$ onto $S^+$.

Proof. The relations of the Thompson monoid $F^+$ are satisfied by the $\gamma_k\rho_\beta\epsilon(g_k)$ since, for $0\le k<\ell<\infty$, the definition of a coupling sequence ensures that
$$(\gamma_k\rho_\beta\epsilon(g_k))(\gamma_\ell\rho_\beta\epsilon(g_\ell))=(\gamma_k\rho_\beta(h_k))(\gamma_\ell\rho_\beta(h_\ell))=\gamma_k\gamma_{\ell+1}\rho_\beta(h_k)\rho_\beta(h_\ell)=\gamma_{\ell+1}\gamma_k\rho_\beta(h_{\ell+1})\rho_\beta(h_k)$$
$$=\gamma_{\ell+1}\rho_\beta(h_{\ell+1})\gamma_k\rho_\beta(h_k)=(\gamma_{\ell+1}\rho_\beta\epsilon(g_{\ell+1}))(\gamma_k\rho_\beta\epsilon(g_k)). \qquad\square$$

Remark 4.3.13.
It is known that there exist non-spreadable noncommutative Bernoulli shifts (see [Kö10]). We conjecture that there also exist noncommutative Bernoulli shifts without partial spreadability. An affirmative answer to this conjecture would of course imply that there exist noncommutative Markov shifts beyond the representation theory of the Thompson monoid $F^+$.

References

[Be04] J. Belk. Thompson's group F. PhD thesis, Cornell University (2004).
[DT18] P. Dehornoy, and E. Tesson. Garside combinatorics for Thompson's monoid $F^+$ and a hybrid with the braid monoid $B^+_\infty$. eprint arXiv:1803.02639v2 (2018).
[DF80] P. Diaconis, and D. Freedman. De Finetti's theorem for Markov chains. Ann. Probab. 8(1):115-130, 1980.
[EGK17] D. G. Evans, R. Gohm, and C. Köstler. Semi-cosimplicial objects and spreadability. Rocky Mountain J. Math. 47(6):1839-1873, 2017.
[Go04] R. Gohm. Noncommutative stationary processes. Springer, 2004.
[GK09a] R. Gohm, and C. Köstler. Noncommutative independence from the braid group $B_\infty$. Comm. Math. Phys. 289:435-482, 2009.
[GK09b] R. Gohm, and C. Köstler. Noncommutative independence from characters of the infinite symmetric group $S_\infty$. eprint arXiv:1005.5726v1 (2010).
[GHJ89] F. M. Goodman, P. de la Harpe, and V. F. R. Jones. Coxeter Graphs and Towers of Algebras. Springer-Verlag, 1989.
[HM11] U. Haagerup, and M. Musat. Factorization and dilation problems for completely positive maps on von Neumann algebras. Comm. Math. Phys., 2011.
[Ka05] O. Kallenberg. Probabilistic Symmetries and Invariance Principles. Springer-Verlag, 2005.
[Kö10] C. Köstler. A noncommutative extended de Finetti theorem. Journal of Functional Analysis, 2010.
[KN18] C. Köstler, and A. Nica. A central limit theorem for star-generators of $S_\infty$, which relates to the law of a GUE matrix. eprint arXiv:1807.05633 (2018).
[Kü85] B. Kümmerer. Markov dilations on W*-algebras. Journal of Functional Analysis 63:139-177, 1985.
[Kü86] B. Kümmerer. Construction and structure of Markov dilations on W*-algebras. Habilitationsschrift, Tübingen, 1986.
[Kü93] B. Kümmerer. Stochastic processes with values in $M_n$ as couplings to free evolutions. Preprint, 1993.
[GK12] A. Gärtner, and B. Kümmerer. A coherent approach to recurrence and transience for quantum Markov operators. eprint arXiv:1211.6876 (2012).
[Jo18] V. F. R. Jones. A no-go theorem for the continuum limit of a periodic quantum spin chain. Comm. Math. Phys., 357(1):295-317, 2018.
[Ta72] M. Takesaki. Conditional expectations in von Neumann algebras. Journal of Functional Analysis 9:306-321, 1972.
School of Mathematical Sciences, University College Cork, Cork, Ireland
E-mail address: [email protected]
E-mail address: [email protected]
E-mail address: [email protected]