A Coupling Proof of Convex Ordering for Compound Distributions
JEAN BÉRARD AND NICOLAS JUILLET
Abstract.
In this paper, we give an alternative proof of the fact that, when compounding a nonnegative probability distribution, convex ordering between the distributions of the number of summands implies convex ordering between the resulting compound distributions. Although this is a classical textbook result in risk theory, our proof exhibits a concrete coupling between the compound distributions being compared, using the representation of one-period discrete martingale laws as a mixture of the corresponding extremal measures.

1. Introduction
Consider an i.i.d. sequence of nonnegative random variables X = (X_i)_{i ≥ 1}, and two integer-valued random variables M, N, independent from the sequence X. Assume that E(X_1) < +∞, E(M) < +∞, E(N) < +∞, and that a comparison between M and N holds with respect to the convex ordering: M ≺_cx N. (We refer to [17, 13] for the definition and main properties of the convex ordering of probability measures.) We then have the following comparison between the compound variables X_1 + ⋯ + X_M and X_1 + ⋯ + X_N with respect to the convex ordering:

(1)    X_1 + ⋯ + X_M ≺_cx X_1 + ⋯ + X_N.

This is a classical result (see e.g. Theorem 4.A.9 in [17] or Theorem 4.3.6 in [13]), useful in the context of risk theory (see e.g. [10], chap. 7), where its interpretation is that compounding with a riskier frequency distribution leads to a riskier aggregated loss distribution. (In [17, 13], the theorems are stated with respect to the increasing convex order, and involve two sequences of random variables instead of just one, so that (1) appears as a special case of these results. We refer to Section 4.3 for a discussion of how the more general case can be deduced from (1).)

The proof given in [17] is analytical in nature, and consists in showing that, given a non-decreasing convex function f : [0, +∞[ → R, the sequence (u_n)_{n ≥ 0} defined by u_n = E f(X_1 + ⋯ + X_n) satisfies, for all n ≥ 0, the condition u_{n+2} − u_{n+1} ≥ u_{n+1} − u_n (provided that u_{n+2}, u_{n+1}, u_n have a finite value). The proof given in [13] is similar, using functions of the special form f(x) = (x − t)_+. Here, we give an alternative proof of (1), based on a coupling between the two random variables being compared, which provides an explicit realization of Strassen's condition for convex ordering (see [18] or e.g.
[13], Section 1.5.2): we construct a pair of random variables (A, B) such that

(2)    A =_d X_1 + ⋯ + X_M,    B =_d X_1 + ⋯ + X_N,    and    A = E(B | σ(A)) a.s.

Our proof relies on a representation result which we call a diatomic representation of convex ordering, stated as Theorem 1 in Section 2, where we review several approaches for proving the existence of this representation, including an explicit algorithm in the case of discrete distributions. Section 3 contains the coupling construction leading to (2) and the proof of (1). Finally, in Section 4, we discuss various extensions of these results.

2. Diatomic representation
Given two real numbers x < y, we define, for all z, the normalized barycentric coordinates:

α_{x,y}(z) = (y − z)/(y − x)    and    β_{x,y}(z) = 1 − α_{x,y}(z) = (z − x)/(y − x).

In the case x = y, we extend the above definition by setting α_{x,y}(z) = 1 and β_{x,y}(z) = 0. Whenever x ≤ z ≤ y, both α_{x,y}(z) and β_{x,y}(z) lie in the interval [0, 1], and z can be written as the convex combination of x and y:

z = α_{x,y}(z) · x + β_{x,y}(z) · y.

Theorem 1. Given two probability distributions μ, ν on R possessing a finite expectation, the comparison μ ≺_cx ν holds if and only if there exists a triple of random variables (V−, U, V+) defined on the same probability space and such that:

(3)    U ∈ [V−, V+],
(4)    Law(U) = μ,
(5)    ν = E[ α_{V−,V+}(U) · δ_{V−} + β_{V−,V+}(U) · δ_{V+} ].

We call such a triple (V−, U, V+) a diatomic representation of the stochastic ordering μ ≺_cx ν.

A more concrete statement of (5) is that Law(V) = ν, where V is a random variable whose conditional distribution with respect to (V−, U, V+) is given by:

V = V− with probability α_{V−,V+}(U),    V = V+ with probability β_{V−,V+}(U).

Theorem 1 may not have been stated under this specific form in the mathematical literature, but its content is certainly not new. In the following subsections we review several ways of proving this result.
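Since the coordinates α_{x,y}(z), β_{x,y}(z) drive everything that follows, a quick sanity check in exact rational arithmetic may help (a minimal sketch with our own function names, including the convention for x = y):

```python
from fractions import Fraction

def alpha(x, y, z):
    """Normalized barycentric coordinate of z attached to x: alpha_{x,y}(z)."""
    if x == y:
        return Fraction(1)  # convention alpha = 1 when x = y
    return Fraction(y - z, y - x)

def beta(x, y, z):
    """Coordinate attached to y: beta_{x,y}(z) = 1 - alpha_{x,y}(z)."""
    return 1 - alpha(x, y, z)

# Whenever x <= z <= y, z is the convex combination alpha*x + beta*y, so a
# random variable equal to x with probability alpha and to y with probability
# beta has conditional mean z: this is the martingale property behind Theorem 1.
x, y, z = -2, 5, 1
a, b = alpha(x, y, z), beta(x, y, z)
print(a, b, a * x + b * y)  # → 4/7 3/7 1
```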
2.1. Proof of Theorem 1 via Choquet's and Douglas's theorems (for compactly supported measures ν). We prove here the result only in the case of measures μ ≺_cx ν concentrated on some compact interval K. Consider the following space of measures on R²:

S_K = { π ∈ M_+(K²) : ∀ a ∈ R, ∀ b ∈ C_b(R), ∫∫ (a + b(u)(v − u)) dπ(u, v) = a }.

Here M_+(K²) is the space of finite positive Borel measures on R² with support contained in K², and C_b(R) is the space of real-valued continuous and bounded functions on R. In other words, denoting by F the space of functions f_{a,b} : (u, v) ∈ R² ↦ a + b(u)(v − u), the above definition reads:

S_K = { π ∈ M_+(K²) : ∀ f_{a,b} ∈ F, ∫∫ f_{a,b} dπ = a }.

It turns out that S_K is a non-empty space of probability measures (taking b = 0 in the definition shows that every π ∈ S_K has total mass 1), known as martingale measures in the literature, since for (X_1, X_2) ∼ π ∈ S_K, the process (X_i)_{i=1,2} is a martingale in its natural filtration. Our task amounts to proving that every measure π ∈ S_K can be represented as a mixture of 'triplet' measures of the form δ_z ⊗ [α δ_x + β δ_y], where z = αx + βy and α, β are nonnegative and satisfy α + β = 1.

We admit without proof that S_K is convex and compact for the weak topology. Choquet's theorem (see [3] or e.g. [15]) then implies that every measure in S_K can be represented as a mixture of extremal measures of S_K. So we shall be done as soon as we can prove that the extremal measures of S_K are triplet measures. Let η be such an extremal measure and μ, ν its marginals. We aim at proving that μ is supported on a single point and that ν is supported on at most two points. Striving for a contradiction, suppose first that μ is not a Dirac measure. Then there exists a Borel set E such that μ(E) ∉ {0, 1}. We consider the L¹(η) distance between f : (u, v) ↦ 1_E(u) and the functions f_{a,b} ∈ F.
Letting (U, V) be distributed according to η, we find

‖f − f_{a,b}‖_{L¹(η)} ≥ E| E( f(U, V) − f_{a,b}(U, V) | U ) |
                     = E| 1_E(U) − a |
                     = P(U ∈ E) |1 − a| + P(U ∉ E) |a|
                     ≥ min( P(U ∈ E), P(U ∉ E) ).

Note that this lower bound only depends on f, not on a and b. We have thus proved that F is not dense in L¹(η). According to Douglas's theorem [4] (see also [16, Chapter V]), this is in contradiction with the fact that η is extremal. Thus η is of the type δ_u ⊗ ν, where u is the barycenter of ν, i.e. ∫ v dν(v) = u. Next, again striving for a contradiction, suppose that there exists a partition (A_1, A_2, A_3) of R such that ν(A_i) > 0 holds for every i. By Douglas's theorem again, the set F of functions is dense in L¹(η). In particular, any function g_{c_1,c_2,c_3} : (u, v) ↦ Σ_{i=1}^{3} c_i 1_{A_i}(v) can be approximated in L¹(η) by functions f_{a,b}.
The linear map (c_1, c_2, c_3) ↦ ( ∫_{R×A_1} g_{c_1,c_2,c_3} dη, ∫_{R×A_2} g_{c_1,c_2,c_3} dη, ∫_{R×A_3} g_{c_1,c_2,c_3} dη ) is clearly onto. It follows, by the density of F, that the linear map

(a, b) ↦ ( ∫_{R×A_1} f_{a,b} dη, ∫_{R×A_2} f_{a,b} dη, ∫_{R×A_3} f_{a,b} dη ) ∈ R³

is onto as well, a contradiction (since η = δ_u ⊗ ν, the image of this map is contained in a subspace of R³ of dimension at most 2). Therefore, the extremal measures of S_K are of the type δ_z ⊗ [α δ_x + β δ_y] with z = αx + βy, as required.

2.2. Proof of Theorem 1 via Strassen's theorem (general case).
Another approach to proving Theorem 1 is to use Strassen's theorem instead of Choquet's and Douglas's theorems: there exists a kernel k : R → P(R) such that, μ-almost surely, k_u = k(u) has mean u, and μ · k = ν. (Such kernels are known as dilations or martingale kernels in the literature.) Hence the mixture, with weight μ, of the measures δ_u ⊗ k_u defines a probability measure π on R² whose marginals are μ and ν. To complete the proof, it remains to check that each measure k_u can be represented as a mixture of diatomic measures with mean u. This last fact is a classical step in the proof of Skorokhod's representation theorem (see e.g. [5, Theorem 8.1.1]): every probability measure on R with mean u can be represented as a mixture of diatomic measures with mean u. See [9, §5.1] for another approach.
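The classical step just quoted is easy to make effective for finitely supported measures. The following sketch (our own code, in exact rational arithmetic) decomposes such a measure with barycenter u into diatomic measures with mean u; any pairing of an atom below u with an atom above u works:

```python
from fractions import Fraction as F

def diatomic_mixture(atoms):
    """Write a finitely supported probability measure with barycenter u as a
    mixture of diatomic measures with mean u.  `atoms` maps points to masses;
    the result is a list of (weight, x, y): the component charges x with mass
    alpha_{x,y}(u) and y with mass beta_{x,y}(u), so that its mean is u."""
    atoms = {F(x): F(p) for x, p in atoms.items()}
    u = sum(x * p for x, p in atoms.items())          # barycenter
    below = sorted((x, p) for x, p in atoms.items() if x < u)
    above = sorted((y, q) for y, q in atoms.items() if y > u)
    out = [(atoms[u], u, u)] if u in atoms else []     # degenerate pair at u
    while below and above:
        (x, p), (y, q) = below[-1], above[0]           # one atom on each side
        a, b = (y - u) / (y - x), (u - x) / (y - x)    # alpha, beta at z = u
        s = min(p / a, q / b)                          # exhausts one of the two
        out.append((s, x, y))
        below[-1], above[0] = (x, p - s * a), (y, q - s * b)
        if below[-1][1] == 0: below.pop()
        if above[0][1] == 0: above.pop(0)
    return out

print(diatomic_mixture({0: F(1, 4), 1: F(1, 2), 3: F(1, 4)}))  # barycenter 5/4
```

Since each extracted component has mean u, the residual measure keeps barycenter u, which is why the two sides are exhausted simultaneously.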
Remark. The search for martingale kernels k : u ↦ k_u is a key question in the field of martingale optimal transport. The first completely canonical method seems to be the left-curtain coupling by Beiglböck and Juillet [1], which is also of particular interest to us. Not only is k_u canonical, but when μ is diffuse its kernels k_u are automatically diatomic (this also holds for the earlier coupling by Hobson and Neuberger [6], under more general assumptions). This is not the case if μ possesses atoms. However, a quantile version of the left-curtain coupling is described in a second paper by the same authors [2], where the martingale measure π directly appears as a mixture, over the set [0, 1] of quantile levels ω, of diatomic kernels

δ_{z_ω} ⊗ ( α_{x_ω,y_ω}(z_ω) δ_{x_ω} + β_{x_ω,y_ω}(z_ω) δ_{y_ω} ),

where z_ω is the ω-quantile of μ. Note that the same can be said of the recent coupling by Jourdain and Margheriti [8].

2.3. Algorithmic proof of Theorem 1 (for finitely supported μ, ν). We now describe an explicit algorithmic construction, inspired by [1, 2], leading to a diatomic decomposition in the case where both μ and ν are finitely supported probability measures. This algorithm is used to produce the simulation shown in Fig. 1.

Let us write μ = Σ_{i=1}^{p} μ(u_i) δ_{u_i} and ν = Σ_{j=1}^{q} ν(v_j) δ_{v_j}.

Initialization: μ* ← μ, ν* ← ν, θ* ← 1, T ← ∅
Loop:
Repeat the following steps while θ* > 0:

  Pick a triple (v_{j−}, u_i, v_{j+}) such that:
    a. ν*(v_{j−}) > 0, μ*(u_i) > 0, ν*(v_{j+}) > 0;
    b. v_{j−} ≤ u_i ≤ v_{j+};
    c. ν*((v_{j−}, v_{j+})) = 0.
  s ← min( μ*(u_i), ν*(v_{j−}) / α_{v_{j−},v_{j+}}(u_i), ν*(v_{j+}) / β_{v_{j−},v_{j+}}(u_i) )
  μ* ← μ* − s δ_{u_i}
  ν* ← ν* − s α_{v_{j−},v_{j+}}(u_i) δ_{v_{j−}} − s β_{v_{j−},v_{j+}}(u_i) δ_{v_{j+}}
  θ* ← θ* − s
  T ← T ∪ {((v_{j−}, u_i, v_{j+}), s)}

Result: return the set T.

The probability distribution of (V−, U, V+) is then deduced from T as

Σ_{((v−,u,v+),s) ∈ T} s δ_{(v−,u,v+)}.

The reason why the above algorithm stops lies in the fact that the transformation performed on μ* and ν* keeps the comparison μ* ≺_cx ν* valid throughout the execution of the algorithm (see the proof of Lemma 2.8 in [1]), with μ* and ν* having equal total mass. As a consequence, as long as μ* and ν* do not have zero total mass, the comparison μ* ≺_cx ν* ensures that a triple (v_{j−}, u_i, v_{j+}) satisfying conditions a.–b.–c. exists. Finally, since at each step at least one of the three numbers ν*(v_{j−}), μ*(u_i), ν*(v_{j+}) is set to zero, the total mass of both μ* and ν* must reach zero after a finite number of steps.
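For finitely supported inputs given by exact rationals, the loop can be transcribed directly. The sketch below (our own code) always picks u as the leftmost point of the support of μ* and (v−, v+) as the ν*-atoms adjacent to u, which enforces conditions a.–b.–c.; as in the text, we take for granted that this choice keeps μ* ≺_cx ν* valid, and make no attempt to detect inputs violating μ ≺_cx ν:

```python
from fractions import Fraction as F

def diatomic_representation(mu, nu):
    """Run the loop of the algorithm on finitely supported probability
    measures mu <=_cx nu (dicts point -> mass).  Returns the set T as a
    list of ((v_minus, u, v_plus), s)."""
    mu = {F(x): F(p) for x, p in mu.items() if p}
    nu = {F(x): F(p) for x, p in nu.items() if p}
    T = []
    while mu:                                  # theta* > 0, i.e. mass remains
        u = min(mu)                            # leftmost atom of mu*
        vm = max(v for v in nu if v <= u)      # nearest nu*-atom on the left
        vp = min(v for v in nu if v >= u)      # nearest nu*-atom on the right
        if vm == vp:                           # u sits on an atom of nu*
            a, b, s = F(1), F(0), min(mu[u], nu[u])
        else:
            a, b = (vp - u) / (vp - vm), (u - vm) / (vp - vm)
            s = min(mu[u], nu[vm] / a, nu[vp] / b)
        T.append(((vm, u, vp), s))
        mu[u] -= s
        nu[vm] -= s * a
        if b:
            nu[vp] -= s * b
        for d in (mu, nu):                     # drop exhausted atoms
            for z in [z for z in d if d[z] == 0]:
                del d[z]
    return T

T = diatomic_representation({1: 1}, {0: F(1, 2), 2: F(1, 2)})
print(T)  # a single triple (0, 1, 2) carrying mass 1
```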
Remark. If, in the loop part of the algorithm, one systematically chooses the unique triple (v−, u, v+) such that u is the leftmost point in the support of μ* (such a choice is always possible, see the proof of Lemma 2.8 in [1]), the end result of the algorithm is the so-called left-curtain coupling.

3. Coupling construction
We now describe the coupling construction leading to our proof of (1). Consider a triple (N−, M, N+) as in Theorem 1, with μ = Law(M) and ν = Law(N), and an i.i.d. sequence X = (X_i)_{i ≥ 1} independent from (N−, M, N+). For every integer k ≥ 0, we let S_k = Σ_{i=1}^{k} X_i, with the convention that S_0 = 0. Finally, we let F = σ(M, N−, N+, S_{N−}, S_{N+}) and G = σ(M, N−, N+, S_M, S_{N−}, S_{N+}). Note that G = F ∨ σ(S_M). We start our construction by setting:

(6)    A = S_M.
Next, we specify B by the requirement that, conditional upon G, the distribution of B is:

(7)    α_{S_{N−},S_{N+}}(A) · δ_{S_{N−}} + β_{S_{N−},S_{N+}}(A) · δ_{S_{N+}}.

Note that α_{S_{N−},S_{N+}}(A) and β_{S_{N−},S_{N+}}(A) do indeed lie in the interval [0, 1], thanks to the assumption that the random variables X_i are nonnegative, so that S_{N−} ≤ A = S_M ≤ S_{N+}, since N− ≤ M ≤ N+.

We now proceed to checking that all three properties listed in (2) are satisfied by the random variables A and B. From (6), it is immediate that A has the required distribution. Moreover, from the definition of α and β, one has the identity

α_{S_{N−},S_{N+}}(A) · S_{N−} + β_{S_{N−},S_{N+}}(A) · S_{N+} = A,

which rewrites as:

E[B | G] = A a.s.,

whence, taking the conditional expectation E(· | σ(A)) on both sides, and using the fact that σ(A) ⊂ G since A = S_M is G-measurable,

E[B | σ(A)] = A a.s.,

as required by (2).

To conclude the proof, it remains to check that B =_d S_N. By construction, the conditional distribution of B given F can be written as:

E[ α_{S_{N−},S_{N+}}(A) | F ] · δ_{S_{N−}} + E[ β_{S_{N−},S_{N+}}(A) | F ] · δ_{S_{N+}}.

Now observe that, by symmetry, given integers n− ≤ m ≤ n+ such that n− < n+, we have:

E[ S_m − S_{n−} | σ(S_{n−}, S_{n+}) ] = ((m − n−)/(n+ − n−)) · (S_{n+} − S_{n−}) a.s.,

from which we deduce that

E[ α_{S_{n−},S_{n+}}(S_m) | σ(S_{n−}, S_{n+}) ] = α_{n−,n+}(m) a.s.

and

E[ β_{S_{n−},S_{n+}}(S_m) | σ(S_{n−}, S_{n+}) ] = β_{n−,n+}(m) a.s.

As a consequence, the conditional distribution of B given F is none other than:

α_{N−,N+}(M) · δ_{S_{N−}} + β_{N−,N+}(M) · δ_{S_{N+}}.
From the fact that the sequence X = (X_i)_{i ≥ 1} is independent from (N−, M, N+), and from condition (5), we deduce that B =_d S_N, which concludes the proof. (In the case where n− = m = n+, the identities above are still, obviously, valid.)
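The construction can be checked exactly on a toy example with finite supports (our own choice of laws): take the deterministic diatomic representation (N−, M, N+) = (0, 1, 2) of M ≡ 1 ≺_cx N ∼ (δ_0 + δ_2)/2, and X_i i.i.d. uniform on {1, 3}. Enumerating the pairs (X_1, X_2) and applying (7) recovers both requirements of (2):

```python
from fractions import Fraction as F
from itertools import product

# Deterministic diatomic representation of M = 1 <=_cx N ~ (delta_0 + delta_2)/2:
# (N-, M, N+) = (0, 1, 2).  The X_i are i.i.d. uniform on {1, 3}.
law_B, law_SN, cond = {}, {}, {}
for x1, x2 in product((1, 3), repeat=2):
    w = F(1, 4)                        # probability of the pair (x1, x2)
    s0, s1, s2 = 0, x1, x1 + x2        # partial sums S_0, S_1 = A, S_2
    al = F(s2 - s1, s2 - s0)           # alpha_{S_0,S_2}(A)
    be = F(s1 - s0, s2 - s0)           # beta_{S_0,S_2}(A)
    # conditional law (7) of B given G: S_0 w.p. alpha, S_2 w.p. beta
    law_B[s0] = law_B.get(s0, 0) + w * al
    law_B[s2] = law_B.get(s2, 0) + w * be
    # law of S_N: N = 0 or N = 2 with probability 1/2, independently of X
    law_SN[0] = law_SN.get(0, 0) + w * F(1, 2)
    law_SN[s2] = law_SN.get(s2, 0) + w * F(1, 2)
    e, p = cond.get(s1, (0, 0))        # accumulate E[B; A = a] and P(A = a)
    cond[s1] = (e + w * (al * s0 + be * s2), p + w)

print(law_B == law_SN)                                # B =_d S_N: True
print(all(e / p == a for a, (e, p) in cond.items()))  # E[B | A] = A: True
```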
4. Final remarks and extensions

4.1. Continuous time.
We note that the coupling construction described in Section 3 can be extended to continuous time. For instance, let N = (N_t)_{t ≥ 0} be a standard Poisson process and S ≺_cx T nonnegative integrable random variables independent from N. Then one has N_S ≺_cx N_T, and it is straightforward to extend our approach to define a pair of random variables (A, B) such that

(8)    A =_d N_S,    B =_d N_T,    and    A = E(B | σ(A)) a.s.

The same approach still works in exactly the same way if we consider an integrable subordinator instead of a Poisson process.
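As a numerical illustration of N_S ≺_cx N_T (with our own choice S ≡ 1 and T ∼ (δ_0 + δ_2)/2), one can compare the stop-loss transforms k ↦ E(N_S − k)_+ and k ↦ E(N_T − k)_+, which are ordered pointwise under the convex order; the sketch below truncates the Poisson tails:

```python
from math import exp, factorial

def pois(lam, n):
    """Poisson(lam) probability mass at n."""
    return exp(-lam) * lam ** n / factorial(n)

def stop_loss(pmf, k, nmax=80):
    """E[(Z - k)+] for an N-valued Z, with the tail truncated at nmax."""
    return sum((n - k) * pmf(n) for n in range(k + 1, nmax))

# S = 1 a.s. and T ~ (delta_0 + delta_2)/2, so that E S = E T and S <=_cx T.
n_s = lambda n: pois(1.0, n)                          # N_S ~ Poisson(1)
n_t = lambda n: 0.5 * (n == 0) + 0.5 * pois(2.0, n)   # N_T: mixed Poisson

print(stop_loss(n_s, 0), stop_loss(n_t, 0))           # equal means (both = 1)
print(min(stop_loss(n_t, k) - stop_loss(n_s, k) for k in range(1, 15)) > 0)
```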
4.2. Exchangeable random variables. If the sequence of random variables (X_i)_{i ≥ 1} is assumed to be exchangeable instead of i.i.d., the coupling construction described in Section 3 works in exactly the same way. (The classical proof found in [17, 13] also works in this case.) Note that, in the case of an infinite exchangeable sequence of random variables, one can directly deduce (1) from the i.i.d. case, using the de Finetti representation of such a sequence as a mixture of (distributions of) i.i.d. sequences, and the characterization of (1) through the inequality

(9)    E f(X_1 + ⋯ + X_M) ≤ E f(X_1 + ⋯ + X_N)

for all convex functions f. On the other hand, if M and N are assumed to have finite support, say {0, 1, ..., q}, and one considers a finite exchangeable sequence of random variables X_1, ..., X_q, the extension of de Finetti's theorem to this case (see [12, 7]) leads in general to a signed mixture of i.i.d. sequences, so one cannot integrate the inequality (9) with respect to the mixing measure in order to directly deduce (1).

4.3. Increasing convex ordering.
Assume that a comparison between M and N holds with respect to the increasing convex ordering: M ≺_icx N. We then have the following modified version of (1):

(10)    X_1 + ⋯ + X_M ≺_icx X_1 + ⋯ + X_N.

To deduce (10) from (1), we note that the comparison M ≺_icx N implies that there exists an integer-valued random variable N′ such that M ≺_st N′ and N′ ≺_cx N, where ≺_st denotes the usual stochastic ordering. (The existence of a random variable N′ such that M ≺_st N′ and N′ ≺_cx N is a classical and easily proved decomposition result for the increasing convex order. That N′ can, in addition, be chosen to be integer-valued is less standard. One possible proof of this fact is that, when μ ≺_icx ν, there exists a kernel k : R → P(R) such that, μ-almost surely, k_u = k(u) has mean ≥ u, and μ · k = ν. In turn, k_u can be written as a mixture k_u = α_u k¹_u + (1 − α_u) k²_u, where, μ-almost surely, α_u ∈ [0, 1], k¹_u has mean u, and the support of k²_u is contained in [u, +∞[. For k′(u) = α_u δ_u + (1 − α_u) k²_u, we have μ ≺_st (μ · k′) ≺_cx ν, and a coupling of the corresponding random variables can also be easily deduced. Alternatively, for another proof, one can consider Kellerer's kernels defined in [11, §2.1] for connecting μ ≺_icx ν.) Given an i.i.d. sequence X = (X_i)_{i ≥ 1} independent from M, N′, N, the fact that the X_i's are nonnegative random variables, combined with M ≺_st N′, yields the comparison X_1 + ⋯ + X_M ≺_st X_1 + ⋯ + X_{N′}. Then, using (1), we deduce that X_1 + ⋯ + X_{N′} ≺_cx X_1 + ⋯ + X_N, so that (10) is proved.

Now consider two sequences X = (X_i)_{i ≥ 1} and Y = (Y_i)_{i ≥ 1} of i.i.d. nonnegative random variables, and, in addition to M ≺_icx N, assume that X_i ≺_icx Y_i.
We then have the following extension of (10):

(11)    X_1 + ⋯ + X_M ≺_icx Y_1 + ⋯ + Y_N,

which is a straightforward consequence of (10) and of the fact that the increasing convex ordering is preserved by convolution.

4.4. Counterexample to (1) when the X_i are not nonnegative random variables. We let M ≡ 1 and N ∼ (δ_0 + δ_2)/2, so that the condition M ≺_cx N is satisfied. Then let X_1, X_2 be i.i.d. random variables, independent from N (and M), with X_i ∼ (δ_{−1} + δ_1)/2. We find that X_1 + ⋯ + X_M = X_1 ∼ (δ_{−1} + δ_1)/2, while X_1 + ⋯ + X_N ∼ (δ_{−2} + 6δ_0 + δ_2)/8. Hence E(|X_1 + ⋯ + X_N|) = 1/2 < E(|X_1 + ⋯ + X_M|) = 1, so that, since x ↦ |x| is convex, X_1 + ⋯ + X_M ≺_cx X_1 + ⋯ + X_N cannot hold.

References

[1] Mathias Beiglböck and Nicolas Juillet. On a problem of optimal transport under marginal martingale constraints.
Ann. Probab., 44(1):42–106, 2016.
[2] Mathias Beiglböck and Nicolas Juillet. Shadow couplings. ArXiv e-prints, arXiv:1609.03340, September 2016.
[3] Gustave Choquet. Existence des représentations intégrales au moyen des points extrémaux dans les cônes convexes. C. R. Acad. Sci. Paris, 243:699–702, 1956.
[4] R. G. Douglas. On extremal measures and subspace density. Michigan Math. J., 11:243–246, 1964.
[5] Rick Durrett. Probability—theory and examples, volume 49 of Cambridge Series in Statistical and Probabilistic Mathematics. Cambridge University Press, Cambridge, fifth edition, 2019.
[6] D. Hobson and A. Neuberger. Robust bounds for forward start options. Math. Finance, 22(1):31–56, 2012.
[7] Svante Janson, Takis Konstantopoulos, and Linglong Yuan. On a representation theorem for finitely exchangeable random vectors. J. Math. Anal. Appl., 442(2):703–714, 2016.
[8] Benjamin Jourdain and William Margheriti. A new family of one dimensional martingale couplings. ArXiv e-prints, arXiv:1808.01390, August 2018.
[9] Nicolas Juillet. Peacocks parametrised by a partially ordered set. In Séminaire de Probabilités XLVIII, volume 2168 of Lecture Notes in Math., pages 13–32. Springer, Cham, 2016.
[10] Rob Kaas, Marc Goovaerts, Jan Dhaene, and Michel Denuit. Modern actuarial risk theory: using R, volume 128. Springer Science & Business Media, 2008.
[11] Hans G. Kellerer. Markov-Komposition und eine Anwendung auf Martingale. Math. Ann., 198:99–122, 1972.
[12] G. Jay Kerns and Gábor J. Székely. De Finetti's theorem for abstract finite exchangeable sequences. J. Theoret. Probab., 19(3):589–608, 2006.
[13] Alfred Müller and Dietrich Stoyan. Comparison methods for stochastic models and risks. Wiley Series in Probability and Statistics. John Wiley & Sons, Ltd., Chichester, 2002.
[Figure 1: simulation produced by the algorithm of Section 2.3; graphical content not recoverable from the source.]
l ll ll l ll ll llll l ll l lll l llll l ll ll lll llll ll ll lll l ll lll ll l ll lll ll ll llll l l lll l l lll ll lll l ll ll ll lll ll lll lll ll l l lll ll ll ll llllll lll lllll lll lll l ll lllll l ll ll ll l l ll l l ll lll ll l ll lll lll ll l ll ll ll lll ll lll ll lll ll lll lll l l l lll ll l l ll llll l lll ll l lll llll l ll l ll ll l ll l ll ll llll ll l ll lll l lllll l ll lll lll ll l ll lllll ll ll ll ll lll l lll l llll lll ll l llll lll l lll l ll lll lll l l lll ll llll ll lll lll l ll l lll ll l lll lll ll l ll lll ll ll lll ll ll ll lll ll l l ll lll l ll ll lll llll l l lll l l ll lll lll l lll l l llll l llll ll ll l l ll l ll ll ll lll lll ll llll lll l lll l llll ll llll ll l l lll lll lll ll l l ll lll l ll lll lll lll ll ll ll l ll l lll llllll l lll ll l l ll llll l ll l ll l ll lll ll l lll l ll l llll ll lllll l ll l l lll l lll ll l lll ll l l lll S M S N Figure 1.
The figure shows simulated pairs (A, B), with µ = Σ_{n=2}^{14} δ_n and ν = δ + δ + Σ_{n=4}^{16} δ_n, Law(X_i) = E(1), using the algorithm of Section 2.3 to produce the diatomic decomposition.

[14] Marcel Nutz and Florian Stebegg. Canonical supermartingale couplings. Ann. Probab., 46(6):3351–3398, 2018.
[15] Robert R. Phelps.
Lectures on Choquet's theorem, volume 1757 of Lecture Notes in Mathematics. Springer-Verlag, Berlin, second edition, 2001.
[16] D. Revuz and M. Yor. Continuous martingales and Brownian motion, volume 293 of Grundlehren der Mathematischen Wissenschaften [Fundamental Principles of Mathematical Sciences]. Springer-Verlag, Berlin, third edition, 1999.
[17] Moshe Shaked and J. George Shanthikumar. Stochastic orders. Springer Series in Statistics. Springer, New York, 2007.
[18] V. Strassen. The existence of probability measures with given marginals.