Secret key agreement on wiretap channels with transmitter side information
Ashish Khisti

ABSTRACT
Secret-key agreement protocols over wiretap channels controlled by a state parameter are studied. The entire state sequence is known (non-causally) to the sender but not to the receiver and the eavesdropper. Upper and lower bounds on the secret-key capacity are established both with and without public discussion. The proposed coding scheme involves constructing a codebook to create a common reconstruction of the state sequence at the sender and the receiver, and another secret-key codebook constructed by random binning. For the special case of Gaussian channels with no public discussion (the secret-key generation with dirty paper problem), the gap between our bounds is at most 1/2 bit, and the bounds coincide in the high signal-to-noise-ratio and high interference-to-noise-ratio regimes. In the presence of public discussion our bounds coincide, yielding the capacity, when the channels of the receiver and the eavesdropper satisfy an independent noise condition.
1. INTRODUCTION
Many applications in cryptography require that the legitimate terminals have shared secret keys, not available to unauthorized parties. Information-theoretic security encompasses the study of source and channel coding techniques to generate secret keys between legitimate terminals. In the channel coding literature, an early work in this area is the wiretap channel model [1]. It consists of three terminals: one sender, one receiver and one eavesdropper. The sender communicates to the receiver and the eavesdropper over a discrete memoryless broadcast channel. A notion of equivocation rate, the normalized conditional entropy of the transmitted message given the observation at the eavesdropper, is introduced, and the tradeoff between information rate and equivocation rate is studied. The perfect secrecy capacity, defined as the maximum information rate under the constraint that the equivocation rate approaches the information rate asymptotically in the block length, is of particular interest. Information transmitted at this rate can naturally be used as a shared secret key between the sender and the receiver. In the source coding setup [2, 3], the two terminals observe correlated source sequences and use a public discussion channel for communication. Any information sent over this channel is available to an eavesdropper. The terminals generate a common secret key that is concealed from the eavesdropper in the same (equivocation) sense as in the wiretap channel.

In the present paper we consider a secret-key agreement problem in which the sender and the receiver communicate over a channel controlled by a state parameter. The state parameter is known to the sender but not to the receiver or the eavesdropper. The problem of transmitting information on such channels, without secrecy constraints, is studied in [4]. A random binning strategy is proposed and shown to achieve the capacity.
Costa [5] studies the problem of communicating over an additive noise Gaussian channel with an additive interference sequence known to the transmitter, and establishes that there is no loss in capacity when the interference sequence is known only to the transmitter. In the present work, we study the problem of generating a common secret key between the sender and the receiver over such channels. Our proposed coding scheme is not based on the Gel'fand-Pinsker binning technique for sending an information message over such channels. Instead, our codebook is designed to create a common reconstruction sequence at the sender and the receiver, and a secret key is then distilled from this common sequence.

In related works, the problem of secret-message transmission over wiretap channels controlled by a state parameter is studied in [6, 7]. In these works an achievable coding scheme is proposed that combines Gel'fand-Pinsker coding and coding for the wiretap channel. As discussed earlier, our coding scheme is based on a different approach and in general yields higher achievable rates. A related problem of common reconstruction of state sequences has been studied recently in [8, 9]. The problem of secret-key agreement with symmetric channel state information at the sender and the legitimate receiver has been studied in [10]; however, the coding scheme involved relies on the terminals having knowledge of the common state sequence to begin with. After this paper was submitted, we learnt about a recent work [11] where the problem of communicating over channels with non-causal CSI is used as a building block for characterizing the tradeoff between secret-key and secret-message transmission.
2. PROBLEM SETUP
As Fig. 1 illustrates, the channel model has three terminals: a sender, a receiver and an eavesdropper. The sender communicates with the other two terminals over a discrete memoryless channel with transition probability p_{y_r, y_e | x, s}(·), where x denotes the channel input symbol, and y_r and y_e denote the channel output symbols at the receiver and the eavesdropper, respectively. The symbol s denotes a state variable that controls the channel transition probability. We assume that it is sampled i.i.d. from a distribution p_s in each channel use. Further, the entire sequence s^n is known to the sender before the communication begins. In defining the secret-key capacity we separately consider the cases when a public discussion channel is and is not present.

Figure 1: Wiretap channel controlled by a state parameter. The channel transition probability p_{y_r, y_e | x, s}(·) is controlled by a state parameter s. The entire state sequence s^n is known to the sender but not to the receiver or the eavesdropper. The sender and receiver generate a secret key k at the end of the transmission.

When no public discussion is allowed, a length-n encoder is defined as follows. The sender samples a random variable u from the conditional distribution p_{u|s^n}(·|s^n). The encoding function produces a channel input sequence x^n = f_n(u, s^n) and transmits it over n uses of the channel. At time i the symbol x_i is transmitted, and the legitimate receiver and the eavesdropper observe output symbols y_{r,i} and y_{e,i} respectively, sampled from the conditional distribution p_{y_r, y_e | x, s}(·). The sender and receiver compute keys k = g_n(u, s^n) and l = h_n(y_r^n). A rate R is achievable if there exists a sequence of encoding functions such that, for some sequence ε_n that vanishes as n → ∞,

Pr(k ≠ l) ≤ ε_n,   (1/n) H(k) ≥ R − ε_n,   (1/n) I(k; y_e^n) ≤ ε_n.   (1)

The largest achievable rate is the secret-key capacity.

When a public discussion channel is present, the protocol follows closely the interactive communication protocol in [3]. The sender transmits symbols x_1, …, x_n at times 0 < i_1 < i_2 < … < i_n over the wiretap channel. At these times the receiver and the eavesdropper observe symbols y_{r,1}, …, y_{r,n} and y_{e,1}, …, y_{e,n}, respectively. At the remaining times, the sender and receiver exchange messages φ_t and ψ_t over the public channel, where 1 ≤ t ≤ k; for convenience we let i_{n+1} = k + 1. The eavesdropper observes both φ_t and ψ_t. More specifically, the sender and receiver sample random variables u and v from the conditional distributions p_{u|s^n}(·|s^n) and p_v(·); note that v is independent of (u, s^n).

• At times 0 < t < i_1, the sender generates φ_t = Φ_t(u, s^n, ψ^{t−1}) and the receiver generates ψ_t = Ψ_t(v, φ^{t−1}). These messages are exchanged over the public channel.
• At times i_j, 1 ≤ j ≤ n, the sender generates x_j = X_j(u, s^n, ψ^{i_j−1}) and sends it over the channel. The receiver and eavesdropper observe y_{r,j} and y_{e,j}, respectively. For these times we set ψ_{i_j} = φ_{i_j} = 0.
• For times i_j < t < i_{j+1}, where 1 ≤ j ≤ n, the sender and receiver compute φ_t = Φ_t(u, s^n, ψ^{t−1}) and ψ_t = Ψ_t(v, y_r^j, φ^{t−1}), respectively, and exchange them over the public channel.
• At time k + 1, the sender computes k = g_n(u, s^n, ψ^k) and the receiver computes l = h_n(v, y_r^n, φ^k).

We require that, for some sequence ε_n that vanishes as n → ∞,

Pr(k ≠ l) ≤ ε_n   and   (1/n) I(k; y_e^n, ψ^k, φ^k) ≤ ε_n.   (2)

The secret-key rate is (1/n) H(k), and the largest achievable secret-key rate is the capacity.
3. MAIN RESULTS
Our main results are upper and lower bounds on the secret-key capacity, which coincide in some special cases. We again consider the cases of no public discussion and public discussion separately.
We first provide an achievable rate (lower bound) on the secret-key capacity.
Theorem 1
An achievable secret-key rate without public discussion is

R^- = max_{p_u, p_{x|s,u}} I(u; y_r) − I(u; y_e),   (3)

where the maximization is over all auxiliary random variables u that satisfy the Markov condition u → (x, s) → (y_r, y_e) and furthermore satisfy the constraint

I(u; y_r) − I(u; s) ≥ 0.   (4)

The intuition behind the coding scheme is as follows. Upon observing s^n, the sender communicates the best possible reproduction u^n of the state sequence to the receiver. Now both the sender and the receiver observe a common sequence u^n. The set of all codewords u^n is binned into 2^{nR^-} bins, and the bin index is declared to be the secret key.

We note that the lower bound can easily be extended to the case of two-sided CSI. If the receiver observes another state sequence s_r, correlated with s according to a joint distribution p_{s,s_r}(·,·), then the achievable rate expression (3) holds provided that we augment the received symbol to (y_r, s_r). Finally, for the case of symmetric CSI, i.e., when s_r = s, the constraint (4) is redundant, since clearly I(u; y_r, s) − I(u; s) ≥ 0, and

R^- = max_{p_u, p_{x|s,u}} I(u; y_r, s) − I(u; y_e)

is indeed the secret-key capacity, as established in our earlier work [10].

Finally, we note that the problem of secret-key agreement is different from the secret-message transmission problem considered in [6, 12, 7]. This is because the secret key can be an arbitrary function of the state sequence (known only to the transmitter), whereas the secret message needs to be independent of the state sequence. For comparison, the best known lower bound for the secret-message transmission problem is stated below.

Proposition 1 [6, 12, 7] An achievable secret-message rate for the wiretap channel with non-causal transmitter CSI is

R ≤ max_{p_u, p_{x|u,s}} I(u; y_r) − max( I(u; s), I(u; y_e) ).
(5)

We note that whenever the maximizing u satisfies I(u; s) > I(u; y_e), the secret-key rate (3) is strictly larger than the secret-message rate (5). The following theorem develops an upper bound on the secret-key capacity that is amenable to numerical computation.

Theorem 2
The secret-key capacity in the absence of public discussion is upper bounded as C ≤ R^+, where

R^+ = min_{p̃_{y_r,y_e|x,s} ∈ P} max_{p_{x|s}} I(x, s; y_r | y_e),   (6)

and P denotes the collection of all joint distributions p̃_{y_r,y_e|x,s} that have the same marginal distributions as the original channel.

The intuition behind the upper bound is as follows. We create a degraded channel by revealing the output of the eavesdropper to the legitimate receiver. We further assume a channel with two inputs (x^n, s^n); i.e., the state sequence s^n is not arbitrary, but rather a part of the input codeword, with distribution p_s. The secrecy capacity of the resulting wiretap channel is then given by I(x, s; y_r | y_e).

Our proposed upper and lower bounds coincide, yielding the capacity, in some special cases. We present one such case in Section 3.3.

We next provide lower and upper bounds on the secret-key capacity with public discussion, beginning with a lower bound.
Theorem 3
An achievable secret-key rate with public discussion is

R^-_disc = max( max_{p_{x|s}} I(x, s; y_r) − I(y_e; y_r),  R^- ),   (7)

where R^- is the lower bound attained without public discussion in Theorem 1.

The achievability scheme involves a natural modification of Maurer's coding scheme [3, 2] to incorporate the presence of the state parameter, and involves a single round of discussion. In particular, the sender generates a sequence x^n according to the conditional distribution p_{x|s}(x|s) and transmits it over n channel uses. At the end of the transmission, the receiver sends the bin index of y_r^n, so that the sender can recover this sequence given (x^n, s^n). Next we provide an upper bound on the secret-key capacity under public discussion.

Theorem 4
An upper bound on the secret-key capacity with public discussion is

R^+_disc = max_{p_{x|s}} I(x, s; y_r | y_e).   (8)

We note that the upper bound expression (8) is similar to the upper bound expression in (6), except that in (8) we cannot minimize over the joint probability distribution. This is because the public discussion channel provides a mechanism for feedback, and hence the capacity does depend on the joint distribution (not just the marginal distributions). The proof of the upper bound in Theorem 4 is also significantly more elaborate, as it must account for the public discussion.

We note that if the channel additionally satisfies the Markov condition y_r → (x, s) → y_e, then the upper and lower bounds in Theorems 3 and 4 coincide. In particular, if p_{x|s} is the maximizing distribution in (8), we have that

R^-_disc ≥ I(x, s; y_r) − I(y_e; y_r)
= I(y_e, x, s; y_r) − I(y_e; y_r) − I(y_r; y_e | x, s)
= I(y_e, x, s; y_r) − I(y_e; y_r)
= I(x, s; y_r | y_e) = R^+_disc.

Since R^-_disc ≤ R^+_disc always holds, it follows that the two expressions must be equal. This is summarized in the result below.

Theorem 5
The secret-key capacity with public discussion for a DMC that satisfies the Markov condition y_r → (x, s) → y_e is given by

C_disc = max_{p_{x|s}} I(x, s; y_r | y_e).   (9)

We now study the Gaussian special case under an average power constraint. The channels to the legitimate receiver and the eavesdropper are expressed as

y_r = x + s + z_r,   y_e = x + s + z_e,   (10)

where z_r ∼ N(0, 1) and z_e ∼ N(0, 1 + Δ) denote the additive white Gaussian noise variables, assumed to be sampled independently of each other. The state parameter s ∼ N(0, Q) is also sampled i.i.d. at each time instant and is independent of both z_r and z_e. Furthermore, the channel input satisfies an average power constraint E[x²] ≤ P. As the title indicates, we call this setup secret-key generation with dirty paper.

Thus the parameter P denotes the signal-to-noise ratio, the parameter Q denotes the interference-to-noise ratio, and Δ denotes the degradation level of the eavesdropper. We now provide lower and upper bounds on the secret-key capacity with and without public discussion. For simplicity of exposition we limit our analysis to the case P ≥ 1.

Proposition 2
Assuming that P ≥ 1, a lower bound on the secret-key capacity is given by

R^- = (1/2) log( 1 + Δ(P + Q + 2ρ√(PQ)) / (P + Q + 1 + Δ + 2ρ√(PQ)) ),   (11)

where ρ ∈ [0, 1] is the largest value that satisfies

P(1 − ρ²) ≥ 1 − 1 / (P + Q + 1 + 2ρ√(PQ)).   (12)

Proposition 3
In the absence of public discussion, an upper bound on the secret-key capacity is given by

R^+ = (1/2) log( 1 + Δ(P + Q + 2√(PQ)) / (P + Q + 1 + Δ + 2√(PQ)) ).   (13)

It can be readily verified that the upper and lower bounds coincide in several asymptotic regimes.
Proposition 4
The upper and lower bounds on the secret-key capacity without public discussion satisfy the following for all P ≥ 1:

R^+ − R^- ≤ 1/2,   (14)

lim_{P→∞} (R^+ − R^-) = 0,   (15)

lim_{Q→∞} (R^+ − R^-) = 0.   (16)

Proposition 5
In the presence of public discussion, the secret-key capacity is given by the following expression:

C_disc = (1/2) log( 1 + (1 + Δ)(P + Q + 2√(PQ)) / (P + Q + 1 + Δ + 2√(PQ)) ).   (17)
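The closed-form expressions (11), (13) and (17) are straightforward to check numerically. The sketch below is our own illustration (function names are ours, rates are in bits, and it assumes the form of the feasibility condition (12) stated above, solved for ρ by bisection); it evaluates R^-, R^+ and C_disc and checks the half-bit gap of Proposition 4 on a grid with P ≥ 1.

```python
import math

def rho_star(P, Q):
    # Largest rho in [0, 1] satisfying the feasibility constraint (12).
    # rho = 0 is feasible whenever P >= 1, and the feasible set is an
    # interval [0, rho*], so bisection applies.
    def feasible(r):
        return P * (1 - r * r) >= 1 - 1 / (P + Q + 1 + 2 * r * math.sqrt(P * Q))
    lo, hi = 0.0, 1.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if feasible(mid) else (lo, mid)
    return lo

def R_lower(P, Q, D):
    # Lower bound (11) without public discussion, in bits.
    T = P + Q + 2 * rho_star(P, Q) * math.sqrt(P * Q)
    return 0.5 * math.log2(1 + D * T / (T + 1 + D))

def R_upper(P, Q, D):
    # Upper bound (13): the same expression evaluated at rho = 1.
    T = P + Q + 2 * math.sqrt(P * Q)
    return 0.5 * math.log2(1 + D * T / (T + 1 + D))

def C_disc(P, Q, D):
    # Secret-key capacity (17) with public discussion, in bits.
    T = P + Q + 2 * math.sqrt(P * Q)
    return 0.5 * math.log2(1 + (1 + D) * T / (T + 1 + D))

for P in (1.0, 4.0, 10.0, 100.0):
    for Q in (1.0, 10.0, 100.0):
        for D in (0.5, 10.0, 100.0):
            gap = R_upper(P, Q, D) - R_lower(P, Q, D)
            assert 0.0 <= gap <= 0.5                    # Proposition 4, eq. (14)
            assert R_upper(P, Q, D) < C_disc(P, Q, D)   # discussion can only help
# The gap vanishes at high SNR (Proposition 4).
assert R_upper(1e6, 10.0, 10.0) - R_lower(1e6, 10.0, 10.0) < 1e-2
print("all bound checks passed")
```

For P < 1 the feasibility region of (12), and hence the bisection above, would need to be re-examined; the sketch deliberately restricts itself to the regime P ≥ 1 assumed in the propositions.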
4. WITHOUT PUBLIC DISCUSSION
In this section we provide the coding scheme and the upper bound for the case when there is no public discussion.
4.1 Coding Scheme

A length-n code is described as follows.

4.1.1 Codebook Generation

• Generate a total of 2^{n(I(u; y_r) − ε_n)} sequences, each sampled i.i.d. from the distribution p_u(·).
• Select a rate R = I(u; y_r) − I(u; y_e) and randomly partition the set of sequences from the previous step into 2^{nR} bins, so that there are 2^{n(I(u; y_e) − ε_n)} sequences in each bin.

4.1.2 Encoding

• Given a state sequence s^n, the encoder selects a sequence u^n uniformly at random from the list of all codewords that are jointly typical with s^n.
• At times i = 1, 2, …, n the encoder transmits the symbol x_i, generated by sampling the distribution p_{x|u,s}(·|u_i, s_i).

4.1.3 Secret-key Generation

• The decoder, upon observing y_r^n, finds the sequence u^n jointly typical with y_r^n.
• Both the encoder and the decoder declare the bin index of u^n to be the secret key.

4.1.4 Secrecy Analysis

We need to show that, for the proposed encoder and decoder, the equivocation at the eavesdropper satisfies

(1/n) H(k | y_e^n) = I(u; y_r) − I(u; y_e) + o_n(1),   (18)

where o_n(1) is a term that goes to zero as n → ∞. Accordingly, note that

(1/n) H(k | y_e^n) = (1/n) H(k, u^n | y_e^n) − (1/n) H(u^n | y_e^n, k)
= (1/n) H(u^n | y_e^n) − (1/n) H(u^n | y_e^n, k)
= (1/n) H(u^n | y_e^n) − ε_n,

where the last step follows from the fact that there are at most 2^{n(I(u; y_e) − ε_n)} sequences in each bin, and hence the eavesdropper can decode the codeword u^n given the key k. It remains to lower bound the first conditional entropy term:

(1/n) H(u^n | y_e^n) = (1/n) H(u^n) + (1/n) H(y_e^n | u^n) − (1/n) H(y_e^n)   (19)
= (1/n) H(u^n) + (1/n) H(y_e^n | u^n, s^n) − (1/n) H(y_e^n) + (1/n) I(s^n; y_e^n | u^n).   (20)

We now appropriately bound each term in (20).
First note that since the sequence u^n is uniformly distributed among the set of all possible codeword sequences, it follows that

(1/n) H(u^n) = (1/n) log|C| = I(u; y_r) − ε_n.   (21)

Figure 2: Secret-key agreement codebook for the dirty-paper channel. The transmitted signal x^n is selected so that u^n = x^n + s^n is a sequence in the random codebook. The legitimate receiver can decode u^n (with high probability) and map it to the secret key. The eavesdropper's noise-uncertainty sphere includes all possible key values. Note that, unlike the traditional dirty-paper code, the transmitted signal x^n has a component along s^n: the achievable rate does depend on the interference power, and hence it is beneficial to amplify it using part of the transmit power. Also note that, unlike a dirty-paper code, we do not scale down s^n before quantizing, but use a = 1.

Next, given (u^n, s^n), as verified below, the channel to the eavesdropper is memoryless:

p(y_e^n | u^n, s^n) = Σ_{x^n ∈ X^n} p(y_e^n | u^n, s^n, x^n) p(x^n | u^n, s^n)
= Σ_{x^n ∈ X^n} Π_{i=1}^n p_{y_e|u,s,x}(y_{e,i} | u_i, s_i, x_i) p_{x|u,s}(x_i | u_i, s_i)
= Π_{i=1}^n Σ_{x_i ∈ X} p_{y_e|u,s,x}(y_{e,i} | u_i, s_i, x_i) p_{x|u,s}(x_i | u_i, s_i)
= Π_{i=1}^n p_{y_e|u,s}(y_{e,i} | u_i, s_i).

The second step above follows from the fact that the channel is memoryless and the symbol x_i at time i is generated as a function of (u_i, s_i). Hence we have that

(1/n) H(y_e^n | s^n, u^n) = (1/n) Σ_{i=1}^n H(y_{e,i} | s^n, u^n, y_e^{i−1})   (22)
= (1/n) Σ_{i=1}^n H(y_{e,i} | s_i, u_i).   (23)

Furthermore, note that

(1/n) H(y_e^n) ≤ (1/n) Σ_{i=1}^n H(y_{e,i}).   (24)

Finally, in order to lower bound the term I(s^n; y_e^n | u^n), we let J be a random variable that equals 1 if (s^n, u^n) are jointly typical and 0 otherwise. Note that Pr(J = 1) = 1 − o_n(1).
(1/n) I(s^n; y_e^n | u^n) = (1/n) H(s^n | u^n) − (1/n) H(s^n | u^n, y_e^n)
≥ (1/n) H(s^n | u^n, J = 1) Pr(J = 1) − (1/n) H(s^n | u^n, y_e^n)
≥ (1/n) H(s^n | u^n, J = 1) − (1/n) H(s^n | u^n, y_e^n) − o_n(1)
≥ H(s | u) − (1/n) H(s^n | u^n, y_e^n) − o_n(1)   (25)
≥ H(s | u) − (1/n) Σ_{i=1}^n H(s_i | u_i, y_{e,i}) − o_n(1),   (26)

where (25) follows from the fact that s^n is an i.i.d. sequence, and hence, conditioned on (s^n, u^n) being a jointly typical pair, there are 2^{n H(s|u) − n o_n(1)} possible sequences s^n. Substituting (21), (23), (24) and (26) into the lower bound (20), and using the fact that as n → ∞ each normalized summation
converges to its mean value, we obtain

(1/n) H(k | y_e^n) = I(u; y_r) + H(y_e | u, s) − H(y_e) + H(s | u) − H(s | u, y_e) − o_n(1)
= I(u; y_r) − I(y_e; u) − I(y_e; s | u) + I(y_e; s | u) − o_n(1)
= I(u; y_r) − I(y_e; u) − o_n(1),

as required.

Figure 3: Bounds on the capacity of the "secret-sharing with dirty paper" channel with and without public discussion. In the left figure, we plot the capacity as a function of SNR (dB) when Q = 10 and Δ = 10; the upper-most curve is the capacity with public discussion, whereas the other two curves denote the upper and lower bounds without discussion. In the right figure, we plot the capacity with public discussion as a function of Δ (in dB) when P = 10 dB and Q = 10 dB, as well as the upper and lower bounds without public discussion.

4.2 Upper Bound

A sequence of length-n codes satisfies

(1/n) H(k | y_r^n) ≤ ε_n,   (27)

(1/n) H(k | y_e^n) ≥ (1/n) H(k) − ε_n,   (28)

where (27) follows from Fano's inequality, since the receiver is able to recover the secret key k given y_r^n, and (28) is a consequence of the secrecy constraint. Furthermore, note that the Markov chain k → (x^n, s^n) → (y_r^n, y_e^n) holds, since the encoder generates the secret key k from (u, s^n). Thus we can bound the rate R = (1/n) H(k) as follows:

nR ≤ I(k; y_r^n | y_e^n) + n ε_n
≤ I(k, s^n, x^n; y_r^n | y_e^n) + n ε_n
= I(s^n, x^n; y_r^n | y_e^n) + n ε_n
= Σ_{i=1}^n I(s_i, x_i; y_{r,i} | y_{e,i}) + n ε_n
≤ n I(x, s; y_r | y_e) + n ε_n,

where the last step follows from the concavity of the conditional mutual information I(x, s; y_r | y_e) in the input distribution p_{x,s} (see, e.g., [13]). Finally, since the secret-key capacity depends only on the marginal distributions of the channel and not on the joint distribution, we can minimize the resulting bound over all joint distributions with the given marginals, which yields (6).
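The decode-and-bin key extraction of the coding scheme above can be illustrated with a toy Monte Carlo sketch in the Gaussian setting of Fig. 2. This is a simplified illustration under assumptions of our own choosing (a small i.i.d. Gaussian codebook for u = x + s with x drawn independently of s, nearest-neighbor decoding as a stand-in for joint-typicality decoding, and no state-covering step: the transmitted codeword is drawn uniformly rather than matched to s^n); it is not the construction used in the proof.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 200             # block length (toy)
P, Q = 10.0, 10.0   # signal and interference powers

N_codewords = 256   # codebook size; rate (log2 256)/200 is far below I(u; y_r)
N_bins = 16         # the key is the bin index of the decoded codeword

# Random codebook of u-sequences with the variance of u = x + s (here rho = 0)
U = rng.normal(0.0, np.sqrt(P + Q), size=(N_codewords, n))
bin_of = rng.integers(0, N_bins, size=N_codewords)   # random binning

errors = 0
trials = 200
for _ in range(trials):
    j = int(rng.integers(N_codewords))       # codeword selected by the encoder
    y_r = U[j] + rng.normal(0.0, 1.0, n)     # receiver observes u + z_r
    # Nearest-neighbor decoding as a surrogate for joint-typicality decoding
    j_hat = int(np.argmin(((U - y_r) ** 2).sum(axis=1)))
    if bin_of[j_hat] != bin_of[j]:
        errors += 1

print(f"key disagreement rate over {trials} trials: {errors / trials:.3f}")
```

At this block length and codebook rate the decoding margin is enormous, so sender and receiver agree on the bin index in every trial; shrinking n or growing the codebook toward 2^{n I(u; y_r)} makes the disagreement rate visible.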
5. WITH PUBLIC DISCUSSION
In this section we provide the proofs of the coding theorem and the converse for the case when a public discussion channel is allowed.
Our coding scheme is closely related to the coding theorem for the channel model in [3, 2] and emulates the generation of correlated source sequences. It consists of the following steps:

• Fix a distribution p_{x|s}. This induces a joint distribution p_{x,y_r,y_e,s}. Let R = I(y_r; x, s) − I(y_r; y_e) − ε_n.
• Partition the set of all typical sequences y_r^n into 2^{n(H(y_r|x,s) + o_n(1))} bins, so that each bin contains a collection of 2^{n I(y_r; x, s)} sequences; further partition each bin into 2^{nR} sub-bins, so that there are 2^{n(I(y_r; y_e) + ε_n)} sequences in each sub-bin.
• Given the symbol s_i at time i, sample a symbol x_i from the conditional distribution p_{x|s}(·|s_i) and transmit it over the channel.
• The receiver, upon observing y_r^n, transmits the bin index of this sequence over the public channel. Using the bin index and the knowledge of (x^n, s^n), the sender reproduces y_r^n.
• Both the sender and the receiver declare the sub-bin index of y_r^n as the secret key.

Following the secrecy analysis in [3, 2], it can be shown that this construction satisfies the secrecy constraint (2) and furthermore attains the rate

R = I(y_r; x, s) − I(y_r; y_e) − o_n(1).
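For the Gaussian model, the single-round rate I(y_r; x, s) − I(y_r; y_e) of this scheme can be evaluated directly from second moments. The sketch below is our own check (with the deterministic choice x = √(P/Q)·s, i.e., full correlation ρ = 1, which we take as the maximizing input); it confirms that the scheme rate equals the closed form (17), as guaranteed by Theorem 5 since the Gaussian noises z_r and z_e are independent.

```python
import math

def gauss_mi(var_a, var_b, cov_ab):
    # Mutual information (bits) between jointly Gaussian scalars.
    return 0.5 * math.log2(var_a * var_b / (var_a * var_b - cov_ab ** 2))

P, Q, D = 10.0, 10.0, 10.0
c = math.sqrt(P / Q)           # deterministic choice x = c*s  (rho = 1)
T = (1 + c) ** 2 * Q           # Var(x + s) = P + Q + 2*sqrt(P*Q)

var_yr, var_ye, cov = T + 1.0, T + 1.0 + D, T   # z_r and z_e independent

# I(y_r; x, s) = I(y_r; s) here, since x is a function of s
I_state = 0.5 * math.log2(var_yr / 1.0)          # Var(y_r | s) = Var(z_r) = 1
I_cross = gauss_mi(var_yr, var_ye, cov)
scheme_rate = I_state - I_cross

closed_form = 0.5 * math.log2(1 + (1 + D) * T / (T + 1 + D))   # eq. (17)
assert abs(scheme_rate - closed_form) < 1e-12
print(f"scheme rate = capacity = {closed_form:.4f} bits")
```

The agreement is an algebraic identity, not a coincidence: expanding I(y_r; s) − I(y_r; y_e) with these covariances reproduces (1/2) log(1 + (1+Δ)T/(T+1+Δ)) exactly.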
We now establish a corresponding upper bound on the secret-key capacity. First, using the fact that the receiver is able to recover the secret key, and that the eavesdropper is subjected to the secrecy constraint (2), we have that

(1/n) H(k | y_r^n, v, φ^k) ≤ ε_n,   (29)

(1/n) I(k; y_e^n, ψ^k, φ^k) ≤ ε_n.   (30)

Using the above relations and the fact that nR = H(k), we note that

nR = H(k) ≤ I(k; y_r^n, v, φ^k) − I(k; y_e^n, φ^k, ψ^k) + 2n ε_n
≤ I(k; y_r^n, v, y_e^n, φ^k, ψ^k) − I(k; y_e^n, φ^k, ψ^k) + 2n ε_n
= I(k; y_r^n, v | y_e^n, φ^k, ψ^k) + 2n ε_n
≤ I(u, s^n; y_r^n, v | y_e^n, φ^k, ψ^k) + 2n ε_n   (32)
= I(u, s^n; y_r^n, v, y_e^n, φ^k, ψ^k) − I(u, s^n; y_e^n, φ^k, ψ^k) + 2n ε_n
= I(u, s^n; v, φ^{i_1−1}, ψ^{i_1−1}) + I(u, s^n; y_r^n, y_e^n, φ_{i_1}^k, ψ_{i_1}^k | φ^{i_1−1}, ψ^{i_1−1}, v)
  − I(u, s^n; φ^{i_1−1}, ψ^{i_1−1}) − I(u, s^n; y_e^n, φ_{i_1}^k, ψ_{i_1}^k | ψ^{i_1−1}, φ^{i_1−1}) + 2n ε_n
= I(u, s^n; v | φ^{i_1−1}, ψ^{i_1−1}) + Σ_{j=1}^n (F_{r,j} + G_{r,j}) − Σ_{j=1}^n (F_{e,j} + G_{e,j}) + 2n ε_n,   (33)

where we have introduced

F_{r,j} = I(u, s^n; y_{r,j}, y_{e,j} | v, φ^{i_j−1}, ψ^{i_j−1}, y_r^{j−1}, y_e^{j−1}),   (34)
G_{r,j} = I(u, s^n; ψ_{i_j+1}^{i_{j+1}−1}, φ_{i_j+1}^{i_{j+1}−1} | v, φ^{i_j}, ψ^{i_j}, y_r^j, y_e^j),   (35)
F_{e,j} = I(u, s^n; y_{e,j} | φ^{i_j−1}, ψ^{i_j−1}, y_e^{j−1}),   (36)
G_{e,j} = I(u, s^n; φ_{i_j+1}^{i_{j+1}−1}, ψ_{i_j+1}^{i_{j+1}−1} | φ^{i_j}, ψ^{i_j}, y_e^j).   (37)

To complete the proof, it suffices to show that the following relations for the terms in (33) hold:

I(u, s^n; v | φ^{i_1−1}, ψ^{i_1−1}) = 0,   (38)
F_{r,j} − F_{e,j} ≤ I(x_j, s_j; y_{r,j} | y_{e,j}),   (39)
G_{r,j} − G_{e,j} ≤ 0.   (40)

To establish (38), note that for 1 ≤ t ≤ i_1 − 1 we have φ_t = Φ_t(u, s^n, ψ^{t−1}) and likewise ψ_t = Ψ_t(v, φ^{t−1}).
Using this, for each 1 < t ≤ i_1 − 1,

I(u, s^n; v | φ^t, ψ^t) ≤ I(u, s^n, φ_t; v, ψ_t | φ^{t−1}, ψ^{t−1}) = I(u, s^n; v | φ^{t−1}, ψ^{t−1}).

Continuing this process, we have that

I(u, s^n; v | φ^{i_1−1}, ψ^{i_1−1}) ≤ I(u, s^n; v) = 0,

where the last relation follows from the fact that v is independent of (u, s^n).

In order to establish (39), we use (34) and (36) to get

F_{r,j} − F_{e,j} = I(u, s^n; y_{r,j}, y_{e,j} | v, φ^{i_j−1}, ψ^{i_j−1}, y_r^{j−1}, y_e^{j−1}) − I(u, s^n; y_{e,j} | φ^{i_j−1}, ψ^{i_j−1}, y_e^{j−1})
= H(y_{r,j}, y_{e,j} | v, φ^{i_j−1}, ψ^{i_j−1}, y_r^{j−1}, y_e^{j−1}) − H(y_{r,j}, y_{e,j} | v, φ^{i_j−1}, ψ^{i_j−1}, y_r^{j−1}, y_e^{j−1}, u, s^n)
  − H(y_{e,j} | φ^{i_j−1}, ψ^{i_j−1}, y_e^{j−1}) + H(y_{e,j} | φ^{i_j−1}, ψ^{i_j−1}, y_e^{j−1}, u, s^n)
= H(y_{r,j}, y_{e,j} | v, φ^{i_j−1}, ψ^{i_j−1}, y_r^{j−1}, y_e^{j−1}) − H(y_{r,j}, y_{e,j} | v, φ^{i_j−1}, ψ^{i_j−1}, y_r^{j−1}, y_e^{j−1}, u, s^n, x_j)
  − H(y_{e,j} | φ^{i_j−1}, ψ^{i_j−1}, y_e^{j−1}) + H(y_{e,j} | φ^{i_j−1}, ψ^{i_j−1}, y_e^{j−1}, u, s^n, x_j)   (41)
≤ H(y_{r,j}, y_{e,j} | φ^{i_j−1}, ψ^{i_j−1}, y_e^{j−1}) − H(y_{r,j}, y_{e,j} | x_j, s_j) − H(y_{e,j} | φ^{i_j−1}, ψ^{i_j−1}, y_e^{j−1}) + H(y_{e,j} | x_j, s_j)   (42)
≤ H(y_{r,j} | y_{e,j}) − H(y_{r,j} | y_{e,j}, x_j, s_j) = I(s_j, x_j; y_{r,j} | y_{e,j}).   (43)

In the above steps, (41) follows from the fact that x_j = X_j(u, s^n, ψ^{i_j−1}), and hence we can add x_j to the conditioning in the second and fourth terms. Furthermore, since the channel is memoryless, the Markov chain (y_{r,j}, y_{e,j}) → (x_j, s_j) → (v, u, s^n, φ^{i_j−1}, ψ^{i_j−1}, y_r^{j−1}, y_e^{j−1}) holds, which gives (42). It remains to establish (40).
Using (35) and (37), we note that

G_{r,j} − G_{e,j} = I(u, s^n; ψ_{i_j+1}^{i_{j+1}−1}, φ_{i_j+1}^{i_{j+1}−1} | v, φ^{i_j}, ψ^{i_j}, y_r^j, y_e^j) − I(u, s^n; φ_{i_j+1}^{i_{j+1}−1}, ψ_{i_j+1}^{i_{j+1}−1} | φ^{i_j}, ψ^{i_j}, y_e^j)
= H(u, s^n | v, φ^{i_j}, ψ^{i_j}, y_r^j, y_e^j) − H(u, s^n | v, φ^{i_{j+1}−1}, ψ^{i_{j+1}−1}, y_r^j, y_e^j)
  − H(u, s^n | φ^{i_j}, ψ^{i_j}, y_e^j) + H(u, s^n | φ^{i_{j+1}−1}, ψ^{i_{j+1}−1}, y_e^j)
= I(u, s^n; v, y_r^j | φ^{i_{j+1}−1}, ψ^{i_{j+1}−1}, y_e^j) − I(u, s^n; v, y_r^j | φ^{i_j}, ψ^{i_j}, y_e^j).

Since φ_{i_{j+1}−1} = Φ_{i_{j+1}−1}(u, s^n, ψ^{i_{j+1}−2}) and ψ_{i_{j+1}−1} = Ψ_{i_{j+1}−1}(v, y_r^j, φ^{i_{j+1}−2}), we have that

I(u, s^n; v, y_r^j | φ^{i_{j+1}−1}, ψ^{i_{j+1}−1}, y_e^j) ≤ I(u, s^n, φ_{i_{j+1}−1}; v, y_r^j, ψ_{i_{j+1}−1} | φ^{i_{j+1}−2}, ψ^{i_{j+1}−2}, y_e^j)
= I(u, s^n; v, y_r^j | φ^{i_{j+1}−2}, ψ^{i_{j+1}−2}, y_e^j),

and continuing this process we have that

I(u, s^n; v, y_r^j | φ^{i_{j+1}−1}, ψ^{i_{j+1}−1}, y_e^j) ≤ I(u, s^n; v, y_r^j | φ^{i_j}, ψ^{i_j}, y_e^j),

as required.
6. GAUSSIAN CASE
In this section we develop the corresponding results for the Gaussian case.

6.1 Proof of Prop. 2
The lower bound expression follows from Theorem 1 by choosing x ∼ N(0, P) jointly Gaussian with s, with E[xs] = ρ√(PQ), and by choosing u = x + a·s. In this case,

R = I(u; y_r) − I(u; y_e) = h(u | y_e) − h(u | y_r).

Further evaluating each of the terms above with u = x + a·s, note that

h(u | y_e) = (1/2) log 2πe( P + a²Q + 2aρ√(PQ) − (P + aQ + (1+a)ρ√(PQ))² / (P + Q + 1 + Δ + 2ρ√(PQ)) )

and

h(u | y_r) = (1/2) log 2πe( P + a²Q + 2aρ√(PQ) − (P + aQ + (1+a)ρ√(PQ))² / (P + Q + 1 + 2ρ√(PQ)) ).

This yields

R = (1/2) log( 1 + Δ / (1 + PQ(a−1)²(1−ρ²) / (P + a²Q + 2ρa√(PQ))) ) + (1/2) log( (P + Q + 1 + 2ρ√(PQ)) / (P + Q + 1 + Δ + 2ρ√(PQ)) ).   (44)

Note that the first term in the expression above is maximized when a = 1. As we show below, this choice is indeed feasible when P ≥ 1. In particular, the constraint (4) requires that

h(u | s) ≥ h(u | y_r)
⟺ (1/2) log 2πe P(1 − ρ²) ≥ (1/2) log 2πe( (PQ(a−1)²(1−ρ²) + P + a²Q + 2ρa√(PQ)) / (P + Q + 1 + 2ρ√(PQ)) ).

Substituting a = 1 above, we have

P(1 − ρ²) ≥ (P + Q + 2ρ√(PQ)) / (P + Q + 1 + 2ρ√(PQ)) = 1 − 1 / (P + Q + 1 + 2ρ√(PQ)),

which is precisely the condition (12); in particular, it is satisfied at ρ = 0 whenever P ≥ 1.

6.2 Proof of Prop. 3

We evaluate the upper bound in Theorem 2 for the choice z_e = z_r + z_d, where z_d ∼ N(0, Δ) is independent of z_r:

I(s, x; y_r | y_e) = h(y_r | y_e) − h(y_r | y_e, x, s) = h(y_r | y_e) − h(z_r | z_e)
≤ (1/2) log 2πe( P + Q + 1 + 2√(PQ) − (P + Q + 1 + 2√(PQ))² / (P + Q + 1 + Δ + 2√(PQ)) ) − (1/2) log 2πe( 1 − 1/(1 + Δ) ),

where we have used the fact that the conditional entropy h(y_r | y_e) is maximized by a jointly Gaussian input distribution with ρ = 1. The above expression simplifies to (13).

6.3 Proof of Prop. 5

Since the Gaussian model satisfies the condition in Theorem 5, it suffices to evaluate C_disc = max_{p_{x|s}} I(x, s; y_r | y_e):

I(x, s; y_r | y_e) = h(y_r | y_e) − h(y_r | y_e, x, s)   (46)
= h(y_r | y_e) − h(z_r | z_e)   (47)
= (1/2) log 2πe( P + Q + 1 + 2√(PQ) − (P + Q + 2√(PQ))² / (P + Q + 1 + Δ + 2√(PQ)) ) − (1/2) log 2πe,   (48)

where in the last step we used the fact that z_r and z_e are independent, so that h(z_r | z_e) = h(z_r) = (1/2) log 2πe. Simplifying yields the desired expression (17).

REFERENCES

[1] A. D. Wyner, "The wiretap channel,"
Bell Syst. Tech. J., vol. 54, pp. 1355–1387, 1975.

[2] U. M. Maurer, "Secret key agreement by public discussion from common information," IEEE Trans. Inform. Theory, vol. 39, pp. 733–742, Mar. 1993.

[3] R. Ahlswede and I. Csiszár, "Common randomness in information theory and cryptography – Part I: Secret sharing," IEEE Trans. Inform. Theory, vol. 39, pp. 1121–1132, Jul. 1993.

[4] S. I. Gel'fand and M. S. Pinsker, "Coding for channels with random parameters," Problems of Control and Information Theory, vol. 9, pp. 19–31, 1980.

[5] M. H. Costa, "Writing on dirty paper," IEEE Trans. Inform. Theory, vol. 29, pp. 439–441, May 1983.

[6] C. Mitrpant, H. Vinck, and Y. Luo, "An achievable region for the Gaussian wiretap channel with side information," IEEE Trans. Inform. Theory, vol. 52, pp. 2181–2190, May 2006.

[7] W. Liu and B. Chen, "Wiretap channel with two-sided state information," in Proc. 41st Asilomar Conf. on Signals, Systems and Computers, Nov. 2007.

[8] Y. Steinberg, "Simultaneous transmission of data and state with common knowledge," in Proc. Int. Symp. Inform. Theory, Toronto, Canada, Jul. 2008, pp. 935–939.

[9] ——, "Coding and common reconstruction," IEEE Trans. Inform. Theory, vol. 55, pp. 4995–5010, Nov. 2009.

[10] A. Khisti, S. N. Diggavi, and G. W. Wornell, "Secret-key agreement using asymmetric channel state information," in Proc. Int. Symp. Inform. Theory, 2009.

[11] V. Prabhakaran, K. Eswaran, and K. Ramchandran, "Secrecy via sources and channels," submitted to IEEE Trans. Inform. Theory.

[12] Y. Chen and H. Vinck, "Wiretap channel with side information," in Proc. Int. Symp. Inform. Theory, Jun. 2006.

[13] A. Khisti, A. Tchamkerten, and G. W. Wornell, "Secure broadcasting over fading channels," IEEE Trans. Inform. Theory, vol. 54, Jun. 2008.