On the transitional behavior of quantum Gaussian memory channels
C. Lupo and S. Mancini
School of Science and Technology, University of Camerino, I-62032 Camerino, Italy, EU
INFN-Sezione di Perugia, I-06123 Perugia, Italy, EU
We address the question of optimality of entangled input states in quantum Gaussian memory channels. For a class of such channels, which can be traced back to the memoryless setting, we state a criterion which relates the optimality of entangled inputs to the symmetry properties of the channels' action. Several examples of channel models belonging to this class are discussed.
PACS numbers: 03.67.Hk, 03.67.Mn
I. INTRODUCTION
In communication theory, it is generally believed that memory effects improve the information transfer capabilities of a communication line [1]. Memory effects can be introduced both by the presence of correlations in the noise affecting different channel uses (inputs), and by interference among them.

In the quantum communication scenario the problem of determining the optimal ensemble of input states, depending on the memory, naturally arises. An optimal ensemble is the most robust under the action of the noisy channel, leading to the highest transmission rates. In particular, for the transmission of classical information through quantum memory channels, it would be useful to be able to discriminate between the optimality of separable and entangled input states.

By transitional behavior we mean the possibility of singling out two "phases" of the channel capacity: one in which the optimal input states are entangled among different channel uses, and another in which the optimal input states are separable. To model the memory effects it is customary to introduce a memory parameter, which quantifies the amount of correlations in the quantum channel. The transition between the two "phases" may happen at a finite value of the memory parameter, implying that separable input states are optimal in the presence of small correlations, or at zero value of the memory parameter, implying that separable states are optimal only in the memoryless limit. Several models showing such an effect have been proposed for discrete quantum memory channels [2] as well as for continuous ones [3]. However, the majority of these works restrict their analysis to a few channel uses (thus not relying on capacity arguments) and, above all, do not provide a general criterion to characterize the transitional behavior. Here we present necessary conditions for the transitional behavior of a class of quantum Gaussian memory channels.
We show that such behavior is intimately related to the symmetry properties of the channels' action. This is possible thanks to the introduction of a technique which allows us to unravel the memory effects.

The article is organized as follows. In Sec. II we review the basic tools for working with quantum Gaussian memory channels and with the problem of evaluating the channel capacity. In Sec. III we consider the problem of computing the Holevo function for a Gaussian memory channel, and introduce a class of Gaussian memory channels for which this problem can be traced back to the memoryless setting. In Sec. IV we enunciate a criterion for the transitional behavior and relate the optimality of entangled inputs to the symmetry properties of the channels' action. In Sec. V illuminating examples are presented. Finally, Sec. VI is devoted to concluding remarks.
II. GAUSSIAN STATES AND GAUSSIAN CHANNELS
Gaussian quantum states and Gaussian quantum channels are defined in the context of continuous-variable quantum systems. Here we consider the case of a continuous-variable quantum system consisting of $n$ identical quantum harmonic oscillators (for a complete presentation of the subject see, for instance, [4]). The $n$ quantum harmonic oscillators are associated with a set of $2n$ canonical operators $\hat q_1, \hat p_1, \hat q_2, \hat p_2, \dots, \hat q_n, \hat p_n$, satisfying the canonical commutation relations $[\hat q_h, \hat p_k] = i\delta_{hk}$ (here and in what follows we assume $\hbar = 1$).

The state of the $n$ modes can be described, using the density-operator formalism, by a density operator $\hat\rho_n$ defined on the $n$-mode Fock space. However, we find it more convenient to work in the Wigner function representation. Let us recall that the Wigner function associated with a density operator $\hat\rho_n$ is defined as
$$ W_n(\mathbf{q},\mathbf{p}) = \frac{1}{\pi^n} \int d^n y \, \langle \mathbf{q}-\mathbf{y} | \hat\rho_n | \mathbf{q}+\mathbf{y} \rangle \, e^{2i \mathbf{y}\cdot\mathbf{p}} , \qquad (1) $$
where we have introduced the numerical vectors $\mathbf{q} := (q_1,\dots,q_n)$, $\mathbf{p} := (p_1,\dots,p_n)$, $\mathbf{y} := (y_1,\dots,y_n) \in \mathbb{R}^n$, and we have denoted by
$$ |\mathbf{q} \pm \mathbf{y}\rangle := \bigotimes_{k=1}^n |q_k \pm y_k\rangle \qquad (2) $$
the joint (generalized) eigenstates of the 'position' operators $\{\hat q_k\}_{k=1,\dots,n}$, i.e., $\hat q_k |\mathbf{q} \pm \mathbf{y}\rangle = (q_k \pm y_k) |\mathbf{q} \pm \mathbf{y}\rangle$ for all $k = 1,\dots,n$.

By definition, Gaussian states are those described by a Gaussian Wigner function. In the $n$-mode scenario, the Wigner function of a Gaussian state is a multivariate Gaussian function:
$$ W_n(\mathbf{x}) = \frac{\exp\left[ -\tfrac{1}{2}\, (\mathbf{x}-\mathbf{m})^T V^{-1} (\mathbf{x}-\mathbf{m}) \right]}{(2\pi)^n \sqrt{\det V}} , \qquad (3) $$
where we have introduced the numerical vector $\mathbf{x} := (q_1, p_1, q_2, p_2, \dots, q_n, p_n)^T \in \mathbb{R}^{2n}$. The Wigner function of an $n$-mode Gaussian state is hence completely described by the vector of the $2n$ first moments
$$ \mathbf{m} = \langle \mathbf{x} \rangle = \int \mathbf{x} \, W_n(\mathbf{x}) \, d^{2n}x , \qquad (4) $$
and by the $2n \times 2n$ covariance matrix (CM)
$$ V = \langle (\mathbf{x}-\mathbf{m})(\mathbf{x}-\mathbf{m})^T \rangle = \int (\mathbf{x}-\mathbf{m})(\mathbf{x}-\mathbf{m})^T \, W_n(\mathbf{x}) \, d^{2n}x . \qquad (5) $$
Finally, let us notice that certain conditions have to be imposed on the CM to ensure that the Wigner function describes a bona fide quantum state. Indeed, the Heisenberg principle imposes the condition [5]
$$ V - \tfrac{i}{2}\,\Omega \ge 0 , \qquad (6) $$
where
$$ \Omega = \bigoplus_{k=1}^n \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix} \qquad (7) $$
is the $2n \times 2n$ matrix representing the $n$-mode symplectic form.

We will also consider an equivalent representation defined by a different ordering of the canonical variables, expressed by the numerical vector $\tilde{\mathbf{x}} = (q_1, q_2, \dots, q_n, p_1, p_2, \dots, p_n)^T$, with the corresponding vector of first moments $\tilde{\mathbf{m}}$ and the CM
$$ \tilde V = \langle (\tilde{\mathbf{x}}-\tilde{\mathbf{m}})(\tilde{\mathbf{x}}-\tilde{\mathbf{m}})^T \rangle . \qquad (8) $$
In this representation, the symplectic form is represented by the matrix
$$ \tilde\Omega = \begin{pmatrix} O & I_n \\ -I_n & O \end{pmatrix} , \qquad (9) $$
where $O$ denotes the null matrix, and $I_n$ the identity matrix of size $n$.

A Gaussian quantum channel acting on $n$ bosonic modes (in short, an $n$-mode Gaussian channel) is by definition a channel mapping Gaussian states into Gaussian states. As a consequence, its action on Gaussian states is completely characterized by the transformation rule of the vector of first moments and of the CM. One can show (see, e.g., [6]) that a Gaussian channel transforms the pair $(\mathbf{m}, V)$ (vector of first moments, CM) as follows:
$$ (\mathbf{m}, V) \longrightarrow (X \mathbf{m} + \mathbf{d},\; X V X^T + Y) , \qquad (10) $$
where $\mathbf{d} \in \mathbb{R}^{2n}$ is a displacement vector, and $X$, $Y$ are two $2n \times 2n$ matrices. In order to represent a bona fide quantum channel, these matrices have to obey the inequalities
$$ Y + \tfrac{i}{2}\left( X \Omega X^T - \Omega \right) \ge 0 , \qquad Y \ge 0 . \qquad (11) $$
In conclusion, a Gaussian channel is characterized by the triad $(\mathbf{d}, X, Y)$ satisfying Eq. (11). Given a pair of Gaussian channels, $\Phi$ with associated triad $(\mathbf{d}, X, Y)$ and $\Phi'$ with triad $(\mathbf{d}', X', Y')$, the composition of the two channels $\Phi' \circ \Phi$ is associated with the triad $(X'\mathbf{d} + \mathbf{d}',\; X'X,\; X'Y X'^T + Y')$.

In the family of Gaussian channels, a special sub-family is that of unitary Gaussian channels.
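The composition rule for triads can be checked numerically. The following is a minimal sketch (the attenuation-channel parameters and the function names are illustrative, not taken from the text):

```python
import numpy as np

def compose(ch2, ch1):
    """Compose two Gaussian channels given as triads (d, X, Y):
    applying ch1 first, then ch2, yields
    (X2 d1 + d2, X2 X1, X2 Y1 X2^T + Y2)."""
    d1, X1, Y1 = ch1
    d2, X2, Y2 = ch2
    return (X2 @ d1 + d2, X2 @ X1, X2 @ Y1 @ X2.T + Y2)

def apply(ch, m, V):
    """Action of a Gaussian channel on first moments and CM, Eq. (10)."""
    d, X, Y = ch
    return X @ m + d, X @ V @ X.T + Y

# Toy one-mode example: two attenuation channels with vacuum noise
# (assumed parameters eta1, eta2).
eta1, eta2 = 0.8, 0.5
I2 = np.eye(2)
ch1 = (np.zeros(2), np.sqrt(eta1) * I2, (1 - eta1) * 0.5 * I2)
ch2 = (np.zeros(2), np.sqrt(eta2) * I2, (1 - eta2) * 0.5 * I2)

m, V = np.zeros(2), 0.5 * I2                # vacuum input, CM = I/2
m12, V12 = apply(compose(ch2, ch1), m, V)   # composed triad
m_seq, V_seq = apply(ch2, *apply(ch1, m, V))  # sequential action
assert np.allclose(V12, V_seq)              # composition rule matches
```

Note that the vacuum is a fixed point of these attenuation channels, so both routes return the vacuum CM.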
Unitary Gaussian channels are characterized by the conditions
$$ Y = 0 , \qquad X \Omega X^T = \Omega . \qquad (12) $$
The last equation characterizes linear symplectic transformations: the matrix $X$ is symplectic. In conclusion, a Gaussian unitary transformation is characterized by a triad $(\mathbf{d}, X, Y)$ where $\mathbf{d}$ is generic, $X$ is symplectic, and $Y = 0$.

In the following we consider the subgroup of Gaussian unitary transformations preserving the total number of excitations; such maps are represented by matrices $X$ which are both symplectic and orthogonal. Using the representation (8), it is possible to show (see, e.g., [4] and the references therein) that those matrices are of the form
$$ \tilde X = \begin{pmatrix} A & B \\ -B & A \end{pmatrix} , \qquad (13) $$
where $A$, $B$ are real matrices of size $n$ such that the matrix $A + iB$ is unitary.

A. Gaussian memory channels, normal forms
For any integer $n$, $n$ uses of a Gaussian channel transform $n$ input bosonic modes into $n$ output bosonic modes. The action of a Gaussian quantum channel is hence described by a sequence of Gaussian channels $\Phi_n$, acting on $n$ bosonic modes, which is in turn associated with the sequence of triads $(\mathbf{d}_n, X_n, Y_n)$.

A very special case is that of memoryless channels, for which $\Phi_n = \Phi_1^{\otimes n}$ is the direct product of $n$ identical one-mode Gaussian channels. Each of these identical one-mode Gaussian channels is characterized by a triad $(\mathbf{d}_1, X_1, Y_1)$. Hence, a memoryless channel is characterized by a sequence $(\mathbf{d}_n, X_n, Y_n) = (\bigoplus_{k=1}^n \mathbf{d}_1, \bigoplus_{k=1}^n X_1, \bigoplus_{k=1}^n Y_1)$. Notice that $\mathbf{d}_1 = (d_q, d_p)^T$, and we have denoted $\bigoplus_{k=1}^n \mathbf{d}_1 := (d_q, d_p, d_q, d_p, \dots, d_q, d_p)^T \in \mathbb{R}^{2n}$. Also notice that $X_1$, $Y_1$ are $2 \times 2$ matrices, and the direct sums $\bigoplus_{k=1}^n X_1$, $\bigoplus_{k=1}^n Y_1$ are $2n \times 2n$ block-diagonal matrices.

Let us now consider the case of a quantum channel with memory. We call a channel with memory, or simply a memory channel, any channel which is not memoryless. Making no assumption on additional structures that might be present (e.g., causality, invariance under time translations), we can only say that the associated sequence of $n$-mode Gaussian channels satisfies $\Phi_n \neq \Phi_1^{\otimes n}$, i.e., the associated sequence of triads is $(\mathbf{d}_n, X_n, Y_n) \neq (\bigoplus_{k=1}^n \mathbf{d}_1, \bigoplus_{k=1}^n X_1, \bigoplus_{k=1}^n Y_1)$.

The problem of finding normal forms for $n$-mode Gaussian channels was considered in [7, 8]; the case of one-mode channels was considered in [9]. Normal forms are equivalence classes of $n$-mode Gaussian channels, up to (Gaussian) unitary equivalence. Hence, a pair of $n$-mode Gaussian channels $\Phi_n$, $\Phi'_n$ are equivalent if there are Gaussian unitary transformations $\mathcal{E}_n$, $\mathcal{D}_n$ such that $\Phi'_n = \mathcal{D}_n \circ \Phi_n \circ \mathcal{E}_n$ ($\circ$ denotes the composition of channels).
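The block-diagonal structure of a memoryless sequence of triads can be sketched as follows (one-mode parameters are assumed toy values; `memoryless_triad` is an illustrative helper, not from the text):

```python
import numpy as np

def memoryless_triad(d1, X1, Y1, n):
    """n uses of a memoryless one-mode Gaussian channel:
    (d_n, X_n, Y_n) = (sum d1, sum X1, sum Y1), block-diagonal in
    the (q1, p1, ..., qn, pn) ordering."""
    d_n = np.tile(d1, n)              # (d_q, d_p, ..., d_q, d_p)
    X_n = np.kron(np.eye(n), X1)      # n identical 2x2 blocks
    Y_n = np.kron(np.eye(n), Y1)
    return d_n, X_n, Y_n

# One-mode attenuation channel with vacuum noise (assumed parameters).
eta = 0.7
d1 = np.array([0.1, -0.2])
X1 = np.sqrt(eta) * np.eye(2)
Y1 = (1 - eta) * 0.5 * np.eye(2)

d3, X3, Y3 = memoryless_triad(d1, X1, Y1, n=3)
assert d3.shape == (6,) and X3.shape == (6, 6)
# A memory channel breaks this structure, e.g. through off-diagonal
# terms in Y_n coupling different channel uses.
```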
As the chosen notation suggests, we want to regard $\mathcal{E}_n$ and $\mathcal{D}_n$, respectively, as unitary encoding and decoding transformations, $\mathcal{E}_n$ anticipating and $\mathcal{D}_n$ following the action of the quantum channel. We will hence write $(\mathbf{d}_n, X_n, Y_n) \simeq (\mathbf{d}'_n, X'_n, Y'_n)$ if the two Gaussian quantum channels are equivalent in the sense declared above.

The first thing to be noticed is that $(\mathbf{d}_n, X_n, Y_n) \simeq (0, X_n, Y_n)$. This is a well-known result, a consequence of the fact that the displacement vector can always be eliminated by applying a proper $n$-mode displacement operator (which is a Gaussian unitary transformation) at the encoding, or decoding, stage. For this reason, in what follows we only consider $n$-mode Gaussian channels of the form $(0, X_n, Y_n)$. Let us consider a quantum channel $\Phi_n$ with triad $(0, X_n, Y_n)$, an encoding $\mathcal{E}_n$ with triad $(0, E_n, 0)$, and a decoding $\mathcal{D}_n$ with triad $(0, D_n, 0)$. The application of the encoding and decoding unitaries leads to the dressed channel $\Phi'_n$ associated with the triad $(0, D_n X_n E_n, D_n Y_n D_n^T)$.

As was shown in [7], an $n$-mode Gaussian channel is always unitarily equivalent to an $n$-mode channel in a normal form, i.e., $(0, X_n, Y_n) \simeq (0, X'_n, Y'_n)$, where
$$ X'_n = \bigoplus_{h=1}^{p} X_2^{(h)} \;\oplus\; \bigoplus_{k=2p+1}^{n} X_1^{(k)} \qquad (14) $$
is the direct sum of $p$ two-mode ($4 \times 4$) matrices $X_2^{(h)}$ and $n - 2p$ one-mode ($2 \times 2$) matrices $X_1^{(k)}$. In other words, by applying suitable encoding and decoding unitaries, the matrix $X_n$ is reduced to the direct sum of two-mode and one-mode terms. However, it is important to notice that the matrix $Y'_n$ cannot in general be jointly reduced to the same form.

B. Classical capacity of Gaussian channels
A quantum channel can be used to transmit classical information by encoding a classical stochastic continuous variable $Z$, distributed according to a probability density $p_Z$, into a set of quantum states $\hat\rho_Z$.

In the case of a memoryless quantum channel $\Phi_n = \Phi_1^{\otimes n}$, the maximum rate at which classical information can be reliably sent through the quantum channel is given by the regularized limit [10]
$$ C = \lim_{n\to\infty} \frac{1}{n}\, \chi\!\left(\Phi_1^{\otimes n}\right) , \qquad (15) $$
where the Holevo function $\chi$ evaluated on $n$ channel uses is
$$ \chi\!\left(\Phi_1^{\otimes n}\right) = \max_{\{\hat\rho_Z, p_Z\}} \left\{ S\!\left[\Phi_1^{\otimes n}\!\left(\int dZ\, p_Z\, \hat\rho_Z\right)\right] - \int dZ\, p_Z\, S\!\left[\Phi_1^{\otimes n}(\hat\rho_Z)\right] \right\} , \qquad (16) $$
where $S$ denotes the von Neumann entropy. If the von Neumann entropy is expressed in qubits, then the classical capacity of the quantum channel is measured in bits per channel use.

The computation of the memoryless channel capacity is based on an optimization over all input ensembles, including those made of states which are entangled among different channel uses. On the other hand, if the input states are restricted to ensembles of separable states, one obtains the so-called one-shot capacity
$$ C_1 = \chi(\Phi_1) . \qquad (17) $$
Clearly, the one-shot capacity is a lower bound on the memoryless channel capacity. If the two quantities coincide, the Holevo function is said to be additive. Additivity of the Holevo function dramatically simplifies the problem of evaluating the memoryless channel capacity. Even though the Holevo function has been shown to be additive for several relevant channels, e.g., the lossy channel in the framework of Gaussian channels [11], this property does not hold in general [12]. Non-additivity of the Holevo function implies that the optimal ensembles of input states, i.e.,
the most robust to noise, are entangled among different channel uses: a phenomenon which has no counterpart in the classical theory of information.

Moving to the case of quantum channels with memory, characterized by the inequality $\Phi_n \neq \Phi_1^{\otimes n}$, one could be tempted to generalize the formula in Eq. (15) and write
$$ C = \lim_{n\to\infty} \frac{1}{n}\, \chi(\Phi_n) , \qquad (18) $$
with
$$ \chi(\Phi_n) = \max_{\{\hat\rho_Z, p_Z\}} \left\{ S\!\left[\Phi_n\!\left(\int dZ\, p_Z\, \hat\rho_Z\right)\right] - \int dZ\, p_Z\, S\!\left[\Phi_n(\hat\rho_Z)\right] \right\} . \qquad (19) $$
Indeed, it is possible to show [13] that the right-hand side of Eq. (18) is in general only an upper bound for the classical capacity of the memory channel. On the other hand, it has been proven that this quantity coincides with the memory channel capacity for the class of so-called forgetful channels [14]. Those channels have the property that correlations among channel uses decay exponentially. Moreover, they exhibit a causal structure and are invariant under time translations.

The problem of computing the classical capacity is extremely hard. Indeed, if one cannot rely on the additivity property, one has to evaluate the regularized limit of the Holevo information, whose complexity increases exponentially in $n$. Moreover, for the case of bosonic channels, the relevant Hilbert space is infinite dimensional even for a single channel use, i.e., $n = 1$. Clearly, an infinite-dimensional Hilbert space could carry an infinite amount of classical bits. Hence, to avoid unphysical results, one is led to introduce a physically motivated constraint, and to exploit the concept of constrained capacity.
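For Gaussian states the entropies entering these capacity formulas are cheap to evaluate: the von Neumann entropy follows from the symplectic eigenvalues of the CM, via the formula recalled below in the text. A sketch, assuming the $(q_1, p_1, \dots, q_n, p_n)$ ordering and base-2 logarithms (function names are illustrative):

```python
import numpy as np

def symplectic_eigenvalues(V):
    """Symplectic eigenvalues of a 2n x 2n CM in the
    (q1, p1, ..., qn, pn) ordering: moduli of the purely imaginary
    eigenvalues of V @ Omega, each counted once."""
    n = V.shape[0] // 2
    omega1 = np.array([[0.0, 1.0], [-1.0, 0.0]])
    Omega = np.kron(np.eye(n), omega1)
    ev = np.linalg.eigvals(V @ Omega)
    return np.sort(np.abs(ev.imag))[n:]   # each nu appears as +/- i nu

def gaussian_entropy(V):
    """Von Neumann entropy (in bits) of a Gaussian state with CM V:
    S = sum_k g(nu_k - 1/2), with g(x) = (x+1)log2(x+1) - x log2 x."""
    def g(x):
        return (x + 1) * np.log2(x + 1) - (x * np.log2(x) if x > 1e-12 else 0.0)
    return sum(g(nu - 0.5) for nu in symplectic_eigenvalues(V))

# Vacuum (pure state): nu = 1/2, entropy 0.
assert abs(gaussian_entropy(0.5 * np.eye(2))) < 1e-9
# Thermal state with mean photon number N = 1: nu = 3/2, entropy g(1) = 2 bits.
assert abs(gaussian_entropy(1.5 * np.eye(2)) - 2.0) < 1e-9
```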
A typical choice in the framework of bosonic Gaussian channels is to impose a constraint on the maximal input energy per channel use; hence one introduces the constrained Holevo function
$$ \chi_N(\Phi_n) = \max_{\{\hat\rho_Z, p_Z\}} \left\{ S\!\left[\Phi_n\!\left(\int dZ\, p_Z\, \hat\rho_Z\right)\right] - \int dZ\, p_Z\, S\!\left[\Phi_n(\hat\rho_Z)\right] \;\middle|\; \frac{1}{n}\sum_{k=1}^n \mathrm{tr}\!\left(\frac{\hat q_k^2 + \hat p_k^2}{2} \int dZ\, p_Z\, \hat\rho_Z\right) \le N + \frac{1}{2} \right\} , \qquad (20) $$
where we have assumed unit frequency for the bosonic oscillators, and $N$ represents the maximum number of excitations per mode on average. One can conjecture that the maximum is reached in correspondence with a Gaussian ensemble. Indeed, the optimality of the Gaussian ensemble has been proven for the lossy channel [11] and conjectured for other families of bosonic Gaussian channels [15].

Here we estimate the Holevo function when restricted to Gaussian encodings [16]. We consider a Gaussian encoding defined as follows. For $n$ uses of the quantum channel, we fix a reference $n$-mode Gaussian state, with zero mean, described by the Wigner function
$$ W(\mathbf{x}) = \frac{\exp\left[-\tfrac{1}{2}\, \mathbf{x}^T V_{\rm in}^{-1} \mathbf{x}\right]}{(2\pi)^n \sqrt{\det V_{\rm in}}} . \qquad (21) $$
A classical variable $\mathbf{m} \in \mathbb{R}^{2n}$ is then encoded by applying a displacement operation to the reference state, thus obtaining
$$ W_{\mathbf{m}}(\mathbf{x}) = \frac{\exp\left[-\tfrac{1}{2}\, (\mathbf{x}-\mathbf{m})^T V_{\rm in}^{-1} (\mathbf{x}-\mathbf{m})\right]}{(2\pi)^n \sqrt{\det V_{\rm in}}} . \qquad (22) $$
We assume the stochastic variable $\mathbf{m}$ to be itself distributed according to a Gaussian probability density with zero mean:
$$ p_{\mathbf{m}} = \frac{\exp\left[-\tfrac{1}{2}\, \mathbf{m}^T V_c^{-1} \mathbf{m}\right]}{(2\pi)^n \sqrt{\det V_c}} . \qquad (23) $$
By linearity of the relation (1), the corresponding ensemble state
$$ \int d^{2n}m\; p_{\mathbf{m}}\, \hat\rho_{\mathbf{m}} \qquad (24) $$
is itself described by a Gaussian Wigner function, i.e.,
$$ \int d^{2n}m\; p_{\mathbf{m}}\, W_{\mathbf{m}}(\mathbf{x}) = \frac{\exp\left[-\tfrac{1}{2}\, \mathbf{x}^T (V_{\rm in}+V_c)^{-1} \mathbf{x}\right]}{(2\pi)^n \sqrt{\det (V_{\rm in}+V_c)}} . \qquad (25) $$
The restriction to Gaussian states, which are mapped into Gaussian states by Gaussian channels, dramatically simplifies the problem, since the complexity of Gaussian states is polynomial in the number of modes $n$.
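The fact that the modulated ensemble has CM $V_{\rm in} + V_c$, Eq. (25), can be checked by sampling phase-space points from the mixture. A sketch with toy one-mode CMs (the values are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian encoding: a zero-mean seed state with CM V_in is displaced
# by m ~ N(0, V_c).  The ensemble-average state is Gaussian with
# CM V_in + V_c.  (Toy 1-mode CMs, assumed values.)
V_in = 0.5 * np.eye(2)            # coherent-state seed
V_c = np.diag([2.0, 1.0])         # classical modulation

# Sample from the mixture: first draw the displacement m, then a
# phase-space point from the displaced seed state.
n_samples = 200_000
m = rng.multivariate_normal(np.zeros(2), V_c, size=n_samples)
x = m + rng.multivariate_normal(np.zeros(2), V_in, size=n_samples)

V_ens = np.cov(x, rowvar=False)
assert np.allclose(V_ens, V_in + V_c, atol=0.05)   # statistical tolerance
```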
Moreover, the von Neumann entropy can be calculated in terms of the symplectic eigenvalues of the CM.

The symplectic eigenvalues of an $n$-mode CM $V$ are defined as follows. Notice that the matrix $V\Omega$, where $\Omega$ is the symplectic form introduced in Eq. (7), has $2n$ purely imaginary eigenvalues $\{\pm i\nu_k\}_{k=1,\dots,n}$, where the $n$ real numbers $\{\nu_k\}_{k=1,\dots,n}$ are the symplectic eigenvalues of the CM $V$. From the uncertainty relations, expressed by Eq. (6), it follows that the symplectic eigenvalues satisfy the inequalities $\nu_k \ge 1/2$, which are saturated by pure Gaussian states [4]. The von Neumann entropy of an $n$-mode Gaussian state $\hat\rho$, characterized by a CM $V$, is given by the formula
$$ S[\hat\rho] = \sum_{k=1}^n g\!\left(\nu_k - \frac{1}{2}\right) , \qquad (26) $$
where we have introduced the function $g$ defined by
$$ g(x) = (x+1)\log(x+1) - x\log x . \qquad (27) $$
Since the von Neumann entropy of a Gaussian state $\hat\rho$ is determined by its CM $V$, it is convenient to define a function $\Sigma$ of the CM such that
$$ \Sigma[V] = S(\hat\rho) . \qquad (28) $$
Finally, we notice that the input energy constraint in Eq. (20) can be written in terms of the CM as follows:
$$ \frac{\mathrm{tr}(V_{\rm in}+V_c)}{2n} \le N + \frac{1}{2} . \qquad (29) $$

III. THE HOLEVO FUNCTION FOR GAUSSIAN ENSEMBLES
For $n$ uses of the quantum channel, the constrained Holevo function, when restricted to Gaussian ensembles, reads
$$ \chi_{NG}(\Phi_n) = \max_{V_{\rm in}, V_c} \frac{1}{n} \left\{ \Sigma\!\left[X_n (V_{\rm in}+V_c) X_n^T + Y_n\right] - \Sigma\!\left[X_n V_{\rm in} X_n^T + Y_n\right] \;\middle|\; \frac{\mathrm{tr}(V_{\rm in}+V_c)}{2n} \le N + \frac{1}{2} \right\} , \qquad (30) $$
where the optimization is over the CMs $V_{\rm in}$, $V_c$ satisfying the energy constraint.

In the case of a memoryless Gaussian channel, restricting to Gaussian input states which are separable among different channel uses, we recover the one-shot expression
$$ \chi_{NG}(\Phi_1) = \max_{V_{\rm in}, V_c} \left\{ \Sigma\!\left[X_1 (V_{\rm in}+V_c) X_1^T + Y_1\right] - \Sigma\!\left[X_1 V_{\rm in} X_1^T + Y_1\right] \;\middle|\; \frac{\mathrm{tr}(V_{\rm in}+V_c)}{2} \le N + \frac{1}{2} \right\} . \qquad (31) $$
In general, for a Gaussian memory channel characterized by the sequence of triads $(\mathbf{d}_n, X_n, Y_n)$, the optimization of the $n$-use Holevo function in Eq. (30) cannot be reduced to the one-use case in Eq. (31). That is a consequence of the normal form for an $n$-mode Gaussian channel in Eq. (14), which in general is not the product of $n$ independent one-mode Gaussian channels. However, we can still identify a class of Gaussian memory channels such that, for any $n$, there are Gaussian unitary encoding and decoding transformations $\mathcal{E}_n = (0, E_n, 0)$, $\mathcal{D}_n = (0, D_n, 0)$ such that
$$ \mathcal{D}_n \circ \Phi_n \circ \mathcal{E}_n = \bigotimes_{k=1}^n \Phi_1^{(k)} , \qquad (32) $$
where $\{\Phi_1^{(k)}\}_{k=1,\dots,n}$ is a collection of $n$ (not necessarily identical) one-mode Gaussian channels. In other words, the memory channels belonging to that class factorize in terms of the collective input and output variables defined by the encoding and decoding transformations. Hence we introduce the following

Definition 1 (memory unraveling)
A bosonic Gaussian memory channel, characterized by a sequence $(0, X_n, Y_n)$, can be unraveled if there is a sequence of encoding Gaussian unitaries $(0, E_n, 0)$ and a sequence of decoding Gaussian unitaries $(0, D_n, 0)$ such that, for any $n$,
$$ D_n X_n E_n = \bigoplus_{k=1}^n X_{1,n}^{(k)} , \qquad (33) $$
$$ D_n Y_n D_n^T = \bigoplus_{k=1}^n Y_{1,n}^{(k)} . \qquad (34) $$
Moreover, we require that the encoding unitary preserves the form of the energy constraint.
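Definition 1 can be checked numerically on a toy additive-noise model ($X_n = I$, $Y_n = V_{cl}$; the correlation value and the Hadamard-rotation unraveling below are illustrative assumptions, not the paper's model):

```python
import numpy as np

# Additive-noise memory channel on n = 2 modes, X_n = I, Y_n = V_cl.
# In the (q1, q2, p1, p2) ordering, assume a symmetric correlation c
# between the two uses, equal on q's and p's (toy model):
c = 0.6
Vq = np.array([[1.0, c], [c, 1.0]])
Z = np.zeros((2, 2))
V_cl = np.block([[Vq, Z], [Z, Vq]])

# Candidate unraveling: S = diag(H, H) with H the 2x2 Hadamard rotation.
# S has the passive form (13) with A = H, B = 0, so it is both
# orthogonal and symplectic.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
S = np.block([[H, Z], [Z, H]])

Omega = np.block([[Z, np.eye(2)], [-np.eye(2), Z]])
assert np.allclose(S @ S.T, np.eye(4))        # orthogonal: preserves energy
assert np.allclose(S @ Omega @ S.T, Omega)    # symplectic: a Gaussian unitary

Y_unraveled = S @ V_cl @ S.T
# Couplings between the two collective modes vanish, Eq. (34):
assert np.allclose(Y_unraveled, np.diag([1 + c, 1 - c, 1 + c, 1 - c]))
```

The two collective modes see independent noise of variance $1+c$ and $1-c$, respectively, so the dressed channel factorizes as in Eq. (32).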
Since the Holevo function is invariant under unitary transformations, the transformation (32) preserves the Holevo function, i.e.,
$$ \chi_{NG}[\Phi_n] = \chi_{NG}\!\left[\bigotimes_{k=1}^n \Phi_1^{(k)}\right] . \qquad (35) $$
Moreover, since the encoding Gaussian unitary preserves the form of the energy constraint, we have
$$ \mathrm{tr}\!\left[E_n (V_{\rm in}+V_c) E_n^T\right] = \mathrm{tr}\!\left[V_{\rm in}+V_c\right] , \qquad (36) $$
which is satisfied if $E_n^T E_n = I$, i.e., if the matrix $E_n$ is orthogonal. Symplectic matrices that are also orthogonal constitute a subgroup whose elements are characterized by the form in Eq. (13).

For a memory channel that can be unraveled, it is natural to restrict to Gaussian input ensembles such that
$$ E_n V_{\rm in} E_n^T = \bigoplus_{k=1,\dots,n} V_{1,\rm in}^{(k)} , \qquad (37) $$
and
$$ E_n V_c E_n^T = \bigoplus_{k=1,\dots,n} V_{1,c}^{(k)} . \qquad (38) $$
Using this ansatz, the calculation of the Holevo function for $n$ uses of the memory channel reduces to the one-mode case:
$$ \chi_{NG}[\Phi_n] = \frac{1}{n} \max_{\{N_k \,|\, \sum_k N_k/n = N\}} \sum_{k=1}^n \max_{\{V_{1,\rm in}^{(k)}, V_{1,c}^{(k)}\}} \left\{ \Sigma\!\left[X_{1,n}^{(k)} \left(V_{1,\rm in}^{(k)}+V_{1,c}^{(k)}\right) X_{1,n}^{(k)\,T} + Y_{1,n}^{(k)}\right] - \Sigma\!\left[X_{1,n}^{(k)} V_{1,\rm in}^{(k)} X_{1,n}^{(k)\,T} + Y_{1,n}^{(k)}\right] \;\middle|\; \frac{\mathrm{tr}\left(V_{1,\rm in}^{(k)}+V_{1,c}^{(k)}\right)}{2} \le N_k + \frac{1}{2} \right\} , \qquad (39) $$
where we have rewritten the input energy constraint in two steps,
$$ \frac{\mathrm{tr}\left(V_{1,\rm in}^{(k)}+V_{1,c}^{(k)}\right)}{2} \le N_k + \frac{1}{2} , \qquad \sum_{k=1}^n \frac{N_k}{n} = N , \qquad (40) $$
and the maximization is over both the CMs $V_{1,\rm in}^{(k)}$, $V_{1,c}^{(k)}$ and the parameters $N_k$ under the constraint $\sum_k N_k/n = N$.

In conclusion, for quantum memory channels that can be unraveled, the calculation of the Holevo function has been reduced to the case of independent, but not identical, one-mode channels, each characterized by the matrices $X_{1,n}^{(k)}$, $Y_{1,n}^{(k)}$ and a maximal number $N_k$ of excitations per mode. The only ingredient that mixes the one-mode channels is the constraint on the total number of excitations.

A. Examples
The properties defining the class of Gaussian memory channels that can be unraveled are rather peculiar; however, several relevant models of Gaussian channels belong to this class. In this section we review some examples of Gaussian memory channels that can (or cannot) be unraveled.
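Before turning to the examples, note that for a phase-insensitive one-mode channel ($X = x I_2$, $Y = y I_2$) with a coherent-state seed ($V_{\rm in} = I_2/2$) and isotropic Gaussian modulation ($V_c = N I_2$), the per-mode terms of Eq. (39) reduce to a difference of $g$ functions, since the output CMs are scalar. A numerical sketch (parameters and function names are illustrative assumptions):

```python
import numpy as np

def g(x):
    """Thermal entropy in bits, g(x) = (x+1)log2(x+1) - x log2 x."""
    return (x + 1) * np.log2(x + 1) - (x * np.log2(x) if x > 0 else 0.0)

def chi_one_mode(x, y, N):
    """Per-mode Holevo quantity of Eq. (39) for X = x I, Y = y I,
    coherent-state seed V_in = I/2 and modulation V_c = N I: the
    output symplectic eigenvalues are x^2 (N + 1/2) + y (modulated
    ensemble) and x^2 / 2 + y (seed state)."""
    nu_out = x**2 * (N + 0.5) + y
    nu_sig = x**2 * 0.5 + y
    return g(nu_out - 0.5) - g(nu_sig - 0.5)

# Attenuation channel, transmissivity eta, vacuum environment
# (assumed parameters): x = sqrt(eta), y = (1 - eta)/2.
eta, N = 0.7, 5.0
rate = chi_one_mode(np.sqrt(eta), (1 - eta) / 2, N)
assert rate > 0
```

For this attenuation example the expression collapses to $g(\eta N)$, the known coherent-state rate of the lossy channel.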
Lossy Bosonic memory channel.
We refer to the general model of lossy bosonic Gaussian channel with memory that was introduced in [17]. Upon $n$ uses of the channel, the $n$ input modes are mixed with a corresponding set of $n$ environmental modes at a beam splitter with transmissivity $\eta$. In the Heisenberg picture the canonical field operators transform as
$$ \hat q_k \to \sqrt{\eta}\, \hat q_k + \sqrt{1-\eta}\, \hat Q_k , \qquad (41) $$
$$ \hat p_k \to \sqrt{\eta}\, \hat p_k + \sqrt{1-\eta}\, \hat P_k , \qquad (42) $$
where $\{\hat Q_k, \hat P_k\}_{k=1,\dots,n}$ are the field operators of the environmental modes. The environmental modes are in a correlated Gaussian state, characterized by a CM $V_{\rm env}$ with non-vanishing off-diagonal terms coupling different modes. The memory channel is associated with the matrices $X_n = \sqrt{\eta}\, I_{2n}$, $Y_n = (1-\eta) V_{\rm env}$. Memory effects appear in the channel if the $n$-mode environment is in a non-factorized state. A factorized state is characterized by a block-diagonal $n$-mode CM, i.e., $V_{\rm env} = \bigoplus_{k=1}^n v_k$. Remarkable examples of non-factorized states are the multimode entangled states, which belong to the family of multimode squeezed states. To address the problem of memory unraveling, we notice that the CM $V_{\rm env}$ can be diagonalized by a $2n \times 2n$ orthogonal matrix $O$, i.e., $O V_{\rm env} O^T = \bigoplus_{k=1}^n v_k$. Hence one could identify $D_n := O$, $E_n := O^T$. Notice that the orthogonal matrix preserves the trace; hence it preserves the energy constraint. However, a Gaussian unitary is represented by a symplectic matrix, so the given orthogonal matrix represents a physical transformation only if it is also symplectic. In conclusion, a lossy bosonic memory channel can be unraveled if the CM of the environment can be diagonalized by a linear transformation which is both symplectic and orthogonal. A characterization of this class of environment CMs is presented in [18]: pure Gaussian states and squeezed thermal states belong to this class.

Additive noise channel.
For this class of channels, the field operators transform according to the Heisenberg-picture map
$$ \hat q_k \to \hat q_k + t_k , \qquad (43) $$
$$ \hat p_k \to \hat p_k + u_k , \qquad (44) $$
where $t_k$, $u_k$ are classical stochastic variables. The channel is Gaussian if the noise variables are Gaussian distributed, with a CM $V_{cl}$. The subscript '$cl$' refers to the fact that $V_{cl}$ is a classical CM, i.e., it is only subject to the conditions of being symmetric and positive semi-definite. Differently from a quantum CM, it need not obey the Heisenberg uncertainty relations as expressed by Eq. (6). Memory effects arise when $V_{cl}$ has non-vanishing off-diagonal terms coupling different channel uses. The matrices associated with $n$ uses of the channel are $X_n = I_{2n}$ and $Y_n = V_{cl}$. The memory channel can hence be unraveled if there is a $2n \times 2n$ matrix $S_n$, both symplectic and orthogonal, such that $S_n V_{cl} S_n^T = \bigoplus_{k=1}^n v_k$. The encoding and decoding Gaussian unitaries which unravel the memory channel are then chosen as $D_n := S_n$, $E_n := S_n^T$. Notice that, since $S_n$ is orthogonal, $S_n^T = S_n^{-1}$. Examples of additive noise channels with memory that can be unraveled were studied in [19, 20]. The conditions on the CM $V_{cl}$ can be easily obtained as is done for the lossy channel in [18], with the only difference that $V_{cl}$ need not obey the uncertainty relations.

Inter-symbol interference channels.
In this family of quantum channels, memory effects come from the fact that the signals at different channel inputs interfere at the channel output, while in the previous examples they were caused by noise correlations. For such a case, one can write the Heisenberg-picture transformations acting on the field operators as
$$ \hat q_k \to \sum_h \left\{ M^{(qq)}_{kh} \hat q_h + M^{(qp)}_{kh} \hat p_h + N^{(qQ)}_{kh} \hat Q_h + N^{(qP)}_{kh} \hat P_h \right\} , $$
$$ \hat p_k \to \sum_h \left\{ M^{(pq)}_{kh} \hat q_h + M^{(pp)}_{kh} \hat p_h + N^{(pQ)}_{kh} \hat Q_h + N^{(pP)}_{kh} \hat P_h \right\} , $$
where the operators $\{\hat Q_k, \hat P_k\}$ correspond to environmental modes, and the matrices $M^{(qq)}, M^{(qp)}, \dots, N^{(pQ)}, N^{(pP)}$ satisfy proper conditions [4]. To describe this channel, we work in the representation (8). In this representation, assuming that the environmental modes are in a Gaussian state characterized by the CM $\tilde V_{\rm env}$, the matrices associated with the memory channel are
$$ \tilde X_n = \begin{pmatrix} M^{(qq)} & M^{(qp)} \\ M^{(pq)} & M^{(pp)} \end{pmatrix} , \qquad (45) $$
and
$$ \tilde Y_n = \begin{pmatrix} N^{(qQ)} & N^{(qP)} \\ N^{(pQ)} & N^{(pP)} \end{pmatrix} \tilde V_{\rm env} \begin{pmatrix} N^{(qQ)} & N^{(qP)} \\ N^{(pQ)} & N^{(pP)} \end{pmatrix}^T . \qquad (46) $$
An instance of this kind of model was considered in [21], in which $M^{(qq)} = M^{(pp)}$, $M^{(qp)} = M^{(pq)} = O$, $N^{(qQ)} = N^{(pP)}$, $N^{(qP)} = N^{(pQ)} = O$, and the environment is in the vacuum state, i.e., $\tilde V_{\rm env} = I_{2n}/2$. As shown in [21], such a channel can be unraveled by performing the singular value decomposition of the matrix $X_n$.

IV. OPTIMIZATION UNDER SYMMETRIES
In the cases of both memoryless and memory quantum channels, one can pose the question of finding the optimal input ensemble, i.e., the most robust one under the action of the noisy channel. In the memoryless setting, this is related to the issue of additivity of the Holevo information: if the Holevo information is additive, the optimal input ensemble consists of states which are separable among different channel uses; otherwise, for channels with a non-additive Holevo information, the optimal input ensemble is made of entangled states. For memory channels it has been observed, in the cases of both discrete [2] and continuous [3] variables, that entangled input states may be optimal to maximize the Holevo information.

In models of quantum channels with memory, it is customary to introduce a memory parameter, used to quantify the memory in the channel, which vanishes in the memoryless limit. It may happen that the optimal input ensemble consists of entangled states when the memory parameter is above a certain threshold. In this case one says that the memory channel exhibits a transitional behavior. In the case of discrete variables, several models exhibit a finite value of the memory threshold [2], while for continuous-variable models the threshold value may vanish [3, 20], i.e., entangled input states are optimal even for arbitrarily small, but nonzero, values of the memory parameter.

Restricting to the case of bosonic Gaussian channels that can be unraveled, we introduce here a criterion to decide whether the Holevo information, restricted to Gaussian input ensembles and under an input energy constraint, is optimized by separable input states. We formulate the following:
Criterion 1
Given a Gaussian memory channel, represented by the sequence $(0, X_n, Y_n)$, and provided that it can be unraveled, a necessary condition for the optimality of entangled input states is the non-invariance of the channel's action under phase rotations.

Proof of Criterion 1
For a Gaussian memory channel that can be unraveled, it holds that $(0, X_n, Y_n) \simeq (0, \bigoplus_{k=1,\dots,n} X_{1,n}^{(k)}, \bigoplus_{k=1,\dots,n} Y_{1,n}^{(k)})$. Let us assume, by contradiction, that the one-mode channels $(0, X_{1,n}^{(k)}, Y_{1,n}^{(k)})$ are invariant under phase rotations, i.e.,
$$ R(\theta)\, X_{1,n}^{(k)}\, R(\theta)^T = X_{1,n}^{(k)} , \qquad (47) $$
$$ R(\theta)\, Y_{1,n}^{(k)}\, R(\theta)^T = Y_{1,n}^{(k)} , \qquad (48) $$
where the $2 \times 2$ matrix $R(\theta)$ represents a phase rotation,
$$ R(\theta) := \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} . \qquad (49) $$
Matrices with this symmetry are scalar, i.e.,
$$ X_{1,n}^{(k)} = \begin{pmatrix} x_{1,n}^{(k)} & 0 \\ 0 & x_{1,n}^{(k)} \end{pmatrix} , \qquad (50) $$
$$ Y_{1,n}^{(k)} = \begin{pmatrix} y_{1,n}^{(k)} & 0 \\ 0 & y_{1,n}^{(k)} \end{pmatrix} . \qquad (51) $$
It follows that the solution of the optimization problem (39) is also invariant under phase rotations and is given by the matrices
$$ V_{1,\rm in}^{(k)} = \begin{pmatrix} 1/2 & 0 \\ 0 & 1/2 \end{pmatrix} , \qquad V_{1,c}^{(k)} = \begin{pmatrix} N_k & 0 \\ 0 & N_k \end{pmatrix} , \qquad (52) $$
where the optimal values of the parameters $\{N_k\}_{k=1,\dots,n}$ are obtained from the maximization problem
$$ \chi_n = \max_{\{N_k\}} \frac{1}{n} \sum_{k=1}^n \left\{ g\!\left[ \left(x_{1,n}^{(k)}\right)^2 \left(N_k + \tfrac{1}{2}\right) + y_{1,n}^{(k)} - \tfrac{1}{2} \right] - g\!\left[ \left(x_{1,n}^{(k)}\right)^2 \tfrac{1}{2} + y_{1,n}^{(k)} - \tfrac{1}{2} \right] \right\} , \qquad (53) $$
where the maximum is under the constraint $\sum_{k=1}^n N_k/n = N$.

Let us notice that the matrix $V_{1,\rm in}^{(k)}$ in (52) represents the CM of a coherent state [4]. Then the matrix $\bigoplus_{k=1}^n V_{1,\rm in}^{(k)}$ represents the CM of the optimal $n$-mode input state of the dressed memory channel, which includes the encoding unitary transformation. The actual optimal input state is obtained from it by undoing the encoding transformation, i.e.,
$$ V_{\rm in}^{\rm opt} = E_n \left[ \bigoplus_{k=1}^n V_{1,\rm in}^{(k)} \right] E_n^T . \qquad (54) $$
However, since the encoding matrix $E_n$ is orthogonal (and, of course, symplectic), it follows that
$$ V_{\rm in}^{\rm opt} = \bigoplus_{k=1}^n \begin{pmatrix} 1/2 & 0 \\ 0 & 1/2 \end{pmatrix} , \qquad (55) $$
i.e., the optimal Gaussian inputs are coherent states, which are separable among different channel uses.

Remark 1
If the decoding symplectic matrix $D_n$ is also orthogonal, the conditions (47), (48) can be equivalently formulated as follows:
$$ R_n(\theta)\, X_n\, R_n(\theta)^T = X_n , \qquad (56) $$
$$ R_n(\theta)\, Y_n\, R_n(\theta)^T = Y_n , \qquad (57) $$
where the matrix $R_n(\theta) = \bigoplus_{k=1}^n R(\theta)$ represents a global phase rotation on the $n$ modes.

The equivalence can be readily proved by working in the representation (8). It is a consequence of the fact that the subgroup of symplectic and orthogonal matrices [having the form in Eq. (13)] commutes with phase rotations, which are represented by matrices of the form
$$ \tilde R_n = \begin{pmatrix} \cos\theta\, I_n & -\sin\theta\, I_n \\ \sin\theta\, I_n & \cos\theta\, I_n \end{pmatrix} . \qquad (58) $$
In other words, if both the encoding and decoding symplectic matrices are orthogonal, the symmetry under phase rotations can be checked directly on the matrices $X_n$, $Y_n$.

Furthermore, since $D_n$, $E_n$ are elements of Lie groups, they reduce to the identity when the group parameters go to zero. Since the latter characterize the degree of memory, we may argue that the transition can only occur at zero value of the memory parameters. This is in contrast with what happens for discrete quantum memory channels.

As a consequence of Criterion 1, entangled Gaussian codewords may be necessary to optimize the Holevo information (39) only if the rotational invariance is broken. Notice, however, that this is a necessary but not sufficient condition. Below we present examples for all possible cases.

V. EXAMPLES
Let us first consider the case of an additive noise channel. The definition of the channel and its basic properties are briefly recalled in Sec. III A. Recall that for $n$ uses of the channel we have $X_n = I_n$ and $Y_n = V_{\rm cl}$. Since we are considering channels which can be unraveled, we assume the existence of an orthogonal and symplectic matrix $S_n$ such that $S_n V_{\rm cl} S_n^{T} = \bigoplus_{k=1}^{n} v_k$, with $D_n = S_n$, $E_n = S_n^{T}$. In this case, the symmetry condition is verified if the matrix $V_{\rm cl}$ is symmetric under phase rotations (Remark 1). Two models of Gaussian memory channels with Markovian correlated noise were studied and characterized in [19, 20]. Using the representation (8), the noise CM in [19] has the following form:

$\tilde{V}_{\rm cl} = \begin{pmatrix} V & O \\ O & V \end{pmatrix} , \qquad (59)$

which is clearly symmetric under phase rotations (58); hence the optimal Gaussian inputs are separable for this model. On the contrary, the noise CM for the model studied in [20] has the form

$\tilde{V}_{\rm cl} = \begin{pmatrix} V & O \\ O & V' \end{pmatrix} , \qquad (60)$

with $V \neq V'$. Such a matrix is not symmetric under phase rotations (58) and, as shown in [20], the optimal input states, when restricted to Gaussian states, are entangled.

The case of the lossy bosonic memory channel [17] is analogous to the additive channel. Its basic properties are reviewed in Sec. III A. In this case $X_n = \sqrt{\eta} \, I_n$ and $Y_n = (1-\eta) V_{\rm env}$. A model of memory channel belonging to this family was studied in [18]. Using the representation (8), the environmental CM in [18] has the form

$\tilde{V}_{\rm env} = \left( T + \tfrac{1}{2} \right) \begin{pmatrix} e^{Ms} & O \\ O & e^{-Ms} \end{pmatrix} , \qquad (61)$

where $M$ is a symmetric matrix of size $n$, and $T$, $s$ are two positive parameters.
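The symmetry check that separates the model of Eq. (59) from that of Eq. (60) can be reproduced numerically. The sketch below is my own illustration (with an arbitrary positive matrix $V$ and a perturbed $V' \neq V$, in the same block ordering as Eq. (58)):

```python
import numpy as np

n, theta = 2, 0.3  # hypothetical values for illustration
c, s = np.cos(theta), np.sin(theta)
I, O = np.eye(n), np.zeros((n, n))
R = np.block([[c * I, -s * I], [s * I, c * I]])  # global rotation, Eq. (58)

rng = np.random.default_rng(2)
A = rng.standard_normal((n, n))
V = A @ A.T + n * I                     # a generic positive matrix V
Vp = V + np.diag(np.arange(1, n + 1))   # some V' different from V

V_sym = np.block([[V, O], [O, V]])      # noise CM as in Eq. (59)
V_asym = np.block([[V, O], [O, Vp]])    # noise CM as in Eq. (60)

# Eq. (59) is invariant under the phase rotation; Eq. (60) is not:
# conjugating by R mixes the x and p blocks, which only cancels when V = V'.
sym_ok = np.allclose(R @ V_sym @ R.T, V_sym)
asym_broken = not np.allclose(R @ V_asym @ R.T, V_asym)
assert sym_ok and asym_broken
```

The same check applied to Eq. (61) shows the symmetry breaking for $s \neq 0$, since the two diagonal blocks $e^{Ms}$ and $e^{-Ms}$ then differ.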
The parameter $s$ quantifies the amount of memory in the channel: for $s = 0$ the environmental state is an uncorrelated thermal state, while it is entangled for $s \neq 0$. For all $s \neq 0$, the matrix $Y_n$ is not symmetric under phase rotations and, as shown in [18], the optimal Gaussian input states are entangled among different channel uses.

The general case of an inter-symbol interference channel is recalled in Sec. III A. An example of such a channel was studied and characterized in [21], where

$\tilde{X}_n = \begin{pmatrix} M & O \\ O & M \end{pmatrix} , \qquad (62)$

and

$\tilde{Y}_n = \tfrac{1}{2} \begin{pmatrix} N N^{T} & O \\ O & N N^{T} \end{pmatrix} . \qquad (63)$

The channel is invariant under phase rotations; thus, as shown in [21], the optimal input states are separable.

To conclude, we notice that there are models of Gaussian memory channels for which the optimal Gaussian input states are separable even though the channel is not symmetric under phase rotations. A channel model presenting this feature can be obtained by choosing the environment to be in a state of the form

$\tilde{V}_{\rm env} = \begin{pmatrix} e^{s} I_n & O \\ O & e^{-s} I_n \end{pmatrix} , \qquad (64)$

implying

$\tilde{Y}_n = \tfrac{1}{2} \begin{pmatrix} e^{s} N N^{T} & O \\ O & e^{-s} N N^{T} \end{pmatrix} , \qquad (65)$

and choosing $\tilde{X}_n$ as in (62). The channel is hence not symmetric under phase rotations; however, it is not difficult to show that the optimal Gaussian input states are separable.

VI. CONCLUSION
In this article we have provided a unified framework for some recent results on the performance of quantum Gaussian memory channels. We have focused in particular on the entanglement of optimal input states, and we have related this issue to the symmetry properties of the channel. More specifically, we have shown that entangled Gaussian codewords may be necessary for optimizing the Holevo information only if the rotational invariance is broken by the channel's action. Similar considerations were made in [19, 20, 22] for specific channel models. Moreover, for a Gaussian memory channel that can be unraveled, we may argue that the transition from the optimality of separable states to the optimality of entangled states can only occur for a vanishing value of the memory parameter [18, 20]. This is in contrast to what happens in discrete quantum memory channels [2]. However, while those investigations involved only very few channel uses, here the analysis has been carried out for an arbitrary number of channel uses by resorting to the mathematical machinery of memory unraveling, which allowed us to trace the Gaussian memory channel back to a memoryless one. Several examples have been discussed, concerning memory unraveling as well as transitional behavior.

We believe that the presented results shed light on the mechanisms and the structure of the correlations that lead to an enhancement of the channel performance with entanglement, although a complete characterization of the transitional features of memory channels is still far away.
Acknowledgments
This work has been supported by the European Commission under the FET-Open grant agreement CORNER, number FP7-ICT-213681.

[1] R. G. Gallager, Information Theory and Reliable Communication (Wiley, New York, 1968).
[2] C. Macchiavello and G. M. Palma, Phys. Rev. A, 050301(R) (2002); E. Karpov, D. Daems, and N. J. Cerf, Phys. Rev. A, 032320 (2006); D. Daems, Phys. Rev. A, 012310 (2007); F. Caruso, V. Giovannetti, C. Macchiavello, and M. B. Ruskai, Phys. Rev. A, 052323 (2008).
[3] G. Ruggeri, G. Soliani, V. Giovannetti, and S. Mancini, Europhys. Lett., 719 (2005); G. Ruggeri and S. Mancini, Quantum Inf. Comput., 265 (2007); O. Pilyavets, V. Zborovskii, and S. Mancini, Phys. Rev. A, 052324 (2008).
[4] A. Ferraro, S. Olivares, and M. G. A. Paris, Gaussian States in Quantum Information (Bibliopolis, Napoli, 2005); arXiv:quant-ph/0503237; S. L. Braunstein and P. van Loock, Rev. Mod. Phys., 513 (2005).
[5] R. Simon, N. Mukunda, and B. Dutta, Phys. Rev. A, 1567 (1994).
[6] A. S. Holevo and R. F. Werner, Phys. Rev. A, 032312 (2001).
[7] M. M. Wolf, Phys. Rev. Lett., 070505 (2008).
[8] F. Caruso, J. Eisert, V. Giovannetti, and A. S. Holevo, New J. Phys., 083030 (2008).
[9] A. S. Holevo, arXiv:quant-ph/0607051.
[10] A. S. Holevo, IEEE Trans. Inf. Theory, 269 (1998); B. Schumacher and M. D. Westmoreland, Phys. Rev. A, 131 (1997).
[11] V. Giovannetti, S. Guha, S. Lloyd, L. Maccone, J. H. Shapiro, and H. P. Yuen, Phys. Rev. Lett., 027902 (2004).
[12] M. B. Hastings, Nature Physics, 255 (2009).
[13] G. Bowen and S. Mancini, Phys. Rev. A, 012306 (2004).
[14] D. Kretschmann and R. F. Werner, Phys. Rev. A, 062323 (2005).
[15] V. Giovannetti, S. Guha, S. Lloyd, L. Maccone, and J. H. Shapiro, Phys. Rev. A, 032315 (2004).
[16] A. S. Holevo and R. F. Werner, Phys. Rev. A, 032312 (2001); A. S. Holevo, M. Sohma, and O. Hirota, Phys. Rev. A, 1820 (1999).
[17] V. Giovannetti and S. Mancini, Phys. Rev. A, 062304 (2005).
[18] C. Lupo, O. V. Pilyavets, and S. Mancini, New J. Phys., 063023 (2009).
[19] C. Lupo, L. Memarzadeh, and S. Mancini, Phys. Rev. A, 042328 (2009).
[20] J. Schäfer, D. Daems, E. Karpov, and N. J. Cerf, Phys. Rev. A, 062313 (2009).
[21] C. Lupo, V. Giovannetti, and S. Mancini, Phys. Rev. Lett., 030501 (2010).
[22] N. J. Cerf, J. Clavareau, C. Macchiavello, and J. Roland, Phys. Rev. A 72