IEEE Transactions on Communications | 2019

Markov Chain Monte Carlo Methods for Lattice Gaussian Sampling: Convergence Analysis and Enhancement

 

Abstract


Sampling from the lattice Gaussian distribution has emerged as an important problem in coding, decoding, and cryptography. In this paper, the classic Gibbs algorithm from Markov chain Monte Carlo (MCMC) methods is demonstrated to be geometrically ergodic for lattice Gaussian sampling, which means that the Markov chain arising from it converges exponentially fast to the stationary distribution. Meanwhile, the exponential convergence rate of the Markov chain is derived through the spectral radius of the forward operator. A comprehensive analysis of the convergence rate is then carried out, and two sampling schemes are proposed to further enhance the convergence performance. The first, referred to as the Metropolis-within-Gibbs (MWG) algorithm, improves the convergence by refining the state space of the univariate sampling. The second is a blocked strategy for the Gibbs algorithm, which samples multiple variables jointly at each Markov move and is shown to yield a better convergence rate than the traditional univariate sampling. To perform blocked sampling efficiently, the Gibbs–Klein (GK) algorithm is proposed, which samples block by block using Klein's algorithm. Furthermore, the validity of the GK algorithm is demonstrated by showing its ergodicity. Simulation results based on MIMO detection are presented to confirm the convergence gain brought by the proposed Gibbs sampling schemes.
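To make the univariate Gibbs step described above concrete, the following is a minimal Python sketch (an illustration under stated assumptions, not the paper's implementation) of component-wise Gibbs sampling from a lattice Gaussian over Lambda(B) with standard deviation sigma and center c. At each Markov move one integer coordinate z_i is resampled from its one-dimensional discrete Gaussian conditional; the conditional center b_i.r/||b_i||^2 and deviation sigma/||b_i|| follow from completing the square in ||Bz - c||^2. The truncated enumeration used for the 1-D conditional and the helper names are illustrative assumptions.

    # Minimal sketch: component-wise (univariate) Gibbs sampling from a
    # lattice Gaussian P(z) proportional to exp(-||B z - c||^2 / (2 sigma^2)),
    # z in Z^n. Truncation window and function names are illustrative.
    import numpy as np

    def sample_1d_discrete_gaussian(mu, sigma, rng, tail=8):
        # Sample an integer k with p(k) proportional to
        # exp(-(k - mu)^2 / (2 sigma^2)), using a truncated enumeration
        # of candidates within +/- tail standard deviations of mu.
        lo = int(np.floor(mu - tail * sigma))
        hi = int(np.ceil(mu + tail * sigma))
        ks = np.arange(lo, hi + 1)
        logp = -(ks - mu) ** 2 / (2.0 * sigma ** 2)
        p = np.exp(logp - logp.max())      # stabilize before normalizing
        p /= p.sum()
        return rng.choice(ks, p=p)

    def gibbs_lattice_gaussian(B, sigma, c, n_iter, rng=None):
        # Univariate Gibbs sampler: sweep over coordinates, resampling each
        # z_i from its 1-D discrete Gaussian conditional given the others.
        rng = np.random.default_rng() if rng is None else rng
        n = B.shape[1]
        z = np.zeros(n, dtype=int)          # arbitrary initial state
        col_norm2 = np.sum(B ** 2, axis=0)  # ||b_i||^2 for each basis column
        for _ in range(n_iter):
            for i in range(n):
                # Residual with coordinate i removed:
                # r = c - sum_{j != i} z_j b_j
                r = c - B @ z + z[i] * B[:, i]
                # Conditional of z_i: 1-D discrete Gaussian with
                # center b_i . r / ||b_i||^2 and std sigma / ||b_i||.
                mu_i = B[:, i] @ r / col_norm2[i]
                sigma_i = sigma / np.sqrt(col_norm2[i])
                z[i] = sample_1d_discrete_gaussian(mu_i, sigma_i, rng)
        return B @ z, z

The blocked (Gibbs–Klein) strategy analyzed in the paper replaces the 1-D conditional update in the inner loop with a joint draw over a block of coordinates via Klein's algorithm; the sketch above covers only the baseline univariate case.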

Volume 67
Pages 6711-6724
DOI 10.1109/TCOMM.2019.2926470
Language English
Journal IEEE Transactions on Communications

Full Text