Guangyue Han
University of Hong Kong
Publications
Featured research published by Guangyue Han.
IEEE Transactions on Information Theory | 2006
Guangyue Han; Brian Marcus
We prove that, under mild positivity assumptions, the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. A general principle to determine the domain of analyticity is stated. An example is given to estimate the radius of convergence for the entropy rate. We then show that the positivity assumptions can be relaxed, and examples are given for the relaxed conditions. We study a special class of hidden Markov chains in more detail: binary hidden Markov chains with an unambiguous symbol, and we give necessary and sufficient conditions for analyticity of the entropy rate in this case. Finally, we show that under the positivity assumptions, the hidden Markov chain itself varies analytically, in a strong sense, as a function of the underlying Markov chain parameters.
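Although closed forms for the entropy rate of a hidden Markov chain are generally unavailable, it can be estimated numerically. The following sketch (all parameter values are illustrative, not taken from the paper) simulates a binary Markov chain observed through a binary symmetric channel and estimates the output entropy rate via the standard normalized forward algorithm, using the fact that the accumulated log-normalizers equal log p(y_1, ..., y_n):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-state Markov chain with transition matrix P, observed
# through a binary symmetric channel with crossover probability eps.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
eps = 0.1
# Emission matrix: state i emits symbol i, flipped with probability eps.
B = np.array([[1 - eps, eps],
              [eps, 1 - eps]])

def simulate(n):
    """Sample n noisy observations of the Markov chain."""
    x = np.empty(n, dtype=int)
    x[0] = 0
    for t in range(1, n):
        x[t] = rng.choice(2, p=P[x[t - 1]])
    flips = rng.random(n) < eps
    return np.where(flips, 1 - x, x)

def entropy_rate_estimate(y):
    """Estimate -(1/n) log p(y_1..y_n) in nats via the forward algorithm."""
    alpha = np.array([0.5, 0.5]) * B[:, y[0]]
    logp = np.log(alpha.sum())
    alpha /= alpha.sum()
    for t in range(1, len(y)):
        alpha = (alpha @ P) * B[:, y[t]]   # predict, then weight by emission
        logp += np.log(alpha.sum())        # log p(y_t | y_1..y_{t-1})
        alpha /= alpha.sum()
    return -logp / len(y)

h = entropy_rate_estimate(simulate(50_000))   # in nats
```

By the Shannon-McMillan-Breiman theorem, the estimate converges almost surely to the entropy rate as the sample length grows.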
IEEE Transactions on Information Theory | 2006
Guangyue Han; Joachim Rosenthal
The diversity product and the diversity sum are two very important parameters for a good-performing unitary space-time constellation. A basic question is what the maximal diversity product (or sum) is. In this correspondence, we derive general upper bounds on the diversity sum and the diversity product for unitary constellations of any dimension n and any size m, using packing techniques on the compact Lie group U(n).
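The two parameters above can be computed directly for any given constellation. The sketch below uses normalizations that are common in the space-time coding literature (the constellation and parameter choices are illustrative, not from the correspondence): for n x n unitary matrices, the diversity product is (1/2) min |det(A - B)|^(1/n) and the diversity sum is (1/(2 sqrt(n))) min ||A - B||_F, with minima over distinct pairs:

```python
import numpy as np
from itertools import combinations

def diversity_product(constellation):
    """(1/2) * min over pairs of |det(A - B)|^(1/n)."""
    n = constellation[0].shape[0]
    return 0.5 * min(abs(np.linalg.det(A - B)) ** (1.0 / n)
                     for A, B in combinations(constellation, 2))

def diversity_sum(constellation):
    """(1/(2*sqrt(n))) * min over pairs of the Frobenius norm ||A - B||_F."""
    n = constellation[0].shape[0]
    return min(np.linalg.norm(A - B)
               for A, B in combinations(constellation, 2)) / (2 * np.sqrt(n))

# Example: a diagonal (cyclic) constellation of size m = 4 in dimension n = 2.
m = 4
constellation = [np.diag([np.exp(2j * np.pi * k / m),
                          np.exp(2j * np.pi * k / m)]) for k in range(m)]
dp = diversity_product(constellation)
ds = diversity_sum(constellation)
```

For this scalar-multiple-of-identity constellation both quantities reduce to half the minimum chordal distance between the m-th roots of unity.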
IEEE Transactions on Information Theory | 2010
Guangyue Han; Brian Marcus
We derive an asymptotic formula for the entropy rate of a hidden Markov chain under certain parameterizations. We also discuss applications of the asymptotic formula to the asymptotic behavior of the entropy rate of hidden Markov chains arising as outputs of certain channels, such as the binary symmetric channel, the binary erasure channel, and a special Gilbert-Elliott channel.
Annals of Applied Probability | 2009
Guangyue Han; Brian Marcus
We study the classical problem of noisy constrained capacity in the case of the binary symmetric channel (BSC), namely, the capacity of a BSC whose inputs are sequences chosen from a constrained set. Motivated by a result of Ordentlich and Weissman in [28], we derive an asymptotic formula (when the noise parameter is small) for the entropy rate of a hidden Markov chain, observed when a Markov chain passes through a BSC. Using this result, we establish an asymptotic formula for the capacity of a BSC with input process supported on an irreducible finite-type constraint, as the noise parameter tends to zero. 1. Introduction and Background. Let $X, Y$ be discrete random variables with alphabets $\mathcal{X}, \mathcal{Y}$ and joint probability mass function $p_{X,Y}(x,y) \triangleq P(X = x, Y = y)$, $x \in \mathcal{X}, y \in \mathcal{Y}$ (for notational simplicity, we will write $p(x,y)$ rather than $p_{X,Y}(x,y)$, and similarly $p(x), p(y)$ rather than $p_X(x), p_Y(y)$, when it is clear from the context). The entropy $H(X)$ of the discrete random variable $X$, which measures the level of uncertainty of $X$, is defined as $H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x)$ (in this paper, log is taken to mean the natural logarithm).
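The entropy definition stated at the end of the abstract, with the natural logarithm convention, is a one-liner to compute; this minimal sketch is only an illustration of the definition, not part of the paper:

```python
import math

def entropy(p):
    """Entropy of a probability mass function, in nats (natural log),
    matching the convention used in the abstract; terms with zero
    probability contribute nothing."""
    return -sum(q * math.log(q) for q in p if q > 0)

# A fair coin has entropy log 2 ~= 0.6931 nats.
h_coin = entropy([0.5, 0.5])
```

A deterministic variable has entropy zero, the minimum, while the uniform distribution maximizes entropy over a fixed alphabet.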
IEEE Transactions on Information Theory | 2012
Guangyue Han; Brian Marcus
We consider a memoryless channel with an input Markov process supported on a mixing finite-type constraint. We continue the development of asymptotics for the entropy rate of the output hidden Markov chain and deduce that, at high signal-to-noise ratio, the mutual information rate of such a channel is concave with respect to “almost” all input Markov chains of a given order.
IEEE Transactions on Information Theory | 2006
Guangyue Han; Joachim Rosenthal
There exist two important design criteria for unitary space-time codes. In the situation where the signal-to-noise ratio (SNR) is large, the diversity product (DP) of a constellation should be as large as possible. It is less well known that the diversity sum (DS) is a very important design criterion for codes operating in a low-SNR environment. So far, no general method exists to design good-performing constellations with large diversity for any number of transmit antennas and any transmission rate. In this correspondence, we propose constellations with suitable structures, which allow one to construct codes with excellent diversity using geometrical symmetry and numerical methods. The presented design methods work for constellations of any dimension and for any transmission rate.
International Symposium on Information Theory | 2013
Yonglong Li; Guangyue Han
The computation of the capacity of a finite-state channel (FSC) is a fundamental and long-standing open problem in information theory. The capacity of a memoryless channel can be effectively computed via the classical Blahut-Arimoto algorithm (BAA), which, however, does not apply to a general FSC. Recently Vontobel et al. [1] generalized the BAA to compute the capacity of a finite-state machine channel with a Markovian input. Their proof of the convergence of this algorithm, however, depends on the concavity conjecture posed in their paper. In this paper, we confirm the concavity conjecture for some special FSCs. On the other hand, we give examples to show that the conjecture is not true in general.
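For context on the memoryless case that the paper generalizes from, here is a sketch of the classical Blahut-Arimoto algorithm mentioned in the abstract, applied to a discrete memoryless channel (the BSC parameters are illustrative; this is the textbook BAA, not the finite-state extension of Vontobel et al.):

```python
import numpy as np

def blahut_arimoto(W, tol=1e-12, max_iter=10_000):
    """Capacity (in nats) of a discrete memoryless channel with
    transition matrix W[x, y] = p(y | x), via the classical BAA."""
    nx = W.shape[0]
    p = np.full(nx, 1.0 / nx)               # start from the uniform input
    for _ in range(max_iter):
        q = p @ W                            # induced output distribution
        with np.errstate(divide="ignore", invalid="ignore"):
            logratio = np.where(W > 0, np.log(W / q), 0.0)
        D = (W * logratio).sum(axis=1)       # D( W[x,:] || q ) for each x
        p_new = p * np.exp(D)                # multiplicative update
        p_new /= p_new.sum()
        converged = np.max(np.abs(p_new - p)) < tol
        p = p_new
        if converged:
            break
    q = p @ W
    logratio = np.where(W > 0, np.log(W / q), 0.0)
    return (p * (W * logratio).sum(axis=1)).sum()

# BSC with crossover 0.1: capacity = log 2 - H(0.1) nats.
eps = 0.1
W = np.array([[1 - eps, eps], [eps, 1 - eps]])
C = blahut_arimoto(W)
```

The iteration converges because the mutual information is concave in the input distribution for a memoryless channel; the concavity conjecture discussed in the paper asks whether the analogous property holds in the finite-state setting.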
IEEE Transactions on Information Theory | 2013
Guangyue Han
In this paper, under mild assumptions, we derive a law of large numbers, a central limit theorem with an error estimate, an almost sure invariance principle, and a variant of the Chernoff bound in finite-state hidden Markov models. These limit theorems are of interest in certain areas of information theory and statistics. Particularly, we apply the limit theorems to derive the rate of convergence of the maximum likelihood estimator in finite-state hidden Markov models.
IEEE Transactions on Information Theory | 2016
Guangyue Han; Jian Song
Unveiling a fundamental link between information theory and estimation theory, the I-MMSE relationship by Guo et al., together with its numerous extensions, has great theoretical significance and various practical applications. On the other hand, its influence to date has been restricted to channels without feedback or memory, due to the absence of extensions to such channels. In this paper, we propose extensions of the I-MMSE relationship to discrete-time and continuous-time Gaussian channels with feedback and/or memory. Our approach is based on a very simple observation, which can be applied to other scenarios, such as a simple and direct proof of the classical de Bruijn's identity.
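The scalar I-MMSE relationship of Guo, Shamai, and Verdu that the paper extends can be checked numerically in the simplest case. For a standard Gaussian input X on the channel Y = sqrt(snr) X + N with N ~ N(0, 1), the closed forms I(snr) = (1/2) log(1 + snr) and mmse(snr) = 1/(1 + snr) are classical, and the relationship states dI/dsnr = (1/2) mmse(snr); this sketch verifies it by finite differences (the snr value is arbitrary):

```python
import numpy as np

def I(snr):
    """Mutual information (nats) for a Gaussian input on a scalar AWGN channel."""
    return 0.5 * np.log(1.0 + snr)

def mmse(snr):
    """Minimum mean-square error of estimating X from Y at this snr."""
    return 1.0 / (1.0 + snr)

snr, h = 2.0, 1e-5
lhs = (I(snr + h) - I(snr - h)) / (2 * h)   # central-difference dI/dsnr
rhs = 0.5 * mmse(snr)
```

The agreement of the two sides at every snr is exactly the I-MMSE identity in this feedback-free, memoryless special case.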
IEEE Transactions on Information Theory | 2015
Guangyue Han
Inspired by ideas from the field of stochastic approximation, we propose a randomized algorithm to compute the capacity of a finite-state channel with a Markovian input. When the mutual information rate of the channel is concave with respect to the chosen parameterization, the proposed algorithm provably converges to the capacity of the channel almost surely, with the derived convergence rate. We also discuss the convergence behavior of the algorithm without the concavity assumption.
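The stochastic approximation idea underlying such algorithms goes back to Robbins and Monro: ascend a concave objective using only noisy gradient observations, with step sizes that decay like 1/t. This toy sketch is not the paper's algorithm; it only illustrates the principle on a hypothetical concave objective f(theta) = -(theta - 0.3)^2:

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_grad(theta):
    """Exact gradient -2*(theta - 0.3) corrupted by Gaussian noise,
    standing in for a noisy estimate of the mutual information gradient."""
    return -2.0 * (theta - 0.3) + 0.1 * rng.standard_normal()

theta = 0.9                       # arbitrary starting point
for t in range(1, 20_001):
    theta += (1.0 / t) * noisy_grad(theta)   # Robbins-Monro step a_t = 1/t
```

Under concavity, iterates of this type converge almost surely to the maximizer despite never observing an exact gradient, which is the mechanism the abstract's concavity assumption enables.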