Featured Research

Information Theory

Bayes-Optimal Convolutional AMP

To improve the convergence properties of approximate message-passing (AMP), convolutional AMP (CAMP) has been proposed. CAMP replaces the Onsager correction in AMP with a convolution of messages from all preceding iterations, while using the same low-complexity matched filter (MF) as AMP. This paper derives state evolution (SE) equations to design the Bayes-optimal denoiser in CAMP. Numerical results imply that CAMP with the Bayes-optimal denoiser, called Bayes-optimal CAMP, can achieve the Bayes-optimal performance for right-orthogonally invariant sensing matrices with low-to-moderate condition numbers.
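
As a schematic comparison (the notation below is assumed for illustration and is not taken from the paper), AMP's residual update carries a single-term Onsager correction, while CAMP keeps the matched filter but corrects the residual with a convolution of all preceding residual messages, whose coefficients are what the SE analysis is used to design:

\[
\text{AMP:}\qquad z^{t} = y - A\hat{x}^{t} + \tfrac{N}{M}\,\bigl\langle \eta'_{t-1} \bigr\rangle\, z^{t-1}, \qquad \hat{x}^{t+1} = \eta_{t}\bigl(\hat{x}^{t} + A^{\mathsf{T}} z^{t}\bigr),
\]
\[
\text{CAMP (schematically):}\qquad z^{t} = y - A\hat{x}^{t} + \sum_{\tau=1}^{t} \theta_{t,\tau}\, z^{t-\tau},
\]

with the coefficients \(\theta_{t,\tau}\) and the Bayes-optimal denoiser \(\eta_t\) designed via the SE equations.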

Read more
Information Theory

Beamformer Design with Smooth Constraint-Free Approximation in Downlink Cloud Radio Access Networks

It is known that data rates in standard cellular networks are limited by inter-cell interference. An effective solution to this problem is multi-cell cooperation. In Cloud Radio Access Networks, a candidate architecture for 5G and beyond, cooperation is realized through central processors (CPs) connected to simple remote radio heads over finite-capacity fronthaul links. In this study, we consider a downlink scenario and aim to minimize the total transmitted power by designing beamformers. We consider the case where perfect channel state information is not available at the CP. The original problem includes discontinuous terms and many constraints. We propose a novel method that transforms the problem into a smooth, constraint-free form, and a solution is found by gradient descent. For comparison, we consider the optimal method, which solves an extensive number of convex sub-problems, a known heuristic search algorithm, and several sparse solution techniques. Heuristic search methods find a solution by solving a subset of all possible convex sub-problems. Sparse techniques apply a norm approximation (ℓ0/ℓ1, ℓ0/ℓ2) or a convex approximation to make the objective function more tractable. We also derive a theoretical performance bound in order to observe how far the proposed method is from the optimal one when running the optimal method is prohibitive due to its computational complexity. Detailed simulations show that the performance of the proposed method is close to the optimal one and that it outperforms the other methods analyzed.
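
As a generic illustration of the smoothing idea only (the surrogate below is a common device and is not claimed to be the paper's exact formulation; g_k, γ_k, and λ are placeholder symbols), a constrained power-minimization problem

\[
\min_{\{w_k\}} \; \sum_{k} \|w_k\|^{2} \qquad \text{s.t.} \qquad g_k(\{w_k\}) \ge \gamma_k, \quad k = 1,\dots,K,
\]

can be turned into a smooth, constraint-free surrogate such as

\[
\min_{\{w_k\}} \; \sum_{k} \|w_k\|^{2} \;+\; \lambda \sum_{k} \log\!\bigl(1 + e^{\,\gamma_k - g_k(\{w_k\})}\bigr),
\]

which is differentiable everywhere and can be handled by plain gradient descent; the discontinuous terms mentioned in the abstract would likewise need smooth surrogates.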

Read more
Information Theory

Belief Propagation List Ordered Statistics Decoding of Polar Codes

It is shown how to combine ordered statistics decoding (OSD) with CRC-aided belief propagation list (CBPL) decoding of polar codes. Even when the reprocessing order of the OSD is as low as one, the new decoder is shown to significantly improve on CBPL. For reprocessing orders higher than one, we suggest partial reprocessing, in which only error patterns associated with the least reliable part of the most reliable independent bits, as obtained from belief propagation decoding, are considered. This type of partial reprocessing offers a trade-off between performance and computational complexity.
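
The sketch below illustrates order-1 reprocessing and the "partial" restriction on a toy systematic code; for simplicity it assumes the most reliable independent positions coincide with the systematic positions (a real OSD finds them by Gaussian elimination), and none of the names or values are taken from the paper.

```python
import numpy as np

# Toy systematic code: G = [I_4 | P] for a (7,4) code.
P = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 1, 1],
              [1, 0, 1]])
G = np.hstack([np.eye(4, dtype=int), P])

def reencode(info_bits):
    """Re-encode a length-4 information word into a length-7 codeword."""
    return info_bits @ G % 2

# Illustrative soft values: sign = hard decision, magnitude = reliability.
soft = np.array([2.1, -0.3, 1.7, 0.9, -1.2, 0.4, 1.5])
hard = (soft < 0).astype(int)

# Order-0 candidate: hard decisions on the (assumed) most reliable independent bits.
base_info = hard[:4]
candidates = [reencode(base_info)]

# Order-1 reprocessing flips one of these bits at a time; "partial" reprocessing
# restricts the flips to the least reliable half of them.
least_reliable = np.argsort(np.abs(soft[:4]))[:2]
for pos in least_reliable:
    info = base_info.copy()
    info[pos] ^= 1
    candidates.append(reencode(info))

# Keep the candidate that agrees best with the received soft values.
def metric(cw):
    return np.sum((1 - 2 * cw) * soft)

best = max(candidates, key=metric)
print("decoded codeword:", best)
```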

Read more
Information Theory

Belief-Propagation Decoding of LDPC Codes with Variable Node-Centric Dynamic Schedules

Belief propagation (BP) decoding of low-density parity-check (LDPC) codes with various dynamic decoding schedules has been proposed to improve the efficiency of the conventional flooding schedule. As the ultimate goal of an ideal LDPC decoder is to make correct bit decisions, a dynamic decoding schedule should be variable node (VN)-centric and able to find the VNs that are likely to be incorrectly decided yet have a good chance of being corrected if chosen for update. We propose a novel and effective metric called conditional innovation (CI) which serves this design goal well. To make the most of dynamic scheduling, which produces high-reliability bit decisions, we limit the search for candidate VNs to those related to the most recently updated nodes. Based on the CI metric and the new search guideline, separately or in combination, we develop several highly efficient decoding schedules. To reduce decoding latency, we introduce multi-edge updating versions which offer additional latency-performance trade-offs. Numerical results show that both the single-edge and multi-edge algorithms provide better decoding performance than most dynamic schedules, and that the CI-based algorithms are particularly effective in the first few decoding iterations.
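
As a runnable illustration of dynamic scheduling in general, here is a minimal edge-wise residual-BP sketch on a toy parity-check matrix; it uses a plain message-residual metric, not the paper's conditional innovation (CI) metric or its VN-centric candidate restriction, and the matrix and LLR values are made up for illustration.

```python
import numpy as np

# Toy parity-check matrix (rows = check nodes, cols = variable nodes).
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])
m, n = H.shape
llr_ch = np.array([-0.8, 1.9, 0.4, 2.1, -1.3, 0.9])  # channel LLRs (illustrative)

Q = {(i, j): llr_ch[j] for i in range(m) for j in range(n) if H[i, j]}  # VN -> CN
R = {(i, j): 0.0 for (i, j) in Q}                                       # CN -> VN

def check_update(i, j):
    """Min-sum CN->VN message from check i to variable j."""
    others = [Q[(i, k)] for k in range(n) if H[i, k] and k != j]
    return np.prod(np.sign(others)) * min(abs(v) for v in others)

for _ in range(30):                              # dynamic-schedule steps
    # residual = how much each CN->VN message would change if updated now
    residuals = {e: abs(check_update(*e) - R[e]) for e in R}
    e_star = max(residuals, key=residuals.get)
    if residuals[e_star] < 1e-6:
        break
    i, j = e_star
    R[e_star] = check_update(i, j)               # update the most "informative" message
    # propagate: refresh VN->CN messages leaving the destination VN j
    for i2 in range(m):
        if H[i2, j] and i2 != i:
            Q[(i2, j)] = llr_ch[j] + sum(R[(i3, j)] for i3 in range(m)
                                         if H[i3, j] and i3 != i2)

posterior = llr_ch + np.array([sum(R[(i, j)] for i in range(m) if H[i, j])
                               for j in range(n)])
print("hard decisions:", (posterior < 0).astype(int))
```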

Read more
Information Theory

Beurling-type density criteria for system identification

This paper addresses the problem of identifying a linear time-varying (LTV) system characterized by a (possibly infinite) discrete set of delay-Doppler shifts without a lattice (or other geometry-discretizing) constraint on the support set. Concretely, we show that a class of such LTV systems is identifiable whenever the upper uniform Beurling density of the delay-Doppler support sets, measured uniformly over the class, is strictly less than 1/2. The proof of this result reveals an interesting relation between LTV system identification and interpolation in the Bargmann-Fock space. Moreover, we show that this density condition is also necessary for classes of systems invariant under time-frequency shifts and closed under a natural topology on the support sets. We furthermore show that identifiability guarantees robust recovery of the delay-Doppler support set, as well as the weights of the individual delay-Doppler shifts, both in the sense of asymptotically vanishing reconstruction error for vanishing measurement error.
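
For reference, the density notion used here is the standard upper uniform Beurling density of a discrete set Λ ⊂ ℝ² of delay-Doppler shifts,

\[
D^{+}(\Lambda) \;=\; \limsup_{r \to \infty} \; \sup_{z \in \mathbb{R}^{2}} \frac{\#\bigl(\Lambda \cap (z + [0, r]^{2})\bigr)}{r^{2}},
\]

and the identifiability threshold stated above is D⁺(Λ) < 1/2, measured uniformly over the class of support sets.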

Read more
Information Theory

Beyond Capacity: The Joint Time-Rate Region

The traditional notion of capacity studied in the context of memoryless network communication builds on the concept of block codes and requires that, for a sufficiently large blocklength n, all receiver nodes simultaneously decode their required information after n channel uses. In this work, we generalize the traditional capacity region by exploring the communication rates achievable when some receivers are required to decode their information before others, at different predetermined times; we refer to the result as the "time-rate" region. Through a reduction to the standard notion of capacity, we present an inner bound on the time-rate region. The time-rate region has previously been studied and characterized for the memoryless broadcast channel (with a sole common message) under the name "static broadcasting".
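
One way to make this precise (an illustrative formalization under assumed notation, not necessarily the paper's exact definition) is to fix decoding-time fractions β_1, ..., β_K ∈ (0, 1] and call a tuple of rate-time pairs

\[
\bigl((R_1, \beta_1), \dots, (R_K, \beta_K)\bigr)
\]

achievable if, for every ε > 0 and all sufficiently large n, there is a blocklength-n code under which receiver k decodes its rate-R_k message after only ⌊β_k n⌋ channel uses with error probability at most ε; the time-rate region is then the closure of the set of achievable tuples, and it collapses to the traditional capacity region when β_1 = ... = β_K = 1.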

Read more
Information Theory

Binary Polar Codes Based on Bit Error Probability

This paper introduces techniques to construct binary polar source/channel codes based on the bit error probability of successive-cancellation decoding. The polarization lemma is reconstructed in terms of the bit error probability, and techniques to compute this probability are then introduced. These techniques can be applied to the construction of polar codes and to the computation of lower and upper bounds on the block decoding error probability.
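
For context, bit error probabilities of the synthesized channels translate into block error bounds under successive-cancellation decoding in the standard way (shown here only to illustrate the last sentence; the paper's bounds may be stated differently): with information set A and P_b(i) the bit error probability of the i-th bit given correct previous decisions,

\[
\max_{i \in \mathcal{A}} P_{b}(i) \;\le\; P_{\mathrm{block}} \;\le\; \sum_{i \in \mathcal{A}} P_{b}(i).
\]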

Read more
Information Theory

Binary Subspace Chirps

We describe in detail the interplay between binary symplectic geometry and quantum computation, with the ultimate goal of constructing highly structured codebooks. Binary Chirps (BCs) are complex Grassmannian lines in N = 2^m dimensions used in deterministic compressed sensing and random/unsourced multiple access in wireless networks. Their entries are fourth roots of unity and can be described in terms of second-order Reed-Muller codes. Binary Subspace Chirps (BSSCs) are a unique collection of BCs of ranks ranging from r = 0 to r = m, embedded in N dimensions according to an on-off pattern determined by a rank-r binary subspace. This yields a codebook that is asymptotically 2.38 times larger than the codebook of BCs, has the same minimum chordal distance as the codebook of BCs, and whose alphabet is minimally extended from {±1, ±i} to {±1, ±i, 0}. Equivalently, we show that BSSCs are stabilizer states, and we characterize them as columns of a well-controlled collection of Clifford matrices. By construction, BSSCs inherit all the properties of BCs, which in turn makes them good candidates for a variety of applications. For applications in wireless communication, we use the rich algebraic structure of BSSCs to construct a low-complexity decoding algorithm that is reliable against Gaussian noise. In simulations, BSSCs exhibit an error probability comparable to or slightly lower than that of BCs, for both single-user and multi-user transmissions.
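
As a small illustration of the flavor of these codebooks (a hedged sketch based on the standard second-order Reed-Muller / binary-chirp construction, not on the paper's BSSC construction itself): a full-rank binary chirp in N = 2^m dimensions can be generated from a binary symmetric matrix S and a binary vector b, with entries that are fourth roots of unity.

```python
import numpy as np
from itertools import product

def binary_chirp(S, b):
    """Binary chirp w(x) = i^(x^T S x + 2 b^T x) / sqrt(2^m), x in F_2^m.

    S: m x m binary symmetric matrix, b: length-m binary vector.
    Entries are fourth roots of unity (up to the 1/sqrt(2^m) normalization).
    """
    m = len(b)
    w = np.empty(2 ** m, dtype=complex)
    for idx, x in enumerate(product((0, 1), repeat=m)):
        x = np.array(x)
        exponent = int((x @ S @ x + 2 * (b @ x)) % 4)
        w[idx] = 1j ** exponent
    return w / np.sqrt(2 ** m)

# Example with m = 2 (N = 4); S and b are illustrative choices.
S = np.array([[1, 0],
              [0, 1]])
b = np.array([1, 0])
w = binary_chirp(S, b)
print(np.round(w * 2, 3))                   # entries lie in {+1, -1, +i, -i} up to scaling
print(np.round(np.abs(np.vdot(w, w)), 3))   # unit norm
```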

Read more
Information Theory

Bit Error Rate Analysis for Reconfigurable Intelligent Surfaces with Phase Errors

In this paper, we analyze the error probability of reconfigurable intelligent surface (RIS)-enabled communication systems with quantized channel phase compensation over Rayleigh fading channels. The probability density and characteristic functions of the received signal amplitude are derived and used to compute exact expressions for the bit error rate (BER). The resulting expressions are general, as they hold for an arbitrary number of reflecting elements N and an arbitrary number of quantization levels L. We introduce an exact asymptotic analysis in the high signal-to-noise ratio (SNR) regime, from which we demonstrate, in particular, that the diversity order is N/2 when L = 2 and N when L > 2. The theoretical frameworks and findings are validated with the aid of Monte Carlo simulations.
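
For orientation, a commonly used received-signal model for this setting (the notation here is assumed, not copied from the paper) has N reflecting elements with per-hop fading amplitudes |h_n| and |g_n| and, after L-level phase quantization, a residual phase error φ_n uniformly distributed on [−π/L, π/L):

\[
r \;=\; \Bigl(\sum_{n=1}^{N} |h_n|\,|g_n|\, e^{j\varphi_n}\Bigr) s \;+\; w,
\]

so the BER analysis hinges on the distribution of the composite amplitude of this sum, and the diversity order quoted above is the high-SNR slope of the resulting BER.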

Read more
Information Theory

Blocked and Hierarchical Disentangled Representation From Information Theory Perspective

We propose a novel theoretical model, the blocked and hierarchical variational autoencoder (BHiVAE), to obtain better-disentangled representations. It is well known that information theory provides an excellent explanatory framework for neural networks, so we approach the disentanglement problem from an information-theoretic perspective. BHiVAE builds mainly on information bottleneck theory and the information maximization principle. Our main ideas are that (1) a block of neurons, rather than a single neuron node, is used to represent each attribute, so that it can contain enough information; and (2) a hierarchical structure is created with different attributes on different layers, so that the information within each layer can be segmented to ensure that the final representation is disentangled. Furthermore, we present supervised and unsupervised versions of BHiVAE, which differ mainly in how information is separated between different blocks. In supervised BHiVAE, we use the label information as the criterion for separating blocks. In unsupervised BHiVAE, without extra information, we use the Total Correlation (TC) measure to achieve independence, and we design a new prior distribution on the latent space to guide representation learning. BHiVAE exhibits excellent disentanglement results in experiments and superior classification accuracy in representation learning.
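
For reference, the Total Correlation mentioned above is the standard measure of statistical dependence; applied to latent blocks z_1, ..., z_B (notation assumed here), it reads

\[
\mathrm{TC}(z) \;=\; D_{\mathrm{KL}}\!\Bigl( q(z_1, \dots, z_B) \,\Big\|\, \prod_{b=1}^{B} q(z_b) \Bigr),
\]

which vanishes exactly when the blocks are mutually independent, so penalizing it pushes different attribute blocks toward independence.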

Read more
