
Publications


Featured research published by Ronit Bustin.


international symposium on information theory | 2009

An MMSE approach to the secrecy capacity of the MIMO Gaussian wiretap channel

Ronit Bustin; Ruoheng Liu; H. Vincent Poor; Shlomo Shamai

This paper provides a closed-form expression for the secrecy capacity of the multiple-input multiple-output (MIMO) Gaussian wiretap channel, under a power-covariance constraint. Furthermore, the paper specifies the input covariance matrix required in order to attain the capacity. The proof uses the fundamental relationship between information theory and estimation theory in the Gaussian channel, relating the derivative of the mutual information to the minimum mean-square error (MMSE). The proof provides the missing intuition regarding the existence and construction of an enhanced degraded channel that does not increase the secrecy capacity. The concept of enhancement has been used in a previous proof of the problem. Furthermore, the proof presents methods that can be used in proving other MIMO problems, using this fundamental relationship.
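The fundamental relationship the proof relies on is the scalar I-MMSE identity of Guo, Shamai and Verdú: for the channel Y = √snr·X + N, the derivative of the mutual information with respect to snr equals half the MMSE. A minimal numerical sketch, assuming a unit-power Gaussian input (for which both quantities have closed forms), checks this identity:

```python
# Numerical check of the scalar I-MMSE relationship: dI/dsnr = (1/2) * mmse(snr)
# for the channel Y = sqrt(snr) * X + N. Closed-form expressions for a
# unit-power Gaussian input are assumed purely for illustration.
import numpy as np

def mutual_info(snr):
    # I(snr) = 0.5 * ln(1 + snr) nats, unit-power Gaussian input
    return 0.5 * np.log1p(snr)

def mmse(snr):
    # mmse(snr) = 1 / (1 + snr), unit-power Gaussian input
    return 1.0 / (1.0 + snr)

snr = np.linspace(0.1, 10.0, 200)
h = 1e-6
derivative = (mutual_info(snr + h) - mutual_info(snr - h)) / (2 * h)

# The central difference matches half the MMSE across the whole snr range.
assert np.allclose(derivative, 0.5 * mmse(snr), atol=1e-5)
```

The same identity holds for an arbitrary input distribution, which is what makes it a proof tool rather than a Gaussian-only curiosity.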


IEEE Transactions on Information Theory | 2013

On MMSE Crossing Properties and Implications in Parallel Vector Gaussian Channels

Ronit Bustin; Miquel Payaró; Daniel Pérez Palomar; Shlomo Shamai

The scalar additive Gaussian noise channel has the “single crossing point” property between the minimum mean square error (MMSE) in the estimation of the input given the channel output, assuming a Gaussian input to the channel, and the MMSE assuming an arbitrary input. This paper extends the result to the parallel vector additive Gaussian channel in three phases. In the first phase, the channel matrix is the identity matrix and the Gaussian input is limited to a vector of i.i.d. Gaussian elements; the “single crossing point” property holds with respect to the signal-to-noise ratio, as in the scalar case. In the second phase, the channel matrix is arbitrary and the Gaussian input is limited to an independent Gaussian input; a “single crossing point” property is derived for each diagonal element of the MMSE matrix. In the third phase, the Gaussian input is allowed to be an arbitrary Gaussian random vector; a “single crossing point” property is derived for each eigenvalue of the difference matrix between the two MMSE matrices. These three extensions are then translated to new information theoretic properties of the mutual information, using the I-MMSE relationship, a fundamental relationship between estimation theory and information theory revealed by Guo and coworkers. The results of the last phase are also translated to a new property of Fisher information. Finally, the applicability of all three extensions to information theoretic problems is demonstrated through a proof of a special case of Shannon's vector entropy power inequality, a converse proof of the capacity region of the parallel degraded broadcast channel (BC) under an input per-antenna power constraint and under an input covariance constraint, and a converse proof of the capacity region of the compound parallel degraded BC under an input covariance constraint.
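The scalar version of the property can be seen numerically: the MMSE curve of a Gaussian input and that of an arbitrary input intersect at most once in snr. A sketch under assumed parameters (BPSK as the arbitrary input, Gaussian variance 0.4 chosen arbitrarily so that a crossing exists) counts the sign changes of the difference:

```python
# Scalar illustration of the "single crossing point" property: the MMSE of a
# Gaussian input and of an arbitrary (here BPSK) input cross at most once.
# The Gaussian variance 0.4 is an arbitrary choice made so a crossing occurs.
import numpy as np

def mmse_gaussian(snr, var=0.4):
    # Closed form for a Gaussian input of variance `var`
    return var / (1.0 + var * snr)

def mmse_bpsk(snr):
    # mmse(snr) = 1 - E[tanh(snr + sqrt(snr) Z)], Z ~ N(0, 1),
    # evaluated here by a simple Riemann sum over the Gaussian density.
    z = np.linspace(-10.0, 10.0, 4001)
    dz = z[1] - z[0]
    phi = np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)
    return np.array([1.0 - np.sum(phi * np.tanh(s + np.sqrt(s) * z)) * dz
                     for s in np.atleast_1d(snr)])

snr = np.linspace(0.01, 8.0, 400)
diff = mmse_gaussian(snr) - mmse_bpsk(snr)
crossings = np.count_nonzero(np.diff(np.sign(diff)))

# BPSK starts above the low-variance Gaussian MMSE, then decays faster:
# exactly one sign change on this grid.
assert diff[0] < 0 < diff[-1]
assert crossings == 1
```

The paper's contribution is showing how this scalar picture survives, per diagonal entry and per eigenvalue, in the parallel vector channel.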


IEEE Transactions on Information Theory | 2013

MMSE of “Bad” Codes

Ronit Bustin; Shlomo Shamai

We examine codes, over the additive Gaussian noise channel, designed for reliable communication at some specific signal-to-noise ratio (SNR) and constrained by the permitted minimum mean-square error (MMSE) at lower SNRs. The maximum possible rate is below point-to-point capacity, and hence, these are nonoptimal codes (alternatively referred to as “bad” codes). We show that the maximum possible rate is the one attained by superposition codebooks. Moreover, the MMSE and mutual information behavior as a function of SNR, for any code attaining the maximum rate under the MMSE constraint, is known for all SNR. We also provide a lower bound on the MMSE for finite length codes, as a function of the error probability of the code.


international symposium on information theory | 2014

The effect of maximal rate codes on the interfering message rate

Ronit Bustin; H. Vincent Poor; Shlomo Shamai

It is shown that, for a subset of input distributions, the effect of maximum-rate transmission on an additional transmitted message over the additive Gaussian noise channel is, effectively, that of additional additive Gaussian noise; that is, the mutual information and minimum mean-square error behave as if additional additive Gaussian noise were present. This observation yields corner points of the two-user Gaussian interference channel capacity region for this subset.


international symposium on information theory | 2010

On MMSE properties and I-MMSE implications in parallel MIMO Gaussian channels

Ronit Bustin; Miquel Payaró; Daniel Pérez Palomar; Shlomo Shamai

This paper extends the “single crossing point” property of the scalar MMSE function, derived by Guo, Shamai and Verdú (first presented in ISIT 2008), to the parallel degraded MIMO scenario. It is shown that the matrix Q(t), which is the difference between the MMSE assuming a Gaussian input and the MMSE assuming an arbitrary input, has, at most, a single crossing point for each of its eigenvalues. Together with the I-MMSE relationship, a fundamental connection between Information Theory and Estimation Theory, this new property is employed to derive results in Information Theory. As a simple application of this property we provide an alternative converse proof for the broadcast channel (BC) capacity region under covariance constraint in this specific setting.


information theory workshop | 2015

On MMSE properties of optimal codes for the Gaussian wiretap channel

Ronit Bustin; Rafael F. Schaefer; H. Vincent Poor; Shlomo Shamai

This work examines the properties of “good” codes for the scalar Gaussian wiretap channel that achieve the maximum level of equivocation. Specifically, the minimum mean-square error (MMSE) behavior of these codes is explored as a function of the signal-to-noise ratio (SNR). It is first shown that reliable decoding of the codeword at the legitimate receiver and at the eavesdropper, conditioned on the transmitted message, is a necessary and sufficient condition for an optimally secure code sequence. Moreover, it is observed that a stochastic encoder is required for any code sequence with rate below the channel's point-to-point capacity. Then, for code sequences attaining the maximum level of equivocation, it is shown that their codebook sequences must resemble “good”, capacity-achieving, point-to-point code sequences. Finally, it is shown that the mapping over such “good” codebook sequences that produces a maximum equivocation code must saturate the eavesdropper. These results support several “rules of thumb” in the design of capacity-achieving codes for the Gaussian wiretap channel.


IEEE Transactions on Information Theory | 2016

On the SNR-Evolution of the MMSE Function of Codes for the Gaussian Broadcast and Wiretap Channels

Ronit Bustin; Rafael F. Schaefer; H. Vincent Poor; Shlomo Shamai Shitz

This paper considers the signal-to-noise ratio (SNR)-evolution, meaning the behavior as a function of the SNR, of the minimum mean-square error (MMSE) function of code sequences in several multi-user settings in the additive white Gaussian noise regime. The settings investigated in this context include the Gaussian wiretap channel, the Gaussian broadcast channel (BC), and the Gaussian BC with confidential messages (BCC). This paper shows that the specific properties of the SNR-evolution of the MMSE and conditional MMSE functions are necessary and sufficient conditions for capacity or equivocation achieving code sequences. In some cases, the complete SNR-evolution of a family of code sequences can be determined, providing significant insight into the disturbance (in terms of MMSE) such codes have on unintended receivers at other SNRs. Moreover, the effects of an additional MMSE constraint on the capacity region and on the SNR-evolution of code sequences are considered in the BC and BCC settings. Such an analysis emphasizes the tradeoff between rates and limited disturbance on unintended receivers.


convention of electrical and electronics engineers in israel | 2010

The I-MMSE approach on the weak Gaussian Z-interference channel and the type I Gaussian Broadcast-Z-interference channel

Ronit Bustin; Shlomo Shamai

A fundamental relationship between estimation theory and information theory for Gaussian channels was derived by Guo, Shamai and Verdú (first presented at ISIT 2008); in particular, it was shown that for the standard MIMO Gaussian channel, the mutual information and the minimum mean-square error (MMSE) are related. This fundamental relationship and its generalizations, referred to as the I-MMSE relationships, have already proved useful in several aspects of information theory. An inherent property of the MMSE is the “single crossing point” property: as a function of snr, the MMSE of the Gaussian input distribution and the MMSE of an arbitrary input distribution intersect at most once. In this work we use this property and its recent extensions to the MIMO scenario. We consider two variations of the interference channel: the Gaussian Z-interference channel and the type I Gaussian Broadcast-Z-interference channel, and use the I-MMSE approach to derive outer bounds on the capacity region of these channels. The Z-interference problem is a simplified case of the more general interference channel, for which the capacity region is unknown in general. The Gaussian Z-interference channel in its standard form is given by:

Y1 = X1 + √a X2 + N1
Y2 = X2 + N2

where N1 and N2 are standard Gaussian and may be considered independent. We consider power constraints Pi, i ∈ {1, 2}, on both inputs. For a ∈ (0, 1), that is, weak interference, there is no known single-letter expression for the capacity region. The best known general outer bound is due to Sato. Using Ahlswede's limiting expression for the capacity region, we re-derive Sato's outer bound directly from the limiting expression using the I-MMSE approach and show why this outer bound cannot be tight in general. As an additional example of the I-MMSE approach, we examine the type I Gaussian Broadcast-Z-interference channel, recently presented by Shang and Poor. For this channel we show that the I-MMSE approach yields an equivalent, more natural and insightful outer bound for the weak interference case.
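The standard-form model in the abstract is easy to make concrete. The sketch below simulates the channel and computes the classical Gaussian-input rate at which receiver 1 treats the interference as noise; the rate expression is included only as a familiar baseline, not as the bound derived in the paper:

```python
# Minimal sketch of the weak Gaussian Z-interference channel in standard form:
#   Y1 = X1 + sqrt(a) * X2 + N1,   Y2 = X2 + N2,   a in (0, 1).
# N1, N2 are standard Gaussian. The rate expression is the classical Gaussian
# treat-interference-as-noise rate, shown only to make the model concrete.
import numpy as np

def z_interference(x1, x2, a, rng):
    n1 = rng.standard_normal(x1.shape)
    n2 = rng.standard_normal(x2.shape)
    y1 = x1 + np.sqrt(a) * x2 + n1   # receiver 1 suffers interference from user 2
    y2 = x2 + n2                      # receiver 2 is interference-free
    return y1, y2

def tin_rates(p1, p2, a):
    # Gaussian inputs; receiver 1 treats interference as extra noise (nats)
    r1 = 0.5 * np.log1p(p1 / (1.0 + a * p2))
    r2 = 0.5 * np.log1p(p2)
    return r1, r2

rng = np.random.default_rng(0)
x1 = rng.standard_normal(10_000)   # unit-power inputs, P1 = P2 = 1
x2 = rng.standard_normal(10_000)
y1, y2 = z_interference(x1, x2, a=0.5, rng=rng)
r1, r2 = tin_rates(1.0, 1.0, 0.5)
```

Any a > 0 strictly reduces r1 below the interference-free point-to-point rate, which is the tension the outer bounds in the paper quantify.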


international symposium on information theory | 2015

On MMSE properties of “good” and “bad” codes for the Gaussian broadcast channel

Ronit Bustin; Rafael F. Schaefer; H. Vincent Poor; Shlomo Shamai

This work examines the properties of code sequences for the scalar Gaussian broadcast channel (BC). Specifically, the behavior in terms of the mutual information and minimum mean-square error (MMSE) functions for all signal-to-noise ratios (SNRs) is explored. It is shown that “good”, capacity-achieving, code sequences must follow the behavior of a capacity-achieving superposition code sequence, even if they use a different encoding-decoding scheme (such as “Dirty Paper Coding”). Necessary and sufficient conditions for reliable decoding in general, and specifically for “good” code sequences for the scalar Gaussian BC, are derived in terms of the MMSE and conditional MMSE functions. Finally, “bad” code sequences, which do not achieve the capacity of the scalar Gaussian BC, are examined. These codes are defined by an additional MMSE constraint at some other SNR. This constraint limits the amount of disturbance these codes may have on some unintended receiver at that SNR. The capacity region, given this constraint, is fully depicted.


international symposium on information theory | 2017

On additive channels with generalized Gaussian noise

Alex Dytso; Ronit Bustin; H. Vincent Poor; Shlomo Shamai Shitz

This paper considers a problem of communication over an additive noise channel where the noise is distributed according to a Generalized Gaussian (GG) distribution. In the first part of the paper, a number of properties of the family of GG distributions are derived which are of independent interest. For example, considerable attention is given to the properties of the characteristic function of the GG distribution. In the second part of the paper, the capacity of an additive noise channel with GG noise is considered under p-th absolute moment constraints. It is shown that, even though Shannon's upper bound is achievable in some instances, in general such achievability is not possible. Moreover, it is shown that discrete inputs can achieve capacity within a constant gap or full degree of freedom for any p-th absolute moment constraint. Following the seminal work of Smith, the paper also gives a condition under which discrete inputs are exactly optimal.

Collaboration


Dive into Ronit Bustin's collaborations.

Top Co-Authors

Shlomo Shamai Shitz
Technion – Israel Institute of Technology

Rafael F. Schaefer
Technical University of Berlin

Daniela Tuninetti
University of Illinois at Chicago

Natasha Devroye
University of Illinois at Chicago

Daniel Pérez Palomar
Hong Kong University of Science and Technology

Miquel Payaró
Hong Kong University of Science and Technology