Publication


Featured research published by Alex Dytso.


IEEE Transactions on Information Theory | 2016

Interference as Noise: Friend or Foe?

Alex Dytso; Daniela Tuninetti; Natasha Devroye

This paper shows that for the two-user Gaussian interference channel (G-IC) treating interference as noise without time sharing (TINnoTS) achieves the closure of the capacity region to within either a constant gap, or to within a gap of the order O(log(ln(min(S, I))/γ)) up to a set of Lebesgue measure γ ∈ (0, 1], where S is the largest signal-to-noise ratio on the direct links and I is the largest interference-to-noise ratio on the cross links. As a consequence, TINnoTS is optimal from a generalized degrees of freedom (gDoF) perspective for all channel gains except for a subset of zero measure. TINnoTS with Gaussian inputs is known to be optimal to within 1/2 bit for a subset of the weak interference regime. Rather surprisingly, this paper shows that TINnoTS is gDoF optimal in all parameter regimes, even in the strong and very strong interference regimes where joint decoding of Gaussian inputs is optimal. For approximate optimality of TINnoTS in all parameter regimes, it is critical to use non-Gaussian inputs. This paper thus proposes to use mixed inputs as channel inputs for the G-IC, where a mixed input is the sum of a discrete and a Gaussian random variable. Interestingly, with reference to the Han-Kobayashi achievable scheme, the discrete part of a mixed input is shown to effectively behave as a common message in the sense that, although treated as noise, its effect on the achievable rate region is as if it were jointly decoded together with the desired messages at a non-intended receiver. The practical implication is that a discrete interfering input is a friend, while a Gaussian interfering input is in general a foe. This paper also discusses other practical implications of the proposed TINnoTS scheme with mixed inputs. Since TINnoTS requires neither explicit joint decoding nor time sharing, the results of this paper are applicable to a variety of oblivious or asynchronous channels, such as the block asynchronous G-IC (which is not an information stable channel) and the G-IC with partial codebook knowledge at one or more receivers.
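As a point of reference for the mixed inputs proposed above, a mixed input is a power-split sum of a discrete and a Gaussian part; the parameterization below is a generic sketch (the paper's exact constellations and power splits are not reproduced here):

    X = \sqrt{1-\delta}\, X_D + \sqrt{\delta}\, X_G, \qquad \delta \in [0,1],

where X_D is a unit-power discrete random variable (e.g., PAM), X_G \sim \mathcal{N}(0,1), and the two are independent, so X meets a unit power constraint. Under TINnoTS each receiver decodes only its own message, so the achievable rates are R_i \le I(X_i; Y_i), i = 1, 2, with the other user's input lumped into the noise.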


IEEE Transactions on Information Theory | 2015

On the Two-User Interference Channel With Lack of Knowledge of the Interference Codebook at One Receiver

Alex Dytso; Daniela Tuninetti; Natasha Devroye

In multiuser information theory, it is often assumed that every node in the network possesses all codebooks used in the network. This assumption may be impractical in distributed ad hoc, cognitive, or heterogeneous networks. This paper considers the two-user interference channel with one oblivious receiver (IC-OR), i.e., one receiver lacks knowledge of the interfering codebook, whereas the other receiver knows both codebooks. This paper asks whether, and if so how much, the channel capacity of the IC-OR is reduced compared with that of the classical IC where both receivers know all codebooks. A novel outer bound is derived and shown to be achievable to within a gap for the class of injective semideterministic IC-ORs; the gap is shown to be zero for injective fully deterministic IC-ORs. An exact capacity result is shown for the general memoryless IC-OR when the nonoblivious receiver experiences very strong interference. For the linear deterministic IC-OR that models the Gaussian noise channel at high SNR, non-i.i.d. Bernoulli(1/2) input bits are shown to achieve points not achievable by i.i.d. Bernoulli(1/2) input bits used in the same achievability scheme. For the real-valued Gaussian IC-OR, the gap is shown to be at most 1/2 bit per channel use, even though the set of optimal input distributions for the derived outer bound could not be determined. Toward understanding the Gaussian IC-OR, an achievability strategy is evaluated in which the input alphabets at the nonoblivious transmitter are a mixture of discrete and Gaussian random variables, where the cardinality of the discrete part is appropriately chosen as a function of the channel parameters. Surprisingly, even though the oblivious receiver intuitively should not be able to jointly decode the intended and interfering messages (whose codebook is unavailable), it is shown that with this choice of input, the capacity region of the symmetric Gaussian IC-OR is to within (1/2) log(12πe) ≈ 3.34 bits (per channel use per user) of an outer bound for the classical Gaussian IC with full codebook knowledge at both receivers.
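For concreteness, the constant gap quoted at the end of the abstract evaluates as follows (logarithms base 2, in bits):

    \frac{1}{2}\log_2(12\pi e) = \frac{1}{2}\log_2(12 \times 3.1416 \times 2.7183) \approx \frac{1}{2}\log_2(102.45) \approx \frac{6.68}{2} \approx 3.34.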


information theory and applications | 2014

On discrete alphabets for the two-user Gaussian interference channel with one receiver lacking knowledge of the interfering codebook

Alex Dytso; Daniela Tuninetti; Natasha Devroye

In multi-user information theory, it is often assumed that every node in the network possesses all codebooks used in the network. This assumption is, however, impractical in distributed ad-hoc and cognitive networks. This work considers the two-user Gaussian Interference Channel with one Oblivious Receiver (G-IC-OR), i.e., one receiver lacks knowledge of the interfering codebook while the other receiver knows both codebooks. We ask whether, and if so by how much, the channel capacity of the G-IC-OR is reduced compared to that of the classical G-IC where both receivers know all codebooks. Intuitively, the oblivious receiver should not be able to jointly decode its intended message along with the unintended interfering message whose codebook is unavailable. We demonstrate that in strong and very strong interference, where joint decoding is capacity achieving for the classical G-IC, lack of codebook knowledge does not reduce performance in terms of generalized degrees of freedom (gDoF). Moreover, we show that the sum-capacity of the symmetric G-IC-OR is to within O(log(log(SNR))) of that of the classical G-IC. The key novelty of the proposed achievable scheme is the use of a discrete input alphabet for the non-oblivious transmitter, whose cardinality is appropriately chosen as a function of SNR.
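The intuition behind the SNR-dependent cardinality can be sketched with an Ozarow-Wyner-style calculation; the scaling below is illustrative only, and the paper's exact choice may differ. A uniform N-point PAM input carries H(X_D) = \log N bits and, as long as its points stay well separated relative to the noise, achieves I(X_D; Y) \ge H(X_D) - O(1). Choosing

    N \approx \sqrt{1 + \mathrm{SNR}} \quad\Longrightarrow\quad I(X_D; Y) \ge \tfrac{1}{2}\log(1 + \mathrm{SNR}) - O(1),

i.e., a rate within a constant of the Gaussian point-to-point capacity, while the oblivious receiver can still treat this discrete input as noise.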


international symposium on information theory | 2014

On Gaussian interference channels with mixed Gaussian and discrete inputs

Alex Dytso; Natasha Devroye; Daniela Tuninetti

This paper studies the sum-rate of a class of memoryless, real-valued additive white Gaussian noise interference channels (IC) achievable by treating interference as noise (TIN). We develop and analytically characterize the rates achievable by a new strategy that uses superpositions of Gaussian and discrete random variables as channel inputs. Surprisingly, we demonstrate that TIN is sum generalized degrees of freedom (gDoF) optimal and achieves to within an additive gap of O(1) or O(log log(SNR)) of the symmetric sum-capacity of the classical IC. We also demonstrate connections to other channels such as the IC with partial codebook knowledge and the block asynchronous IC.
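The following small Monte Carlo sketch (an illustration, not the paper's construction: the PAM constellation, the parameter values, and the modeling of the interference as Gaussian noise of power I are assumptions made here) estimates the TIN rate I(X; Y) of a discrete input and compares it with the Gaussian-input TIN rate (1/2) log2(1 + S/(1+I)):

import numpy as np

def mi_pam_tin(S, I, num_points, n_samples=200_000, seed=0):
    """Monte Carlo estimate of I(X; Y) in bits for a uniform PAM input of
    power S over Y = X + W, where W has variance 1 + I: unit receiver
    noise plus a Gaussian interferer of power I treated as noise."""
    rng = np.random.default_rng(seed)
    noise_var = 1.0 + I
    pts = np.arange(num_points) - (num_points - 1) / 2.0   # centered PAM points
    pts = pts * np.sqrt(S / np.mean(pts**2))               # scale to power S
    x = rng.choice(pts, size=n_samples)
    y = x + rng.normal(scale=np.sqrt(noise_var), size=n_samples)
    diffs = y[:, None] - pts[None, :]                      # for the mixture density of Y
    phi = np.exp(-diffs**2 / (2 * noise_var)) / np.sqrt(2 * np.pi * noise_var)
    h_y = -np.mean(np.log2(phi.mean(axis=1)))              # H(Y) via E[-log2 f_Y(Y)]
    h_w = 0.5 * np.log2(2 * np.pi * np.e * noise_var)      # h(Y | X) = h(W)
    return h_y - h_w

S, I = 100.0, 10.0                                         # linear SNR and INR
print(f"Gaussian-input TIN rate: {0.5 * np.log2(1 + S / (1 + I)):.3f} bits")
print(f"PAM-input TIN rate (N=8): {mi_pam_tin(S, I, 8):.3f} bits")

With these (arbitrary) parameters the discrete input comes close to the Gaussian-input TIN rate; the paper's point is that, unlike a Gaussian input, a discrete input remains benign when it appears as interference at the other receiver.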


international conference on communications | 2012

On the capacity of the symmetric interference channel with a cognitive relay at high SNR

Alex Dytso; Natasha Devroye; Daniela Tuninetti

The capacity of the Interference Channel with a Cognitive Relay, a channel model which generalizes the broadcast, interference, and cognitive interference channels, is still an open question. Towards understanding this complex channel, we first consider the binary linear deterministic model that approximates the Gaussian channel at high SNR. We consider symmetric channel gains and show achievability of a tightened version of a previously known outer bound for almost all channel parameters. Of particular interest in this channel model is how the cognitive relay may be used to simultaneously relay as well as cancel/neutralize interference at the two receivers. The capacity-achieving schemes use combinations of three main strategies at the cognitive relay, which we term bit cancellation, bit sharing, and bit (self-)cleaning. We highlight the capacity-achieving schemes in the different regimes, pointing out some of the interesting new behaviors seen at the cognitive relay.
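For readers unfamiliar with it, the binary linear deterministic model mentioned above (due to Avestimehr, Diggavi, and Tse) replaces real-valued signals with length-q binary vectors and channel gains with bit shifts; the gain assignment below for the cognitive-relay setting is schematic:

    y_1 = S^{q-n_{11}} x_1 \oplus S^{q-n_{12}} x_2 \oplus S^{q-n_{1c}} x_c,

where x_1, x_2, x_c \in \mathbb{F}_2^q are the inputs of the two transmitters and the cognitive relay, S is the q \times q down-shift matrix, the integers n_{ij} approximate the point-to-point capacities of the individual links in bits, \oplus is modulo-2 addition, and receiver 2's output is defined analogously.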


international symposium on information theory | 2016

On the minimum mean p-th error in Gaussian noise channels and its applications

Alex Dytso; Ronit Bustin; Daniela Tuninetti; Natasha Devroye; H. Vincent Poor; Shlomo Shamai

The problem of estimating an arbitrary random vector from its observation corrupted by additive white Gaussian noise, where the cost function is taken to be the minimum mean p-th error (MMPE), is considered. The classical minimum mean square error (MMSE) is a special case of the MMPE. Several bounds, properties, and applications of the MMPE are derived and discussed. The optimal MMPE estimator is found for Gaussian and binary input distributions. Properties of the MMPE as a function of the input distribution, signal-to-noise ratio (SNR), and order p are derived. The "single-crossing-point property" (SCPP), which provides an upper bound on the MMSE, and which together with the mutual information-MMSE relationship is a powerful tool in deriving converse proofs in multi-user information theory, is extended to the MMPE. Moreover, a complementary bound to the SCPP is derived. As a first application of the MMPE, a bound on the conditional differential entropy in terms of the MMPE is provided, which then yields a generalization of the Ozarow–Wyner lower bound on the mutual information achieved by a discrete input on a Gaussian noise channel. As a second application, the MMPE is shown to improve on previous characterizations of the phase transition phenomenon that manifests, in the limit as the length of the capacity-achieving code goes to infinity, as a discontinuity of the MMSE as a function of SNR. As a final application, the MMPE is used to show new bounds on the second derivative of mutual information, or the first derivative of the MMSE.
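For reference, the MMPE generalizes the MMSE by replacing the quadratic cost with a p-th power cost. In the scalar case (the paper's vector definition may carry an additional normalization), with Y = \sqrt{\mathrm{snr}}\, X + Z and Z \sim \mathcal{N}(0,1):

    \mathrm{mmpe}(X, \mathrm{snr}, p) = \inf_{f} \mathbb{E}\big[ |X - f(Y)|^{p} \big],

so that p = 2 recovers the classical MMSE, for which the optimal estimator is the conditional expectation f(Y) = \mathbb{E}[X \mid Y].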


international symposium on information theory | 2013

On the capacity of interference channels with partial codebook knowledge

Alex Dytso; Natasha Devroye; Daniela Tuninetti



information theory and applications | 2016

On communications through a Gaussian noise channel with an MMSE disturbance constraint

Alex Dytso; Ronit Bustin; Daniela Tuninetti; Natasha Devroye; H. Vincent Poor; Shlomo Shamai Shitz



international symposium on information theory | 2017

On additive channels with generalized Gaussian noise

Alex Dytso; Ronit Bustin; H. Vincent Poor; Shlomo Shamai Shitz



international symposium on information theory | 2017

A generalized Ozarow-Wyner capacity bound with applications

Alex Dytso; Mario Goldenbaum; H. Vincent Poor; Shlomo Shamai Shitz


Collaboration


Dive into Alex Dytso's collaborations.

Top Co-Authors

Daniela Tuninetti
University of Illinois at Chicago

Natasha Devroye
University of Illinois at Chicago

Ronit Bustin
Technion – Israel Institute of Technology

Shlomo Shamai Shitz
Technion – Israel Institute of Technology