Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Aaron D. Wyner is active.

Publication


Featured research published by Aaron D. Wyner.


IEEE Transactions on Information Theory | 1976

The rate-distortion function for source coding with side information at the decoder

Aaron D. Wyner; Jacob Ziv

Let \{(X_k, Y_k)\}_{k=1}^{\infty} be a sequence of independent drawings of a pair of dependent random variables X, Y. Let us say that X takes values in the finite set \cal X. It is desired to encode the sequence \{X_k\} in blocks of length n into a binary stream of rate R, which can in turn be decoded as a sequence \{\hat{X}_k\}, where \hat{X}_k \in \hat{\cal X}, the reproduction alphabet. The average distortion level is (1/n) \sum_{k=1}^{n} E[D(X_k, \hat{X}_k)], where D(x, \hat{x}) \geq 0, x \in \cal X, \hat{x} \in \hat{\cal X}, is a preassigned distortion measure. The special assumption made here is that the decoder has access to the side information \{Y_k\}. In this paper we determine the quantity R^*(d), defined as the infimum of rates R such that (with \varepsilon > 0 arbitrarily small and with suitably large n) communication is possible in the above setting at an average distortion level (as defined above) not exceeding d + \varepsilon. The main result is that R^*(d) = \inf [I(X;Z) - I(Y;Z)], where the infimum is with respect to all auxiliary random variables Z (which take values in a finite set \cal Z) that satisfy: i) Y, Z conditionally independent given X; ii) there exists a function f: {\cal Y} \times {\cal Z} \rightarrow \hat{\cal X} such that E[D(X, f(Y,Z))] \leq d. Let R_{X|Y}(d) be the rate-distortion function which results when the encoder as well as the decoder has access to the side information \{Y_k\}. In nearly all cases it is shown that when d > 0 then R^*(d) > R_{X|Y}(d), so that knowledge of the side information at the encoder permits transmission of the \{X_k\} at a given distortion level using a smaller transmission rate.
This is in contrast to the situation treated by Slepian and Wolf [5] where, for arbitrarily accurate reproduction of \{X_k\}, i.e., d = \varepsilon for any \varepsilon > 0, knowledge of the side information at the encoder does not allow a reduction of the transmission rate.
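The characterization can be probed numerically: any test channel Z satisfying conditions i) and ii) yields an upper bound on R*(d). A minimal sketch, assuming a doubly symmetric binary source and one particular binary test channel (both chosen here purely for illustration, not taken from the paper):

```python
import numpy as np

def mutual_info(p_joint):
    """I(A;B) in bits from a joint pmf given as a 2-D array."""
    pa = p_joint.sum(axis=1, keepdims=True)
    pb = p_joint.sum(axis=0, keepdims=True)
    mask = p_joint > 0
    return float(np.sum(p_joint[mask] * np.log2(p_joint[mask] / (pa @ pb)[mask])))

# Doubly symmetric binary source: X ~ Bernoulli(1/2), Y = X xor N, N ~ Bernoulli(q).
q = 0.25
p_xy = np.array([[(1 - q) / 2, q / 2],
                 [q / 2, (1 - q) / 2]])          # p(x, y)

# One admissible test channel p(z|x): Z = X xor M, M ~ Bernoulli(t).
# Since Z depends on (X, Y) only through X, condition i) holds.
t = 0.1
p_z_given_x = np.array([[1 - t, t],
                        [t, 1 - t]])

p_x = p_xy.sum(axis=1)
p_xz = p_x[:, None] * p_z_given_x                 # p(x, z)
p_yz = np.einsum('xy,xz->yz', p_xy, p_z_given_x)  # p(y, z) via the chain Y - X - Z

rate = mutual_info(p_xz) - mutual_info(p_yz)
print(f"I(X;Z) - I(Y;Z) = {rate:.4f} bits")
```

Minimizing this quantity over all admissible Z (and distortion targets d met by some reconstruction f) would recover R*(d) itself; the snippet evaluates just one point of the feasible set.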


Vehicular Technology Conference | 1994

Information theoretic considerations for cellular mobile radio

Lawrence H. Ozarow; Shlomo Shamai; Aaron D. Wyner

We present some information-theoretic considerations used to determine upper bounds on the information rates that can be reliably transmitted over a two-ray propagation path mobile radio channel model, operating in a time-division multiple-access (TDMA) regime, under given decoding delay constraints. The sense in which reliability is measured is addressed, and in the interesting cases where the decoding delay constraint plays a significant role, the maximal achievable rate (capacity) is specified in terms of capacity versus outage. In this case, no coding capacity in the strict Shannon sense exists. Simple schemes for time and space diversity are examined, and their potential benefits are illuminated from an information-theoretic standpoint. In our presentation, we chose to specialize to the TDMA protocol for the sake of clarity and convenience. Our main arguments and results extend directly to certain variants of other multiple-access protocols such as code-division multiple access (CDMA) and frequency-division multiple access (FDMA), provided that no fast feedback from the receiver to the transmitter is available.
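The capacity-versus-outage notion can be illustrated with a deliberately simplified model: flat Rayleigh fading with a single ray, not the paper's two-ray channel, and with arbitrary choices of rate and SNR. Fixing a rate R, one asks how often the instantaneous mutual information falls below it:

```python
import numpy as np

rng = np.random.default_rng(0)

snr = 10.0   # average received SNR (linear scale); illustrative value
R = 2.0      # attempted rate in bits per channel use; illustrative value

# For Rayleigh fading the power gain |h|^2 is exponentially distributed.
gain = rng.exponential(1.0, size=200_000)
inst_rate = np.log2(1.0 + snr * gain)
p_out_mc = float(np.mean(inst_rate < R))

# Closed form for this toy model: P_out = 1 - exp(-(2^R - 1)/snr).
p_out_exact = 1.0 - np.exp(-(2.0**R - 1.0) / snr)
print(f"Monte Carlo outage: {p_out_mc:.4f}, closed form: {p_out_exact:.4f}")
```

An outage capacity is then the largest R whose outage probability stays below a prescribed level, which is the sense in which "capacity versus outage" replaces strict Shannon capacity under a delay constraint.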


IEEE Transactions on Information Theory | 1994

Shannon-theoretic approach to a Gaussian cellular multiple-access channel

Aaron D. Wyner

We obtain Shannon-theoretic limits for a very simple cellular multiple-access system. In our model the received signal at a given cell site is the sum of the signals transmitted from within that cell plus a factor \alpha (0 \leq \alpha \leq 1) times the sum of the signals transmitted from the adjacent cells plus ambient Gaussian noise. Although this simple model is scarcely realistic, it nevertheless has enough meat so that the results yield considerable insight into the workings of real systems. We consider both a one-dimensional linear cellular array and the familiar two-dimensional hexagonal cellular pattern. The discrete-time channel is memoryless. We assume that N contiguous cells have active transmitters in the one-dimensional case, and that N^2 contiguous cells have active transmitters in the two-dimensional case. There are K transmitters per cell. Most of our results are obtained for the limiting case as N \rightarrow \infty. The results include the following. (1) We define C_N, \hat{C}_N as the largest achievable rate per transmitter in the usual Shannon-theoretic sense in the one- and two-dimensional cases, respectively (assuming that all signals are jointly decoded). We find expressions for \lim_{N \rightarrow \infty} C_N and \lim_{N \rightarrow \infty} \hat{C}_N. (2) As the interference parameter \alpha increases from 0, C_N and \hat{C}_N increase or decrease according to whether the signal-to-noise ratio is less than or greater than unity. (3) Optimal performance is attainable using TDMA within the cell, but using TDMA for adjacent cells is distinctly suboptimal. (4) We suggest a scheme which does not require joint decoding of all the users, and is, in many cases, close to optimal.
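Claim (2) can be checked numerically for the one-dimensional model with K = 1 by evaluating the joint-decoding sum rate through a log-determinant; the cell count and the values of \alpha and SNR below are illustrative choices, not the paper's:

```python
import numpy as np

def per_user_rate(alpha, snr, n_cells=64):
    """Per-transmitter rate (bits/use) under joint decoding, one user per cell.
    Linear Wyner model: cell n hears its own user plus alpha times the users
    of its two adjacent cells, in unit-variance Gaussian noise."""
    H = np.eye(n_cells) + alpha * (np.eye(n_cells, k=1) + np.eye(n_cells, k=-1))
    _, logdet = np.linalg.slogdet(np.eye(n_cells) + snr * H @ H.T)
    return 0.5 * logdet / (np.log(2) * n_cells)

# Inter-cell interference helps at low SNR and hurts at high SNR, matching (2).
for snr in (0.1, 10.0):
    print(f"SNR={snr}: C(alpha=0)={per_user_rate(0.0, snr):.4f}, "
          f"C(alpha=0.4)={per_user_rate(0.4, snr):.4f}")
```

The intuition: at low SNR the extra received power from neighboring cells dominates, while at high SNR the capacity is governed by the logarithm of the channel's eigenvalue spread, which interference degrades.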


Information & Computation | 1978

The rate-distortion function for source coding with side information at the decoder-II: General sources

Aaron D. Wyner

In this paper we generalize (to nondiscrete sources) the results of a previous paper (Wyner and Ziv, 1976) on source coding with a fidelity criterion in a situation where the decoder (but not the encoder) has access to side information about the source. We define R*(d) as the minimum rate (in the usual Shannon sense) required for encoding the source at a distortion level of approximately d. The main result is the characterization of R*(d) by an information-theoretic minimization. In a special case in which the source and the side information are jointly Gaussian, it is shown that R*(d) is equal to the rate which would be required if the encoder (as well as the decoder) were informed of the side information.
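For the jointly Gaussian case with squared-error distortion, the no-rate-loss statement above takes a familiar closed form (writing \sigma^2_{X|Y} for the conditional variance of X given Y; stated here as the standard formulation of the result, not quoted from the paper):

```latex
% Jointly Gaussian (X, Y), distortion D(x,\hat{x}) = (x - \hat{x})^2:
R^{*}(d) \;=\; R_{X|Y}(d) \;=\; \frac{1}{2}\log\frac{\sigma^{2}_{X|Y}}{d},
\qquad 0 < d \le \sigma^{2}_{X|Y},
% with R^{*}(d) = 0 for d \ge \sigma^{2}_{X|Y}.
```

That is, encoder-side knowledge of \{Y_k\} buys nothing in the Gaussian-quadratic setting, in contrast to the strict inequality R^*(d) > R_{X|Y}(d) that holds in nearly all discrete cases.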


IEEE Transactions on Information Theory | 1997

Information-theoretic considerations for symmetric, cellular, multiple-access fading channels. II

Shlomo Shamai; Aaron D. Wyner

For Part I see ibid., vol. 43, no. 6, pp. 1877-94 (1997). A simple idealized linear (and planar) uplink, cellular, multiple-access communication model, where only adjacent-cell interference is present and all signals may experience fading, is considered. Shannon-theoretic arguments are invoked to gain insight into how the main system parameters and multiple-access techniques affect performance. The model treated in Part I (Shamai, 1997) is extended here to account for cell-site receivers that may also process the received signal at an adjacent cell site, thus compromising between the advantage of incorporating additional information from other cell sites on one hand and the associated excess processing complexity on the other. Various settings which include fading, time-division multiple access (TDMA), wideband (WB), and (optimized) fractional inter-cell time sharing (ICTS) protocols are investigated and compared. For the WB approach with a large number of users per cell it is found, surprisingly, that fading may enhance performance in terms of Shannon-theoretic achievable rates. The linear model is extended to account for general linear and planar configurations. The effect of a random number of users per cell is investigated, and it is demonstrated that randomization is beneficial. Certain aspects of diversity, as well as some features of TDMA and orthogonal code-division multiple-access (CDMA) techniques in the presence of fading, are studied in an isolated-cell scenario.


IEEE Transactions on Information Theory | 1975

On source coding with side information at the decoder

Aaron D. Wyner

Let \{(X_k, Y_k, V_k)\}_{k=1}^{\infty} be a sequence of independent copies of the triple (X, Y, V) of discrete random variables. We consider the following source coding problem with a side information network. This network has three encoders numbered 0, 1, and 2, the inputs of which are the sequences \{V_k\}, \{X_k\}, and \{Y_k\}, respectively. The output of encoder i is a binary sequence of rate R_i, i = 0, 1, 2. There are two decoders, numbered 1 and 2, whose task is to deliver essentially perfect reproductions of the sequences \{X_k\} and \{Y_k\}, respectively, to two distinct destinations. Decoder 1 observes the outputs of encoders 0 and 1, and decoder 2 observes the outputs of encoders 0 and 2. The sequence \{V_k\} and its binary encoding (by encoder 0) play the role of side information, which is available to the decoders only. We study the characterization of the family of rate triples (R_0, R_1, R_2) for which this system can deliver essentially perfect reproductions (in the usual Shannon sense) of \{X_k\} and \{Y_k\}. The principal result is a characterization of this family via an information-theoretic minimization. Two special cases are of interest. In the first, V = (X, Y), so that the encoding of \{V_k\} contains common information. In the second, Y \equiv 0, so that our problem becomes a generalization of the source coding problem with side information studied by Slepian and Wolf [3].


IEEE Transactions on Information Theory | 1989

Some asymptotic properties of the entropy of a stationary ergodic data source with applications to data compression

Aaron D. Wyner; Jacob Ziv

Theorems concerning the entropy of a stationary ergodic information source are derived and used to obtain insight into the workings of certain data-compression coding schemes, in particular the Lempel-Ziv data-compression algorithm.
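Entropy results of this kind underpin phrase-counting estimates of the entropy rate. A rough sketch of an LZ78-style estimate (the normalization c log c / n is the standard asymptotic for c parsed phrases in n symbols; the two test strings below are arbitrary choices):

```python
import math
import random

def lz78_entropy_estimate(s):
    """Crude entropy-rate estimate (bits/symbol) from an LZ78-style incremental
    parse: with c distinct phrases in a string of length n, the code length is
    roughly c * log2(c), and c*log2(c)/n approaches the entropy rate for
    stationary ergodic sources."""
    phrases, current = set(), ""
    for ch in s:
        current += ch
        if current not in phrases:
            phrases.add(current)
            current = ""
    c = len(phrases) + (1 if current else 0)
    return c * math.log2(max(c, 2)) / len(s)

random.seed(1)
periodic = "ab" * 2000                                    # entropy rate 0
coin = "".join(random.choice("ab") for _ in range(4000))  # entropy rate 1 bit
est_periodic = lz78_entropy_estimate(periodic)
est_coin = lz78_entropy_estimate(coin)
print(est_periodic, est_coin)
```

At these modest lengths the estimates are far from converged, but the deterministic string already parses into far fewer phrases than the coin-flip string, as the theory predicts.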


IEEE Transactions on Information Theory | 1991

Information rates for a discrete-time Gaussian channel with intersymbol interference and stationary inputs

Shlomo Shamai; Lawrence H. Ozarow; Aaron D. Wyner

Bounds are presented on I_{i.i.d.}, the achievable information rate for a discrete Gaussian channel with intersymbol interference (ISI) present and i.i.d. channel input symbols governed by an arbitrary predetermined distribution p_X(x). Lower and upper bounds on I_{i.i.d.} and I are formulated. The bounds on I_{i.i.d.} are calculated for independent equiprobable binary channel symbols and for causal channels with ISI memory of degree one and two. The bounds on I_{i.i.d.} are compared to the known value of I_{i.i.d.} (approximated by Monte Carlo methods), and their tightness is considered. An application of the new lower bound on I_{i.i.d.} yields an improvement on previously reported lower bounds for the capacity of the continuous-time strictly bandlimited (or bandpass) Gaussian channel with either peak or simultaneously peak-power and bandlimiting constraints imposed on the channel's input waveform.


Information & Computation | 1972

An upper bound on the entropy series

Aaron D. Wyner

An upper bound is established for the entropy corresponding to a positive-integer-valued random variable X in terms of the expectation of certain functions of X. In particular, we show that the entropy is finite if E \log X < \infty.
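As a simple illustration of the finiteness condition (not the paper's bound itself), a geometric distribution on the positive integers has both quantities finite:

```python
import math

# Geometric X on {1, 2, ...} with P(X = n) = (1/2)^n.
# Since -log2 P(X = n) = n, the entropy is H(X) = sum n / 2^n = 2 bits,
# and E[log2 X] = sum (log2 n) / 2^n is finite as well.
H = sum((0.5**n) * n for n in range(1, 200))
E_logX = sum((0.5**n) * math.log2(n) for n in range(1, 200))
print(H, E_logX)
```

Truncating the sums at n = 200 loses only a negligible tail because the terms decay geometrically.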


IEEE Transactions on Information Theory | 1975

A conditional entropy bound for a pair of discrete random variables

Hans S. Witsenhausen; Aaron D. Wyner

Let X, Y be a pair of discrete random variables with a given joint probability distribution. For 0 \leq x \leq H(X) , the entropy of X , define the function F(x) as the infimum of H(Y\mid W) , the conditional entropy of Y given W , with respect to all discrete random variables W such that a) H(X\mid W) = x , and b) W and Y are conditionally independent given X . This paper concerns the function F , its properties, its calculation, and its applications to several problems in information theory.

Collaboration


Dive into Aaron D. Wyner's collaborations.

Top Co-Authors

Shlomo Shamai

Technion – Israel Institute of Technology

Jack K. Wolf

University of California

Abraham J. Wyner

University of Pennsylvania