
Publication


Featured research published by Nadav Shulman.


IEEE Transactions on Information Theory | 1999

Random coding techniques for nonrandom codes

Nadav Shulman; Meir Feder

This work provides techniques to apply the channel coding theorem, and the resulting error exponent, which were originally derived for totally random block-code ensembles, to ensembles of codes with less restrictive randomness demands. As an example, the random coding technique can even be applied to an ensemble that contains a single code. For a specific linear code, we get an upper bound on the error probability that equals Gallager's (1968) random coding bound, up to a factor determined by the maximum ratio between the weight distribution of the code and the expected random weight distribution.
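
The factor mentioned above can be made concrete with a small numeric sketch (our illustration, not taken from the paper): enumerate a small linear code, tally its weight distribution, and compare it against the expected weight distribution of a fully random code with the same parameters. The [7,4] Hamming code and its systematic generator matrix are standard textbook choices, used here only as an example.

```python
from itertools import product
from math import comb

# Systematic generator matrix of the [7,4] Hamming code (a standard
# textbook choice, not taken from the paper).
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
n, k = 7, 4

# Enumerate all 2^k codewords and tally the weight distribution A_w.
A = [0] * (n + 1)
for msg in product([0, 1], repeat=k):
    cw = [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]
    A[sum(cw)] += 1

# Expected weight distribution of a fully random code with 2^k - 1
# nonzero codewords: E[A_w] = (2^k - 1) * C(n, w) / 2^n.
max_ratio = max(
    A[w] / ((2**k - 1) * comb(n, w) / 2**n)
    for w in range(1, n + 1)
    if A[w] > 0
)

print(A)          # [1, 0, 0, 7, 7, 0, 0, 1]
print(max_ratio)  # ~8.53: the factor multiplying the random coding bound
```

Here the all-ones codeword (weight 7) dominates the ratio, since a random code of this size rarely contains a weight-7 word.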


IEEE Transactions on Information Theory | 2004

The uniform distribution as a universal prior

Nadav Shulman; Meir Feder

In this correspondence, we discuss the properties of the uniform prior as a universal prior, i.e., a prior that induces a mutual information that is simultaneously close to the capacity for all channels. We determine bounds on the amount of the mutual information loss in using the uniform prior instead of the capacity-achieving prior. Specifically, for the class of binary input channels with any output alphabet, we show that the Z-channel has the minimal mutual information with uniform prior, out of all channels with a given capacity. From this, we conclude that the degradation of the mutual information with respect to the capacity is at most 0.011 bit, and as was shown previously, at most 6%. A related result is that the capacity-achieving prior, for any channel, is not far from uniform. Some of these results are extended to channels with nonbinary input.
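
A quick numeric check of the degradation bound (our illustration, not from the correspondence; the crossover probability p = 0.5 is an arbitrary choice) for the Z-channel:

```python
import math

def h(ps):
    """Entropy in bits of a probability vector."""
    return -sum(x * math.log2(x) for x in ps if x > 0)

def mutual_info(q, p):
    """I(X;Y) for a Z-channel with P(X=1) = q: input 0 is received
    noiselessly, input 1 flips to 0 with probability p."""
    py1 = q * (1 - p)                             # P(Y = 1)
    return h([1 - py1, py1]) - q * h([p, 1 - p])  # H(Y) - H(Y|X)

p = 0.5  # crossover probability of the noisy input (arbitrary)
i_uniform = mutual_info(0.5, p)

# Capacity via a fine grid search over the input prior; a grid search
# can only under-estimate the true capacity.
capacity = max(mutual_info(q / 10000, p) for q in range(1, 10000))

degradation = capacity - i_uniform
print(degradation)  # just over 0.01 bit, below the 0.011-bit bound
```

For this channel the capacity-achieving prior puts probability 0.4 on input 1, illustrating the paper's related point that the optimal prior is never far from uniform.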


IEEE Information Theory Workshop | 2002

Source broadcasting with unknown amount of receiver side information

Meir Feder; Nadav Shulman

The Slepian-Wolf scheme for source coding with side information at the receiver assures that the sender can send the source X at a rate of only the conditional entropy H(X|Y) bits per source symbol, which is the minimal possible rate even if the sender knew the side information Y. However, the Slepian-Wolf result requires knowledge of this optimal rate. In this paper we consider a situation where the rate is not known, possibly because the source is broadcast to many heterogeneous receivers. The approach is based on recent results regarding sending common information over a broadcast channel.
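
The rate H(X|Y) can be illustrated with a toy joint distribution (our example, not from the paper): when Y is a lightly corrupted copy of X, the minimal Slepian-Wolf rate drops well below H(X).

```python
import math

def entropy(ps):
    """Entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

# Toy joint distribution: X is a fair bit, Y agrees with X 90% of the time.
p_joint = {
    (0, 0): 0.45, (0, 1): 0.05,
    (1, 0): 0.05, (1, 1): 0.45,
}

# Marginal of the side information Y.
p_y = {}
for (x, y), pr in p_joint.items():
    p_y[y] = p_y.get(y, 0.0) + pr

# Minimal Slepian-Wolf rate: H(X|Y) = H(X, Y) - H(Y).
rate = entropy(p_joint.values()) - entropy(p_y.values())
print(rate)  # ~0.469 bits/symbol, versus H(X) = 1 bit without side info
```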


Lecture Notes in Computer Science | 2001

The Multicast Bandwidth Advantage in Serving a Web Site

Yossi Azar; Meir Feder; Eyal Lubetzky; Doron Rajwan; Nadav Shulman

Delivering popular web pages to clients results in high bandwidth usage and high load on web servers. A method to overcome this problem is to send these pages, requested by many users, via multicast. In this paper, we provide an analytic criterion to determine which pages to multicast, and analyze the overall saving factor as compared with unicast delivery. The analysis is based on the well-known observation that page popularity follows a Zipf-like distribution. Interestingly, we obtain closed-form analytical expressions for the saving factor that show the multicast advantage as a function of the site hit rate, the allowed latency, and the Zipf parameter.
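
The trade-off can be sketched with a toy model (our simplification, not the paper's actual criterion; all parameters are arbitrary): requests for page i within one latency window T average R·T·p_i under a Zipf-like popularity p_i, and we multicast exactly the pages for which a single multicast transmission per window is cheaper than one unicast per request.

```python
# Toy parameters (arbitrary): N pages, Zipf exponent alpha, site hit
# rate R requests/sec, allowed latency T seconds.
N, alpha, R, T = 10000, 0.8, 100.0, 2.0

z = sum(i ** -alpha for i in range(1, N + 1))   # Zipf normalizer
p = [i ** -alpha / z for i in range(1, N + 1)]  # page popularities

# Expected transmissions per window: all-unicast versus multicasting
# every page whose expected request count R*T*p_i exceeds 1.
unicast_cost = R * T
hybrid_cost = sum(min(1.0, R * T * pi) for pi in p)

saving_factor = unicast_cost / hybrid_cost
print(saving_factor)  # > 1: multicasting the hottest pages saves bandwidth
```

With these parameters only the dozen or so most popular pages clear the threshold, yet they account for a disproportionate share of the traffic, which is the multicast advantage in miniature.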


IEEE Transactions on Information Theory | 2000

Improved error exponent for time-invariant and periodically time-variant convolutional codes

Nadav Shulman; Meir Feder

An improved upper bound on the error probability (first error event) of time-invariant convolutional codes, and the resulting error exponent, is derived. The improved error bound depends on both the delay of the code K and its width (the number of symbols that enter the delay line in parallel) b. Determining the error exponent of time-invariant convolutional codes is an open problem. While the previously known bounds on the error probability of time-invariant codes led to the block-coding exponent, we obtain a better error exponent (strictly better for b > 1). In the limit b → ∞, our error exponent equals the Yudkin-Viterbi (1967, 1971, 1965) exponent derived for time-variant convolutional codes. These results are also used to derive an improved error exponent for periodically time-variant codes.


IEEE International Symposium on Information Theory | 1995

A simple proof that time-invariant convolutional codes attain capacity

Nadav Shulman; Meir Feder

It is well known that time-varying convolutional codes can achieve the capacity of a discrete memoryless channel. The time-varying assumption is needed in the proof to assure pairwise independence between the codewords. We provide a relatively simple proof that time-invariant convolutional codes can indeed achieve the capacity without any restriction (albeit, the error exponent achieved by our proof may not be optimal).


Archive | 2013

Dynamically Allocating A Power Budget Over Multiple Domains Of A Processor

Avinash N. Ananthakrishnan; Efraim Rotem; Doron Rajwan; Eliezer Weissmann; Nadav Shulman


Archive | 2011

Estimating Temperature Of A Processor Core In A Low Power State

Avinash N. Ananthakrishnan; Efraim Rotem; Itai Feit; Tomer Ziv; Doron Rajwan; Nadav Shulman; Alon Naveh


IEEE International Symposium on Information Theory | 2000

Static broadcasting

Nadav Shulman; Meir Feder


Archive | 2011

Method, Apparatus, and System for Energy Efficiency and Energy Conservation Including Thread Consolidation

Eliezer Weissmann; Efraim Rotem; Avinash N. Ananthakrishnan; Alon Naveh; Hisham Abu Salah; Nadav Shulman
