Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Jacob Ziv is active.

Publications


Featured research published by Jacob Ziv.


IEEE Transactions on Information Theory | 1977

A universal algorithm for sequential data compression

Jacob Ziv; Abraham Lempel

A universal algorithm for sequential data compression is presented. Its performance is investigated with respect to a nonprobabilistic model of constrained sources. The compression ratio achieved by the proposed universal code uniformly approaches the lower bounds on the compression ratios attainable by block-to-variable codes and variable-to-block codes designed to match a completely specified source.
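
As a concrete illustration, here is a minimal, hypothetical LZ77-style coder in Python: it greedily finds the longest match for the upcoming text inside a bounded sliding window and emits (offset, length, next-symbol) triples. This is a sketch of the sliding-window idea the paper introduced, not the paper's exact construction or its analysis.

```python
def lz77_encode(data, window=4096):
    """Greedy LZ77-style parse: emit (offset, length, next_char) triples.

    A simplified illustration of sliding-window compression; the paper's
    bounds are stated for a more general class of coders, not this form.
    """
    i, out = 0, []
    while i < len(data):
        start = max(0, i - window)
        best_off, best_len = 0, 0
        # Search the window for the longest match with the lookahead buffer.
        for j in range(start, i):
            k = 0
            while i + k < len(data) - 1 and data[j + k] == data[i + k]:
                k += 1
            if k > best_len:
                best_off, best_len = i - j, k
        out.append((best_off, best_len, data[i + best_len]))
        i += best_len + 1
    return out


def lz77_decode(triples):
    """Invert the parse by copying matches out of the growing output."""
    out = []
    for off, length, ch in triples:
        for _ in range(length):
            out.append(out[-off])   # handles overlapping matches correctly
        out.append(ch)
    return "".join(out)


text = "abracadabra abracadabra"
assert lz77_decode(lz77_encode(text)) == text
```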


IEEE Transactions on Information Theory | 1978

Compression of individual sequences via variable-rate coding

Jacob Ziv; Abraham Lempel

Compressibility of individual sequences by the class of generalized finite-state information-lossless encoders is investigated. These encoders can operate in a variable-rate mode as well as a fixed-rate one, and they allow for any finite-state scheme of variable-length-to-variable-length coding. For every individual infinite sequence $x$ a quantity $\rho(x)$ is defined, called the compressibility of $x$, which is shown to be the asymptotically attainable lower bound on the compression ratio that can be achieved for $x$ by any finite-state encoder. This is demonstrated by means of a constructive coding theorem and its converse that, apart from their asymptotic significance, also provide useful performance criteria for finite and practical data-compression tasks. The proposed concept of compressibility is also shown to play a role analogous to that of entropy in classical information theory, where one deals with probabilistic ensembles of sequences rather than with individual sequences. While the definition of $\rho(x)$ allows a different machine for each different sequence to be compressed, the constructive coding theorem leads to a universal algorithm that is asymptotically optimal for all sequences.
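
The constructive theorem underlies the incremental parsing rule now known as LZ78. Below is a minimal, hypothetical Python sketch of that rule, in which each new phrase is a previously seen phrase extended by one symbol; the paper's finite-state machinery and optimality proof are far more general than this sketch.

```python
def lz78_encode(data):
    """LZ78 incremental parse: emit (prefix_index, new_symbol) pairs.

    Each phrase extends a previously seen phrase by one symbol; index 0
    denotes the empty phrase. A simplified sketch of the construction.
    """
    dictionary = {"": 0}               # phrase -> index
    out, phrase = [], ""
    for ch in data:
        if phrase + ch in dictionary:
            phrase += ch               # keep extending a known phrase
        else:
            out.append((dictionary[phrase], ch))
            dictionary[phrase + ch] = len(dictionary)
            phrase = ""
    if phrase:                         # flush a trailing known phrase
        out.append((dictionary[phrase[:-1]], phrase[-1]))
    return out


def lz78_decode(pairs):
    """Rebuild the phrase list and concatenate the decoded phrases."""
    phrases, out = [""], []
    for idx, ch in pairs:
        p = phrases[idx] + ch
        phrases.append(p)
        out.append(p)
    return "".join(out)


text = "she sells sea shells by the sea shore"
assert lz78_decode(lz78_encode(text)) == text
```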


IEEE Transactions on Information Theory | 1976

The rate-distortion function for source coding with side information at the decoder

Aaron D. Wyner; Jacob Ziv

Let $\{(X_k, Y_k)\}_{k=1}^{\infty}$ be a sequence of independent drawings of a pair of dependent random variables $X, Y$. Let us say that $X$ takes values in the finite set $\mathcal{X}$. It is desired to encode the sequence $\{X_k\}$ in blocks of length $n$ into a binary stream of rate $R$, which can in turn be decoded as a sequence $\{\hat{X}_k\}$, where $\hat{X}_k \in \hat{\mathcal{X}}$, the reproduction alphabet. The average distortion level is $(1/n)\sum_{k=1}^{n} E[D(X_k, \hat{X}_k)]$, where $D(x,\hat{x}) \geq 0$, $x \in \mathcal{X}$, $\hat{x} \in \hat{\mathcal{X}}$, is a preassigned distortion measure. The special assumption made here is that the decoder has access to the side information $\{Y_k\}$. In this paper we determine the quantity $R^{*}(d)$, defined as the infimum of rates $R$ such that (with $\varepsilon > 0$ arbitrarily small and with suitably large $n$) communication is possible in the above setting at an average distortion level (as defined above) not exceeding $d + \varepsilon$. The main result is that $R^{*}(d) = \inf [I(X;Z) - I(Y;Z)]$, where the infimum is with respect to all auxiliary random variables $Z$ (which take values in a finite set $\mathcal{Z}$) that satisfy: i) $Y, Z$ conditionally independent given $X$; ii) there exists a function $f: \mathcal{Y} \times \mathcal{Z} \rightarrow \hat{\mathcal{X}}$ such that $E[D(X, f(Y,Z))] \leq d$. Let $R_{X|Y}(d)$ be the rate-distortion function which results when the encoder as well as the decoder has access to the side information $\{Y_k\}$. In nearly all cases it is shown that when $d > 0$, then $R^{*}(d) > R_{X|Y}(d)$, so that knowledge of the side information at the encoder permits transmission of the $\{X_k\}$ at a given distortion level using a smaller transmission rate. This is in contrast to the situation treated by Slepian and Wolf [5] where, for arbitrarily accurate reproduction of $\{X_k\}$, i.e., $d = \varepsilon$ for any $\varepsilon > 0$, knowledge of the side information at the encoder does not allow a reduction of the transmission rate.
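
For readability, the main result can be set out as a display, restating the abstract's two constraints on the auxiliary variable $Z$:

```latex
R^{*}(d) \;=\; \inf_{Z}\,\bigl[\, I(X;Z) - I(Y;Z) \,\bigr]
\quad\text{over all } Z \text{ such that}\quad
\begin{cases}
\text{(i) } Y \text{ and } Z \text{ are conditionally independent given } X,\\[2pt]
\text{(ii) } \exists\, f:\mathcal{Y}\times\mathcal{Z}\to\hat{\mathcal{X}}
\text{ with } E\bigl[D\bigl(X, f(Y,Z)\bigr)\bigr] \le d .
\end{cases}
```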


IEEE Transactions on Information Theory | 1969

Some lower bounds on signal parameter estimation

Jacob Ziv; Moshe Zakai

New bounds are presented for the maximum accuracy with which parameters of signals imbedded in white noise can be estimated. The bounds are derived by comparing the estimation problem with related optimal detection problems. They are, with few exceptions, independent of the bias and include explicitly the dependence on the a priori interval. The new results are compared with previously known results.


IEEE Transactions on Information Theory | 1989

Some asymptotic properties of the entropy of a stationary ergodic data source with applications to data compression

Aaron D. Wyner; Jacob Ziv

Theorems concerning the entropy of a stationary ergodic information source are derived and used to obtain insight into the workings of certain data-compression coding schemes, in particular the Lempel-Ziv data compression algorithm.
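
One of the asymptotic properties in question ties entropy to recurrence times: for a stationary ergodic source, the first $n$ symbols typically recur after roughly $2^{nH}$ further symbols. The Python sketch below is a toy, hypothetical use of that connection as a crude entropy estimator; it is only indicative, since the theorem is asymptotic and single-block estimates are noisy.

```python
import math
import random


def recurrence_entropy(seq, n, start):
    """(1/n) * log2 of the waiting time until the n-block at `start` recurs.

    Toy illustration of the recurrence-time/entropy connection for
    stationary ergodic sources; returns None if the block never recurs.
    """
    block = seq[start:start + n]
    for r in range(1, len(seq) - start - n + 1):
        if seq[start + r:start + r + n] == block:
            return math.log2(r) / n
    return None


# For a biased i.i.d. bit source the estimates cluster around the binary
# entropy h(p) as n grows; convergence is slow, so this is only indicative.
random.seed(0)
p = 0.1
bits = "".join("1" if random.random() < p else "0" for _ in range(100000))
vals = [v for s in range(0, 5000, 500)
        if (v := recurrence_entropy(bits, 16, s)) is not None]
print(sum(vals) / len(vals))   # compare with h(0.1) ≈ 0.47 bits/symbol
```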


IEEE Transactions on Information Theory | 1986

Compression of two-dimensional data

Abraham Lempel; Jacob Ziv

Distortion-free compressibility of individual pictures, i.e., two-dimensional arrays of data, by finite-state encoders is investigated. For every individual infinite picture $I$, a quantity $\rho(I)$ is defined, called the compressibility of $I$, which is shown to be the asymptotically attainable lower bound on the compression ratio that can be achieved for $I$ by any finite-state information-lossless encoder. This is demonstrated by means of a constructive coding theorem and its converse that, apart from their asymptotic significance, might also provide useful criteria for finite and practical data-compression tasks. The proposed picture compressibility is also shown to possess the properties that one would expect and require of a suitably defined concept of two-dimensional entropy for arbitrary probabilistic ensembles of infinite pictures. While the definition of $\rho(I)$ allows the use of different machines for different pictures, the constructive coding theorem leads to a universal compression scheme that is asymptotically optimal for every picture. The results are readily extendable to data arrays of any finite dimension.
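
In outline, a universal two-dimensional scheme scans the picture into a one-dimensional sequence and applies a universal one-dimensional coder; the paper builds its scheme on a space-filling (Peano-Hilbert type) scan, which preserves two-dimensional locality. The hypothetical sketch below uses a plain row-by-row scan purely for brevity, reusing the lz78_encode sketch from above.

```python
def compress_2d(picture):
    """Scan a 2-D array into a string and parse it with the LZ78-style
    coder sketched earlier (lz78_encode). A row-by-row scan is used here
    only for brevity; a Peano-Hilbert scan keeps nearby pixels nearby in
    the scan order, which is what the optimality argument relies on.
    """
    flat = "".join("".join(str(pixel) for pixel in row) for row in picture)
    return lz78_encode(flat)


picture = [[0, 0, 1, 1],
           [0, 0, 1, 1],
           [1, 1, 0, 0],
           [1, 1, 0, 0]]
print(compress_2d(picture))
```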


IEEE Transactions on Information Theory | 1975

Improved Lower Bounds on Signal Parameter Estimation

D. Chazan; Moshe Zakai; Jacob Ziv

An improved technique for bounding the mean-square error of signal parameter estimates is presented. The resulting bounds are independent of the bias and stronger than previously known bounds.


IEEE Transactions on Information Theory | 1970

Transmission of noisy information to a noisy receiver with minimum distortion

Jack K. Wolf; Jacob Ziv

This paper is concerned with the transmission of information with a fidelity criterion where the source output may be distorted prior to encoding and, furthermore, where the output of the decoder may be distorted prior to its delivery to the final destination. The criterion for optimality is that the normalized average of the squared norm of the difference between the $T$-second undistorted source sample and the corresponding $T$-second sample delivered to the final destination be minimum. The optimal structure of the encoder and decoder is derived for any $T$.


IEEE Transactions on Information Theory | 1978

Coding theorems for individual sequences

Jacob Ziv

A quantity called the finite-state complexity is assigned to every infinite sequence of elements drawn from a finite set. This quantity characterizes the largest compression ratio that can be achieved in accurate transmission of the sequence by any finite-state encoder (and decoder). Coding theorems and converses are derived for an individual sequence without any probabilistic characterization, and universal data-compression algorithms are introduced that are asymptotically optimal for all sequences over a given alphabet. The finite-state complexity of a sequence plays a role similar to that of entropy in classical information theory (which deals with probabilistic ensembles of sequences rather than an individual sequence). For a probabilistic source, the expectation of the finite-state complexity of its sequences is equal to the source's entropy. The finite-state complexity is of particular interest when the source statistics are unspecified.
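
As a rough numerical stand-in for this quantity on a finite string, one can count the phrases $c(n)$ of the incremental parse and compare roughly $c(n)\log_2 c(n)$ bits for the parse against the $n\log_2\alpha$ bits of the raw representation, $\alpha$ being the alphabet size. The Python sketch below computes that proxy; the normalization is a back-of-the-envelope choice, not the paper's formal definition, which applies to infinite sequences.

```python
import math
import random


def lz78_phrase_count(seq):
    """Number of phrases in the LZ78 incremental parse of seq."""
    seen, phrase, count = {()}, (), 0
    for symbol in seq:
        phrase += (symbol,)
        if phrase not in seen:
            seen.add(phrase)
            count += 1
            phrase = ()
    return count + (1 if phrase else 0)


def compression_ratio_proxy(seq, alphabet_size):
    """Crude proxy for the attainable compression ratio: c log2 c bits for
    the parse versus n log2(alphabet_size) bits uncompressed. Illustrative
    only; finite-state complexity itself is an asymptotic quantity.
    """
    n, c = len(seq), lz78_phrase_count(seq)
    return (c * math.log2(max(c, 2))) / (n * math.log2(alphabet_size))


print(compression_ratio_proxy("ab" * 5000, 2))   # periodic: ratio near 0
random.seed(1)
noise = [random.choice("ab") for _ in range(10000)]
print(compression_ratio_proxy(noise, 2))         # random: much closer to 1
```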


IEEE Transactions on Information Theory | 1989

On the estimation of the order of a Markov chain and universal data compression

Neri Merhav; Michael Gutman; Jacob Ziv

The authors estimate the order of a finite Markov source based on empirically observed statistics. The performance criterion adopted is to minimize the probability of underestimating the model order while keeping the overestimation probability exponent at a prescribed level. A universal asymptotically optimal test, in the sense just defined, is proposed for the case where a given integer is known to be an upper bound on the true order. For the case where such a bound is unavailable, an alternative rule based on the Lempel-Ziv data compression algorithm is shown to be asymptotically optimal as well and computationally more efficient.
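
The flavor of such an order test can be conveyed with a simple penalized empirical-entropy rule: fit each candidate order, then pick the smallest order whose conditional entropy is within a small penalty of the best higher-order fit. The Python sketch below is a hypothetical illustration with an ad hoc penalty, not the authors' asymptotically optimal test.

```python
import math
import random
from collections import Counter


def empirical_cond_entropy(seq, k):
    """Empirical conditional entropy H(X_t | X_{t-k}..X_{t-1}) in bits."""
    ctx, joint = Counter(), Counter()
    for t in range(k, len(seq)):
        c = tuple(seq[t - k:t])
        ctx[c] += 1
        joint[c + (seq[t],)] += 1
    n = len(seq) - k
    return -sum(cnt / n * math.log2(cnt / ctx[key[:-1]])
                for key, cnt in joint.items())


def estimate_order(seq, max_order, penalty=None):
    """Smallest order whose fit is within `penalty` of the max_order fit.

    Hypothetical penalized rule for illustration; the penalty below is an
    ad hoc choice, not the paper's optimal threshold.
    """
    if penalty is None:
        penalty = math.log2(len(seq)) / math.sqrt(len(seq))
    h = [empirical_cond_entropy(seq, k) for k in range(max_order + 1)]
    for k in range(max_order + 1):
        if h[k] - h[max_order] <= penalty:
            return k
    return max_order


# Sticky binary chain of true order 1: stays in the same state w.p. 0.9.
random.seed(0)
seq, s = [], 0
for _ in range(20000):
    s = s if random.random() < 0.9 else 1 - s
    seq.append(s)
print(estimate_order(seq, max_order=4))   # expected output: 1
```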

Collaboration


Dive into Jacob Ziv's collaborations.

Top Co-Authors

Abraham Lempel, Technion – Israel Institute of Technology
Neri Merhav, Technion – Israel Institute of Technology
Moshe Zakai, Technion – Israel Institute of Technology
Marcelo J. Weinberger, Technion – Israel Institute of Technology
Shlomo Shamai, Technion – Israel Institute of Technology
Amos Lapidoth, Massachusetts Institute of Technology
Jack K. Wolf, University of California
Eli Plotnik, Technion – Israel Institute of Technology