Vyacheslav V. Prelov
Russian Academy of Sciences
Publications
Featured research published by Vyacheslav V. Prelov.
IEEE Transactions on Information Theory | 1995
Mark S. Pinsker; Vyacheslav V. Prelov; Sergio Verdú
In some channels subject to crosstalk or other types of additive interference, the noise is the sum of a dominant Gaussian noise and a relatively weak non-Gaussian contaminating noise. Although the capacity of such channels cannot be evaluated in general, the authors analyze the decrease in capacity, or sensitivity of the channel capacity to the weak contaminating noise. The main result is that for a very large class of contaminating noise processes, explicit expressions for the sensitivity of a discrete-time channel capacity do exist. Moreover, in those cases the sensitivity depends on the contaminating process distribution only through its autocorrelation function, and so it coincides with the sensitivity with respect to a Gaussian contaminating noise with the same autocorrelation function.
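As a numerical illustration, a minimal sketch for the memoryless Gaussian case only (the paper treats general stationary noise): the sensitivity can be estimated as the slope of the capacity decrease when a weak Gaussian noise of power eps is added.

# Minimal sketch, assuming a memoryless AWGN channel with capacity
# C(N) = 0.5*ln(1 + P/N) nats; not the paper's general setting.
import math

def capacity(P, N):
    # Capacity of a discrete-time AWGN channel, in nats per symbol.
    return 0.5 * math.log(1.0 + P / N)

P, N, eps = 1.0, 1.0, 1e-6
numerical = (capacity(P, N) - capacity(P, N + eps)) / eps
closed_form = P / (2.0 * N * (N + P))  # -dC/dN for added Gaussian noise
print(numerical, closed_form)          # both are approximately 0.25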
IEEE Transactions on Information Theory | 1987
K. de Bruyn; Vyacheslav V. Prelov; E.C. van der Meulen
Necessary and sufficient conditions are derived for the transmission of two arbitrarily correlated sources over a discrete memoryless asymmetric multiple-access channel. It is shown that in this situation the classical separation principle of Shannon (the factorization of the joint source-channel transmission problem into separate source and channel coding problems) applies. This asymmetric case is the first non-trivial situation of a multiple-access channel with arbitrarily correlated sources in which the sufficient conditions found for the reliable transmission of the sources over the channel turn out to be necessary as well. Furthermore, it is demonstrated that these necessary and sufficient conditions continue to hold if feedback is available to one or both of the encoders.
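For orientation, in the classical point-to-point setting the separation principle states that a source with entropy rate H can be reliably transmitted over a memoryless channel of capacity C if H < C and cannot be if H > C, so that source coding and channel coding may be designed separately without loss of optimality; the result above establishes the analogous factorization for the asymmetric multiple-access setting.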
Problems of Information Transmission | 2008
Vyacheslav V. Prelov; Edward C. van der Meulen
Some upper and lower bounds are obtained for the maximum of the absolute value of the difference |I(X; Y) − I(X′; Y′)| between the mutual informations of two pairs of discrete random variables (X, Y) and (X′, Y′) via the variational distance between the probability distributions of these pairs. In particular, the upper bound obtained here substantially generalizes and improves the upper bound of [1]. In some special cases, our upper and lower bounds coincide or are rather close. It is also proved that the lower bound is asymptotically tight in the case where the variational distance between (X, Y) and (X′, Y′) tends to zero.
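A small numerical sketch of the two quantities being compared (illustration only; the paper's bounds themselves are not reproduced here), using the convention that the variational distance is the sum of absolute differences:

# Sketch: mutual information difference vs. variational distance for
# two joint pmfs on a 2x2 alphabet (illustration; not the paper's bounds).
import numpy as np

def mutual_information(pxy):
    # I(X;Y) in bits for a joint pmf given as a 2-D array.
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())

p = np.array([[0.4, 0.1], [0.1, 0.4]])   # joint pmf of (X, Y)
q = np.array([[0.3, 0.2], [0.2, 0.3]])   # joint pmf of (X', Y')
delta_I = abs(mutual_information(p) - mutual_information(q))
v = float(np.abs(p - q).sum())           # variational distance
print(delta_I, v)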
Problems of Information Transmission | 2007
Vyacheslav V. Prelov
We continue studying the relationship between mutual information and variational distance begun in Pinsker's paper [1], where an upper bound for the mutual information via variational distance was obtained. We present a simple lower bound, which in some cases is optimal or asymptotically optimal. A uniform upper bound for the mutual information via variational distance is also derived for random variables with a finite number of values. For such random variables, the asymptotic behavior of the maximum of mutual information is also investigated in the cases where the variational distance tends either to zero or to its maximum value.
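As background (the classical relation, not the new bounds of the paper): since I(X; Y) = D(P_XY ‖ P_X × P_Y), Pinsker's classical inequality D(P ‖ Q) ≥ V²(P, Q)/2 (in nats, with V(P, Q) = Σ|P(x) − Q(x)| the variational distance) turns estimates between divergence and variational distance into estimates for mutual information, taking Q to be the product of the marginal distributions.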
Problems of Information Transmission | 2002
Ilya Dumer; Mark S. Pinsker; Vyacheslav V. Prelov
Asymptotic behavior of the ε-entropy of an ellipsoid in a Hamming space is investigated as the dimension of the space grows.
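For reference, in this line of work an ellipsoid in the Hamming space {0, 1}ⁿ is a set of the form E = {x : a₁x₁ + … + aₙxₙ ≤ r} with positive coefficients aᵢ (generic notation, not copied from the paper), and the ε-entropy of E is H_ε(E) = log₂ N_ε(E), where N_ε(E) is the minimum number of Hamming balls of radius ε whose union covers E.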
Problems of Information Transmission | 2009
Vyacheslav V. Prelov
We obtain some upper and lower bounds for the maximum of mutual information of several random variables via variational distance between the joint distribution of these random variables and the product of its marginal distributions. In this connection, some properties of variational distance between probability distributions of this type are derived. We show that in some special cases estimates of the maximum of mutual information obtained here are optimal or asymptotically optimal. Some results of this paper generalize the corresponding results of [1–3] to the multivariate case.
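A sketch of the multivariate quantities involved (illustration under generic notation; the paper's estimates are not reproduced):

# Multi-information I(X1;X2;X3) = D(P_joint || product of marginals),
# compared against the variational distance between those two pmfs.
import numpy as np

joint = np.array([0.20, 0.05, 0.05, 0.20,
                  0.05, 0.20, 0.20, 0.05]).reshape(2, 2, 2)
m0 = joint.sum(axis=(1, 2))
m1 = joint.sum(axis=(0, 2))
m2 = joint.sum(axis=(0, 1))
prod = np.einsum('i,j,k->ijk', m0, m1, m2)    # product of marginals
info = float((joint * np.log2(joint / prod)).sum())  # bits
vdist = float(np.abs(joint - prod).sum())            # variational distance
print(info, vdist)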
General Theory of Information Transfer and Combinatorics | 2006
Ilya Dumer; Mark S. Pinsker; Vyacheslav V. Prelov
In this paper, we present some new results on the thinnest coverings that can be obtained in Hamming or Euclidean spaces if spheres and ellipsoids are covered with balls of some radius ε. In particular, we tighten the bounds currently known for the ε-entropy of Hamming spheres of an arbitrary radius r. New bounds for the ε-entropy of Hamming balls are also derived. If both parameters ε and r are linear in dimension n, then the upper bounds exceed the lower ones by an additive term of order log n. We also present uniform bounds valid for all values of ε and r. In the second part of the paper, new sufficient conditions are obtained which allow one to verify the validity of the asymptotic formula for the size of an ellipsoid in a Hamming space. Finally, we survey recent results concerning coverings of ellipsoids in Hamming and Euclidean spaces.
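For intuition, a volume count already gives the leading term of such bounds: covering a Hamming ball of radius r = pn with balls of radius ε = qn (0 < q ≤ p ≤ 1/2) requires at least 2^{n(h(p) − h(q)) + o(n)} balls, where h is the binary entropy function; the results above sharpen this picture by pinning the gap between upper and lower bounds down to an additive term of order log n.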
International Symposium on Information Theory | 2003
Vyacheslav V. Prelov; E.C. van der Meulen
Nonlinear channels with non-Gaussian noise, where the transmitted signal is a random function of the input signal, are considered. Under some assumptions on the smoothness and the tail behavior of the noise density function, higher-order asymptotics of the mutual information between the input and output signals of such channels are obtained as the mean power of the input signal (or, equivalently, the signal-to-noise ratio) tends to zero.
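As an illustration of the flavor of such expansions in the simplest linear Gaussian case (the paper treats nonlinear channels with non-Gaussian noise): for an additive Gaussian noise channel with Gaussian input and signal-to-noise ratio snr → 0, one has I = (1/2) ln(1 + snr) = snr/2 − snr²/4 + O(snr³) nats, so the leading term is proportional to the input power and the higher-order terms refine it.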
Problems of Information Transmission | 2002
Vyacheslav V. Prelov; E.C. van der Meulen
The asymptotic behavior of the ε-entropy of ellipsoids in an n-dimensional Hamming space whose coefficients take only two different values is investigated as n → ∞. Explicit expressions for the main terms of the asymptotic expansion of ε-entropy of such ellipsoids are obtained under various relations between ε and parameters that define these ellipsoids.
International Symposium on Information Theory | 1995
Mark S. Pinsker; Vyacheslav V. Prelov; Sergio Verdú
In some applications, channel noise is the sum of a Gaussian noise and a relatively weak non-Gaussian contaminating noise. Although the capacity of such channels cannot be evaluated in general, we analyze the decrease in capacity, or sensitivity of the channel capacity to the weak contaminating noise. We show that for a very large class of contaminating noise processes, explicit expressions for the sensitivity of a discrete-time channel capacity do exist. Sensitivity is shown to depend on the contaminating process distribution only through its autocorrelation function, and so it coincides with the sensitivity with respect to a Gaussian contaminating noise with the same autocorrelation function. A key result is a formula for the derivative of the water-filling capacity with respect to the contaminating noise power. Parallel results are obtained for the sensitivity of the rate-distortion function of almost-Gaussian processes relative to a mean-square-error criterion.
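For reference, the water-filling capacity referred to here has, for parallel Gaussian channels with noise powers Nᵢ and total input power P, the standard form C = Σᵢ (1/2) log(θ/Nᵢ) taken over all i with Nᵢ < θ, where the water level θ is determined by Σᵢ max(θ − Nᵢ, 0) = P (standard background; the paper's derivative formula itself is not reproduced here).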