
Publication


Featured research published by Vladimir V. V'yugin.


Theoretical Computer Science | 1998

Ergodic theorems for individual random sequences

Vladimir V. V'yugin

In the framework of Kolmogorov's approach to the substantiation of probability theory and information theory on the basis of the theory of algorithms, we try to formulate probabilistic laws, i.e. statements of the form P{ω : A(ω)} = 1, where A(ω) is some formula, in the “pointwise” form “if ω is random, then A(ω) holds”. Nevertheless, not all proofs of such laws can be directly translated into this algorithmic form. In [11] two examples were singled out: Birkhoff's ergodic theorem [2] and the Shannon-McMillan-Breiman theorem of information theory [1]. In this paper an analysis of the algorithmic effectiveness of these theorems is given. We prove that Birkhoff's ergodic theorem is indeed, in some strong sense, “nonconstructive”. At the same time, the requirement to formulate probabilistic laws for algorithmically random sequences is not so restrictive: we present versions of these laws for individual random sequences.
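For orientation, the classical theorem discussed above can be stated as follows (standard notation, not quoted from the paper): for a measure-preserving transformation T of a probability space (Ω, P) and any f ∈ L¹(P), Birkhoff's ergodic theorem asserts that

\frac{1}{n}\sum_{i=0}^{n-1} f(T^i \omega) \;\to\; \hat{f}(\omega) \quad \text{for } P\text{-almost every } \omega,

where \hat{f} is the conditional expectation of f with respect to the σ-algebra of T-invariant sets (so \hat{f} = \int f\,dP when T is ergodic). The “pointwise” algorithmic reading considered in the paper replaces “for P-almost every ω” with “for every algorithmically (Martin-Löf) random ω”.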


The Computer Journal | 1999

Algorithmic Complexity and Stochastic Properties of Finite Binary Sequences

Vladimir V. V'yugin

This paper is a survey of concepts and results related to simple Kolmogorov complexity, prefix complexity, and resource-bounded complexity. We also consider a new type of complexity, statistical complexity, closely related to mathematical statistics. Unlike that of the other discoverers of algorithmic complexity, A. N. Kolmogorov's leading motive was to develop on its basis a mathematical theory that more adequately substantiates the applications of probability theory, mathematical statistics, and information theory. Kolmogorov wanted to deduce properties of a random object from its complexity characteristics, without using the notion of probability. In the first part of this paper we present several results in this direction. Though the subsequent development of algorithmic complexity and randomness took a different path, algorithmic complexity has successful applications in the traditional probabilistic framework. The second part of the paper is a survey of applications to parameter estimation and to the definition of Bernoulli sequences. All considerations have a finite, combinatorial character.
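For orientation, the central quantities of the survey can be written in standard notation (our formulation, not quoted from the paper):

C(x) = \min\{\, |p| : U(p) = x \,\}, \qquad K(x) = \min\{\, |p| : V(p) = x \,\},

where U is a fixed universal machine and V a universal prefix-free machine; for a binary string x of length n, the deficiency of randomness with respect to the uniform distribution is d(x \mid n) = n - C(x \mid n), and strings with small deficiency behave like typical outcomes of n fair coin tosses.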


Theoretical Computer Science | 1998

Non-stochastic infinite and finite sequences

Vladimir V. V'yugin

By combining the outcomes of coin-tossing with transducer algorithms, it is possible to generate, with probability close to 1, very pathological sequences for which computable probabilistic forecasting is impossible. These sequences are not random with respect to any reasonable probability distribution. A natural consequence of the definition of such sequences is that the set of all such sequences has measure 0 with respect to every simple measure. It was Kolmogorov's and Levin's idea to estimate the probability of generating such sequences by combinations of probabilistic and algorithmic processes [8, 14, 21]. We collect several results in this direction for infinite sequences, together with asymptotic results for finite sequences, including estimates of the space and time of losing randomness for time-bounded forecasting systems (a correction to [22]).
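The underlying notion here is Kolmogorov's stochasticity of finite objects; a common way to formalize it (our formulation, given only for orientation) is: a binary string x is (α, β)-stochastic if there is a finite set A ∋ x with C(A) ≤ α and C(x \mid A) ≥ \log_2 |A| - β, i.e., x looks like a typical element of some simply describable statistical model A. The sequences discussed above are non-stochastic in the sense that no adequate model of this kind exists for them.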


Problems of Information Transmission | 2005

Concentration Theorems for Entropy and Free Energy

Vladimir V. V'yugin; V. P. Maslov

Jaynes's entropy concentration theorem states that, for most words $\omega_1 \ldots \omega_N$ of length N such that $\sum_{i=1}^{N} f(\omega_i) \approx vN$, the empirical frequencies of the values of a function f are close to the probabilities that maximize the Shannon entropy given a value v of the mathematical expectation of f. Using the notion of algorithmic entropy, we define notions of entropy for the Bose and Fermi statistical models of unordered data. New variants of Jaynes's concentration theorem for these models are proved. We also present some concentration properties of the free energy in the case of a nonisolated isothermal system. Exact relations for the algorithmic entropy and free energy at extreme points are obtained. These relations are used to obtain tight bounds on fluctuations of energy levels at equilibrium points.
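The maximum-entropy distribution referred to above has a standard closed form (a routine derivation in our own notation, not taken from the paper): maximizing the Shannon entropy H(p) = -\sum_i p_i \ln p_i subject to \sum_i p_i f(x_i) = v and \sum_i p_i = 1 yields the Gibbs form

p_i = \frac{e^{-\lambda f(x_i)}}{Z(\lambda)}, \qquad Z(\lambda) = \sum_j e^{-\lambda f(x_j)},

with the multiplier λ chosen so that \sum_i p_i f(x_i) = v.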


Algorithmic Learning Theory | 2003

Transductive Confidence Machine Is Universal

Ilia Nouretdinov; Vladimir V. V'yugin; Alexander Gammerman

Vovk's Transductive Confidence Machine (TCM) is a practical prediction algorithm giving, in addition to its predictions, confidence information valid under the general i.i.d. assumption. The main result of this paper is that the prediction method used by TCM is universal under a natural definition of what “valid” means: any prediction algorithm providing valid confidence information can be replaced, without losing much of its predictive performance, by a TCM. The main tool of our analysis is the Kolmogorov theory of complexity and algorithmic randomness.
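A standard way to write the p-values underlying TCM (conformal prediction in our notation, for orientation only; details differ in the paper): given training examples z_1, …, z_n, a new object, and a postulated label y, the machine computes nonconformity scores α_1, …, α_{n+1} for the extended sequence and outputs

p(y) = \frac{\#\{\, i \le n+1 : \alpha_i \ge \alpha_{n+1} \,\}}{n+1};

at significance level ε it predicts the set of all labels y with p(y) > ε, which under the i.i.d. assumption errs with probability at most ε.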


Problems of Information Transmission | 2001

Nonrobustness Property of the Individual Ergodic Theorem

Vladimir V. V'yugin

The main laws of probability theory, when applied to individual sequences, have a “robustness” property under small violations of randomness. For example, the law of large numbers for the symmetric Bernoulli scheme holds for a sequence whose initial fragment of length n has randomness deficiency growing as o(n). The law of the iterated logarithm holds if the randomness deficiency grows as o(log log n). We prove that Birkhoff's individual ergodic theorem is nonrobust in this sense: if the randomness deficiency grows arbitrarily slowly on initial fragments of an infinite sequence, this theorem can be violated. An analogous nonrobustness property holds for the Shannon–McMillan–Breiman theorem.
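The robustness statement for the law of large numbers mentioned above can be written as follows (our notation, for orientation): if the prefixes of an infinite binary sequence ω satisfy

n - C(\omega_1 \ldots \omega_n \mid n) = o(n),

then \frac{1}{n}\sum_{i=1}^{n} \omega_i \to \frac{1}{2}. The theorem above shows that no slow-growth condition of this kind, however weak, rescues Birkhoff's ergodic theorem.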


Information & Computation | 2002

Suboptimal Measures of Predictive Complexity for Absolute Loss Function

Vladimir V. V'yugin


Problems of Information Transmission | 2003

Problems of Robustness for Universal Coding Schemes

Vladimir V. V'yugin



Problems of Information Transmission | 2003

Extremal Relations between Additive Loss Functions and the Kolmogorov Complexity

Vladimir V. V'yugin; V. P. Maslov



Information & Computation | 1996

Bayesianism: An Algorithmic Analysis

Vladimir V. V'yugin


Collaboration


Dive into Vladimir V. V'yugin's collaborations.

Top Co-Authors


Vladimir G. Trunov

Russian Academy of Sciences
