Publication


Featured research published by Michael V. Vyugin.


Algorithmic Learning Theory | 2001

Loss functions, complexities, and the Legendre transformation

Yuri Kalnishkan; Michael V. Vyugin; Volodya Vovk

The paper introduces a way of reconstructing a loss function from predictive complexity. We show that a loss function and expectations of the corresponding predictive complexity w.r.t. the Bernoulli distribution are related through the Legendre transformation. It is shown that if two loss functions specify the same complexity then they are equivalent in a strong sense. The expectations are also related to the so-called generalized entropy.
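
As a sketch of the geometry behind this result (standard notation for binary prediction games, not quoted from the paper): the generalised entropy of a loss function λ under the Bernoulli distribution with parameter p is the optimal expected loss,

\[
  H(p) \;=\; \inf_{\gamma \in [0,1]} \bigl[ (1-p)\,\lambda(0,\gamma) + p\,\lambda(1,\gamma) \bigr].
\]

H is the pointwise infimum of affine functions of p and hence concave; conversely, the tangent to H at p takes the values λ(0, γ_p) at p = 0 and λ(1, γ_p) at p = 1, so the loss function can be read back off H by an envelope (Legendre-type) transformation, which is the kind of reconstruction the abstract describes.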


Conference on Learning Theory | 2002

Mixability and the Existence of Weak Complexities

Yuri Kalnishkan; Michael V. Vyugin

This paper investigates the behaviour of the constant c(β) from the Aggregating Algorithm. Some conditions for mixability are derived, and it is shown that for many non-mixable games c(β) still converges to 1. The condition c(β) → 1 is shown to imply the existence of weak predictive complexity, and it is proved that many games specify complexity up to √n.
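
The Aggregating Algorithm referred to here is Vovk's. As a minimal sketch, the following Python fragment runs it for the square-loss game on [0, 1] with learning rate η = 2 (a mixable case, so c(β) = 1 with β = e^(−η)); the substitution function and the loss are standard choices for that game, not details taken from this paper.

    import math

    def aggregating_algorithm(expert_preds, outcomes, eta=2.0):
        """Vovk's Aggregating Algorithm for the square-loss game on [0, 1].

        expert_preds[t][i] is expert i's prediction at trial t;
        outcomes[t] is 0 or 1. Returns the learner's cumulative square loss.
        """
        n_experts = len(expert_preds[0])
        log_w = [0.0] * n_experts  # log-weights; uniform initial weights
        total_loss = 0.0
        for preds, omega in zip(expert_preds, outcomes):
            # Generalised prediction g(o) = -(1/eta) ln sum_i w_i exp(-eta (o - p_i)^2).
            # Only the difference g(0) - g(1) matters below, so the weights
            # need not be normalised (normalisation shifts g(0) and g(1) equally).
            def g(o):
                exps = [log_w[i] - eta * (o - preds[i]) ** 2 for i in range(n_experts)]
                m = max(exps)
                return -(m + math.log(sum(math.exp(e - m) for e in exps))) / eta
            # Substitution function for the square-loss game, clipped to [0, 1]:
            gamma = min(1.0, max(0.0, 0.5 + (g(0) - g(1)) / 2.0))
            total_loss += (gamma - omega) ** 2
            # Weight update w_i <- w_i * beta^{loss_i} with beta = exp(-eta):
            for i in range(n_experts):
                log_w[i] -= eta * (omega - preds[i]) ** 2
        return total_loss

For a mixable game such as this one, the learner's cumulative loss exceeds the best expert's by at most (ln N)/η; the paper's question is what remains of this regime for non-mixable games, where c(β) > 1 for every β.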


Conference on Learning Theory | 2007

Generalised entropy and asymptotic complexities of languages

Yuri Kalnishkan; Vladimir Vovk; Michael V. Vyugin

In this paper the concept of asymptotic complexity of languages is introduced. This concept formalises the notion of learnability in a particular environment and generalises Lutz and Fortnow's concepts of predictability and dimension. Asymptotic complexities in different prediction environments are then compared by describing the set of all pairs of asymptotic complexities w.r.t. different environments. A geometric characterisation in terms of generalised entropies is obtained, and thus the results of Lutz and Fortnow are generalised.


Algorithmic Learning Theory | 2004

A Criterion for the Existence of Predictive Complexity for Binary Games

Yuri Kalnishkan; Vladimir Vovk; Michael V. Vyugin

It is well known that there exists a universal (i.e., optimal to within an additive constant if allowed to work infinitely long) algorithm for lossless data compression (Kolmogorov, Levin). The game of lossless compression is an example of an on-line prediction game; for some other on-line prediction games (such as the simple prediction game) a universal algorithm is known not to exist. In this paper we give an analytic characterisation of those binary on-line prediction games for which a universal prediction algorithm exists.
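
For orientation, the standard definitions behind this characterisation (paraphrased, not quoted from the paper): a game with loss function λ determines a superprediction set

\[
  S \;=\; \{\, (x, y) \in [0, \infty]^2 \;:\; \exists \gamma \;\; x \ge \lambda(0, \gamma) \text{ and } y \ge \lambda(1, \gamma) \,\},
\]

a superloss process is (roughly) an upper-semicomputable function K on binary strings with K(Λ) = 0 and (K(x0) − K(x), K(x1) − K(x)) ∈ S for every x, and K is a predictive complexity if every superloss process L satisfies K(x) ≤ L(x) + C for some constant C and all strings x.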


Algorithmic Learning Theory | 2002

On the Absence of Predictive Complexity for Some Games

Yuri Kalnishkan; Michael V. Vyugin

This paper shows that if the curvature of the boundary of the set of superpredictions for a game vanishes in a nontrivial way, then there is no predictive complexity for the game. This is the first result concerning the absence of complexity for games with convex sets of superpredictions. The proof is further employed to show that certain variants of weak predictive complexity do not exist for some games. In the case of the absolute-loss game we reach a tight demarcation between the existing and non-existing variants of weak predictive complexity.
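
Two textbook games illustrate the contrast (standard examples, not taken from the paper's text):

\[
  \text{square loss: } \lambda(0,\gamma) = \gamma^2,\ \lambda(1,\gamma) = (1-\gamma)^2; \qquad
  \text{absolute loss: } \lambda(0,\gamma) = \gamma,\ \lambda(1,\gamma) = 1 - \gamma.
\]

For square loss the boundary of the superprediction set is the strictly curved arc {(γ², (1−γ)²) : γ ∈ [0, 1]} and predictive complexity exists (the game is mixable); for absolute loss the boundary is the straight segment x + y = 1, its curvature vanishes identically, and, as noted in the 2004 entry above, no universal prediction algorithm exists for the simple prediction game.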


Information & Computation | 2014

Generalised entropies and asymptotic complexities of languages

Yuri Kalnishkan; Michael V. Vyugin; Vladimir Vovk

The paper explores connections between asymptotic complexity and generalised entropy. Asymptotic complexity of a language (a language is a set of finite or infinite strings) is a way of formalising the complexity of predicting the next element in a sequence: it is the loss per element of a strategy asymptotically optimal for that language. Generalised entropy extends Shannon entropy to arbitrary loss functions; it is the optimal expected loss given a distribution on possible outcomes. It turns out that the set of tuples of asymptotic complexities of a language w.r.t. different loss functions can be described by means of the generalised entropies corresponding to the loss functions.
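
In symbols (notation mine, as a sketch of the abstract's wording): generalised entropy is the optimal expected loss under a distribution p on outcomes,

\[
  H_\lambda(p) \;=\; \inf_{\gamma}\ \mathbb{E}_{\omega \sim p}\, \lambda(\omega, \gamma),
\]

while the asymptotic complexity of a language w.r.t. λ is, informally, the best achievable loss per element along the strings of the language. The main result then describes, for loss functions λ₁, …, λ_k, the set of attainable tuples of asymptotic complexities in terms of the generalised entropies H_{λ₁}, …, H_{λ_k}.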


European Conference on Computational Learning Theory | 2001

Pattern Recognition and Density Estimation under the General i.i.d. Assumption

Ilia Nouretdinov; Volodya Vovk; Michael V. Vyugin; Alexander Gammerman


Lecture Notes in Computer Science | 2005

The weak aggregating algorithm and weak mixability

Yuri Kalnishkan; Michael V. Vyugin


Algorithmic Learning Theory | 2001

Non-linear Inequalities between Predictive and Kolmogorov Complexities

Michael V. Vyugin; Vladimir V. V'yugin


Electronic Colloquium on Computational Complexity | 2004

Non-reducible descriptions for conditional Kolmogorov complexity

Andrei A. Muchnik; Alexander Shen; Nikolai K. Vereshchagin; Michael V. Vyugin
