Volodya Vovk
Royal Holloway, University of London
Publications
Featured research published by Volodya Vovk.
european conference on machine learning | 2002
Harris Papadopoulos; Kostas Proedrou; Volodya Vovk; Alexander Gammerman
The existing methods of predicting with confidence give good accuracy and confidence values, but quite often are computationally inefficient. Some partial solutions have been suggested in the past. Both the original method and these solutions were based on transductive inference. In this paper we make a radical step of replacing transductive inference with inductive inference and define what we call the Inductive Confidence Machine (ICM); our main concern in this paper is the use of ICM in regression problems. The algorithm proposed in this paper is based on the Ridge Regression procedure (which is usually used for outputting bare predictions) and is much faster than the existing transductive techniques. The inductive approach described in this paper may be the only option available when dealing with large data sets.
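As an illustration of the general split-and-calibrate idea behind inductive conformal regression, here is a minimal sketch in Python assuming a ridge regressor, absolute residuals as the nonconformity measure, and an illustrative helper name icm_ridge_intervals; it is a sketch of the approach, not the paper's exact procedure:

# Minimal sketch of inductive (split) conformal regression with ridge regression.
# The data split, nonconformity measure and function name are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Ridge

def icm_ridge_intervals(X_train, y_train, X_test, significance=0.1, cal_fraction=0.3):
    """Return point predictions and prediction intervals at the given significance level."""
    n = len(y_train)
    n_cal = int(n * cal_fraction)
    # Split the available data into a proper training set and a calibration set.
    X_proper, y_proper = X_train[:-n_cal], y_train[:-n_cal]
    X_cal, y_cal = X_train[-n_cal:], y_train[-n_cal:]

    model = Ridge(alpha=1.0).fit(X_proper, y_proper)

    # Nonconformity scores: absolute residuals on the calibration set.
    scores = np.abs(y_cal - model.predict(X_cal))

    # Smallest calibration score that bounds at least (1 - significance) of the
    # scores, with the usual +1 finite-sample correction.
    k = int(np.ceil((1 - significance) * (n_cal + 1)))
    k = min(k, n_cal)
    q = np.sort(scores)[k - 1]

    y_hat = model.predict(X_test)
    return y_hat, np.column_stack([y_hat - q, y_hat + q])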
Theoretical Computer Science | 2002
Alexander Gammerman; Volodya Vovk
This paper reviews some theoretical and experimental developments in building computable approximations of Kolmogorov's algorithmic notion of randomness. Based on these approximations, a new set of machine learning algorithms has been developed that can be used not only to make predictions but also to estimate their confidence under the usual i.i.d. assumption.
international conference on tools with artificial intelligence | 2007
Harris Papadopoulos; Volodya Vovk; Alexander Gammerman
Conformal prediction (CP) is a method that can be used to complement the bare predictions produced by any traditional machine learning algorithm with measures of confidence. CP gives good accuracy and confidence values, but unfortunately it is quite computationally inefficient. This computational inefficiency becomes especially severe when CP is coupled with a method that requires long training times, such as neural networks. In this paper we use a modification of the original CP method, called inductive conformal prediction (ICP), which allows us to build a neural network confidence predictor without the massive computational overhead of CP. The method we propose accompanies its predictions with confidence measures that are useful in practice, while still preserving the computational efficiency of its underlying neural network.
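A minimal sketch of an inductive conformal predictor on top of a neural network classifier, in the spirit of the ICP method summarised above; the nonconformity measure (one minus the probability assigned to a label) and the use of scikit-learn's MLPClassifier are illustrative assumptions, not the paper's exact setup:

# Minimal sketch of classification ICP with a neural-network underlying model.
import numpy as np
from sklearn.neural_network import MLPClassifier

def icp_predict(X_proper, y_proper, X_cal, y_cal, x_test):
    net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500).fit(X_proper, y_proper)

    # Nonconformity score: 1 - probability assigned to the example's label.
    cal_proba = net.predict_proba(X_cal)
    label_index = {c: i for i, c in enumerate(net.classes_)}
    cal_scores = 1.0 - cal_proba[np.arange(len(y_cal)), [label_index[y] for y in y_cal]]

    test_proba = net.predict_proba(x_test.reshape(1, -1))[0]
    p_values = {}
    for c, i in label_index.items():
        score = 1.0 - test_proba[i]
        # p-value: fraction of calibration scores at least as large as the test score.
        p_values[c] = (np.sum(cal_scores >= score) + 1) / (len(cal_scores) + 1)

    ranked = sorted(p_values.items(), key=lambda kv: kv[1], reverse=True)
    prediction, credibility = ranked[0]
    confidence = 1.0 - ranked[1][1]  # one minus the second-largest p-value
    return prediction, confidence, credibility, p_values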
algorithmic learning theory | 2000
Craig Saunders; Alexander Gammerman; Volodya Vovk
In this paper we propose a new algorithm for providing confidence and credibility values for predictions on a multi-class pattern recognition problem, using Support Vector Machines in its implementation. Previously proposed algorithms for this task are very computationally intensive and are only practical for small data sets. We present here a method which overcomes these limitations and can deal with larger data sets (such as the US Postal Service database). The measures of confidence and credibility given by the algorithm are shown empirically to reflect the quality of the predictions obtained by the algorithm, and are comparable to those given by the less computationally efficient method. In addition, the overall performance of the algorithm is shown to be comparable to that of other techniques (such as standard Support Vector Machines), which simply give flat predictions and do not provide the extra confidence/credibility measures.
artificial intelligence applications and innovations | 2009
Harris Papadopoulos; Alexander Gammerman; Volodya Vovk
Most current machine learning systems for medical decision support do not produce any indication of how reliable each of their predictions is. However, an indication of this kind is highly desirable especially in the medical field. This paper deals with this problem by applying a recently developed technique for assigning confidence measures to predictions, called conformal prediction, to the problem of acute abdominal pain diagnosis. The data used consist of a large number of hospital records of patients who suffered acute abdominal pain. Each record is described by 33 symptoms and is assigned to one of nine diagnostic groups. The proposed method is based on Neural Networks and for each patient it can produce either the most likely diagnosis together with an associated confidence measure, or the set of all possible diagnoses needed to satisfy a given level of confidence.
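The two output modes described above can be read off directly from conformal p-values (for instance, the p_values dictionary produced in the ICP sketch earlier). The following small sketch is illustrative; the function name and interface are assumptions, not the paper's implementation:

# Minimal sketch of the two output modes: a single diagnosis with a confidence
# measure, or the set of diagnoses needed to satisfy a given confidence level.
def diagnose(p_values, confidence_level=None):
    ranked = sorted(p_values.items(), key=lambda kv: kv[1], reverse=True)
    if confidence_level is None:
        # Most likely diagnosis together with its associated confidence.
        best, second = ranked[0], ranked[1]
        return best[0], 1.0 - second[1]
    # All diagnoses that cannot be excluded at the requested confidence level.
    significance = 1.0 - confidence_level
    return [label for label, p in ranked if p > significance]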
artificial intelligence applications and innovations | 2012
Ilia Nouretdinov; Dmitry Devetyarov; Brian Burford; Stephane Camuzeaux; Aleksandra Gentry-Maharaj; Ali Tiss; Celia Smith; Zhiyuan Luo; Alexey Ya. Chervonenkis; Rachel Hallett; Volodya Vovk; M D Waterfield; Rainer Cramer; John F. Timms; Ian Jacobs; Usha Menon; Alexander Gammerman
This paper describes a methodology for providing multiprobability predictions for proteomic mass spectrometry data. The methodology is based on a newly developed machine learning framework called Venn machines, which allow us to output a valid probability interval. We apply this methodology to mass spectrometry data sets in order to predict the diagnosis of heart disease and the early diagnosis of ovarian cancer. The experiments show that the probability intervals are valid and narrow. In addition, the probability intervals were compared with the output of a corresponding probability predictor.
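A minimal sketch of how a Venn predictor produces a probability interval for binary (0/1) classification; the taxonomy (grouping calibration examples by the underlying classifier's predicted label) and the logistic-regression base model are illustrative assumptions, not the framework's only or exact choices:

# Minimal sketch of an inductive Venn predictor for binary labels 0/1.
import numpy as np
from sklearn.linear_model import LogisticRegression

def venn_interval(X_proper, y_proper, X_cal, y_cal, x_test):
    y_cal = np.asarray(y_cal)
    base = LogisticRegression(max_iter=1000).fit(X_proper, y_proper)

    # Taxonomy: each example's category is the base classifier's predicted label.
    cal_categories = base.predict(X_cal)
    test_category = base.predict(x_test.reshape(1, -1))[0]
    in_category = cal_categories == test_category

    probs_for_label_one = []
    for hypothetical in (0, 1):
        # Place the test object in its category with each hypothetical label and
        # read off the empirical frequency of label 1 within that category.
        labels = np.append(y_cal[in_category], hypothetical)
        probs_for_label_one.append(np.mean(labels == 1))

    # The multiprobability prediction is reported as a lower/upper interval.
    return min(probs_for_label_one), max(probs_for_label_one)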
algorithmic learning theory | 2001
Yuri Kalnishkan; Michael V. Vyugin; Volodya Vovk
The paper introduces a way of reconstructing a loss function from predictive complexity. We show that a loss function and the expectations of the corresponding predictive complexity with respect to the Bernoulli distribution are related through the Legendre transformation. It is shown that if two loss functions specify the same complexity then they are equivalent in a strong sense. The expectations are also related to the so-called generalized entropy.
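For reference, the standard definitions involved in this relation, written in LaTeX: the generalized entropy of a binary loss function \lambda and the Legendre transform. These are textbook definitions, not a reproduction of the paper's precise statement:

H(p) = \inf_{\gamma \in \Gamma} \bigl[\, p\,\lambda(1,\gamma) + (1-p)\,\lambda(0,\gamma) \,\bigr], \qquad p \in [0,1]

f^{*}(s) = \sup_{x} \bigl( s x - f(x) \bigr)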
international conference on machine learning | 1998
Craig Saunders; Alexander Gammerman; Volodya Vovk
uncertainty in artificial intelligence | 1998
Alexander Gammerman; Volodya Vovk; Vladimir Vapnik
international joint conference on artificial intelligence | 1999
Craig Saunders; Alexander Gammerman; Volodya Vovk