Manfred Opper
Aston University
Publications
Featured research published by Manfred Opper.
Neural Computation | 2002
Lehel Csató; Manfred Opper
We develop an approach for sparse representations of Gaussian process (GP) models (which are Bayesian types of kernel machines) in order to overcome their limitations for large data sets. The method is based on a combination of a Bayesian on-line algorithm together with a sequential construction of a relevant subsample of the data that fully specifies the prediction of the GP model. By using an appealing parameterization and projection techniques in a reproducing kernel Hilbert space, recursions for the effective parameters and a sparse Gaussian approximation of the posterior process are obtained. This allows for both a propagation of predictions and Bayesian error measures. The significance and robustness of our approach are demonstrated on a variety of experiments.
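A minimal sketch of the sparse-GP idea, assuming a fixed set of inducing inputs (the "relevant subsample") and the standard subset-of-regressors predictor rather than the authors' online projection recursions; the kernel, noise level, and data below are illustrative:

```python
import numpy as np

def rbf(X, Y, ell=0.5):
    """Squared-exponential kernel matrix between row-vector inputs."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def sparse_gp_mean(X, y, Z, X_star, noise=0.1):
    """Subset-of-regressors predictive mean from inducing inputs Z."""
    Kzz = rbf(Z, Z)
    Kzx = rbf(Z, X)
    A = Kzx @ Kzx.T + noise**2 * Kzz + 1e-8 * np.eye(len(Z))
    w = np.linalg.solve(A, Kzx @ y)   # effective weights, one per inducing point
    return rbf(X_star, Z) @ w

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (200, 1))
y = np.sin(6 * X[:, 0]) + 0.1 * rng.standard_normal(200)
Z = X[:10]                            # a small "relevant subsample"
X_star = np.linspace(0, 1, 5)[:, None]
print(sparse_gp_mean(X, y, Z, X_star))
```

Prediction cost then scales with the size of the subsample rather than the full data set, which is the point of the sparse representation.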
IEEE Transactions on Neural Networks | 2002
Robert D. Stewart; Iris Fermin; Manfred Opper
The seeded region growing (SRG) algorithm is a fast robust parameter-free method for segmenting intensity images given initial seed locations for each region. The requirement of predetermined seeds means that the model cannot operate fully autonomously. In this paper, we demonstrate a novel region growing variant of the pulse-coupled neural network (PCNN), which offers comparable performance to the SRG and is able to generate seed locations internally, opening the way to fully autonomous operation.
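For contrast with the PCNN variant, here is a minimal sketch of classical SRG, assuming a 2D grayscale image, 4-connectivity, and a priority queue ordered by distance to the running region mean:

```python
import heapq
import numpy as np

def seeded_region_growing(img, seeds):
    """Classical SRG with 4-connectivity: grow each seed's region,
    always absorbing the boundary pixel closest to its region mean."""
    labels = np.zeros(img.shape, dtype=int)          # 0 = unlabeled
    sums, counts, heap = {}, {}, []

    def push_neighbors(r, c, k):
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < img.shape[0] and 0 <= cc < img.shape[1]:
                if labels[rr, cc] == 0:
                    delta = abs(float(img[rr, cc]) - sums[k] / counts[k])
                    heapq.heappush(heap, (delta, rr, cc, k))

    for k, (r, c) in enumerate(seeds, start=1):
        labels[r, c] = k
        sums[k], counts[k] = float(img[r, c]), 1
    for k, (r, c) in enumerate(seeds, start=1):
        push_neighbors(r, c, k)
    while heap:
        _, r, c, k = heapq.heappop(heap)
        if labels[r, c]:
            continue                                 # already claimed
        labels[r, c] = k
        sums[k] += float(img[r, c])
        counts[k] += 1
        push_neighbors(r, c, k)
    return labels

img = np.array([[10, 11, 50], [12, 52, 51], [13, 53, 55]], dtype=float)
print(seeded_region_growing(img, [(0, 0), (2, 2)]))
```

The seed coordinates are the external input the PCNN variant removes by generating seed locations internally.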
Physical Review Letters | 1999
Rainer Dietrich; Manfred Opper; Haim Sompolinsky
Using methods of statistical physics, we investigate the generalization performance of support vector machines (SVMs), which have recently been introduced as a general alternative to neural networks. For nonlinear classification rules, the generalization error saturates on a plateau when the number of examples is too small to properly estimate the coefficients of the nonlinear part. When trained on simple rules, we find that SVMs overfit only weakly. The performance of SVMs is strongly enhanced when the distribution of the inputs has a gap in feature space.
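The quantity studied here, the average generalization error as a function of the number of examples, can also be measured empirically. A sketch using scikit-learn (an assumption, not part of the paper) on a synthetic linear teacher rule:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
d = 20
teacher = rng.standard_normal(d)          # a simple linear "teacher" rule

def draw(n):
    while True:
        X = rng.standard_normal((n, d))
        y = np.sign(X @ teacher)
        if len(np.unique(y)) == 2:        # SVC requires both classes
            return X, y

X_test, y_test = draw(2000)
for n in (10, 40, 160, 640):              # training-set sizes
    errs = []
    for _ in range(10):                   # average over random data sets
        X, y = draw(n)
        clf = SVC(kernel="rbf", gamma=1.0 / d).fit(X, y)
        errs.append(np.mean(clf.predict(X_test) != y_test))
    print(n, np.mean(errs))
```

On such a simple rule the error keeps decreasing with n, consistent with the paper's finding that SVMs overfit only weakly.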
Journal of Computer and System Sciences | 2002
Yoav Freund; Manfred Opper
We combine the results of [13] and [8] and derive a continuous variant of a large class of drifting games. Our analysis furthers the understanding of the relationship between boosting, drifting games, and Brownian motion and yields a differential equation that describes the core of the problem.
Physical Review Letters | 2002
Dörthe Malzahn; Manfred Opper
Using a variational technique, we generalize the statistical physics approach of learning from random examples to make it applicable to real data. We demonstrate the validity and relevance of our method by computing approximate estimators for generalization errors that are based on training data alone.
International Conference on Artificial Neural Networks | 2001
Dörthe Malzahn; Manfred Opper
Based on a statistical mechanics approach, we develop a method for approximately computing average case learning curves and their sample fluctuations for Gaussian process regression models. We give examples for the Wiener process and show that universal relations (that are independent of the input distribution) between error measures can be derived.
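The average learning curve for the Wiener-process example can be approximated by brute-force Monte Carlo averaging over random data sets, in contrast to the paper's analytic statistical-mechanics computation. A sketch assuming the Wiener covariance k(s, t) = min(s, t) and an illustrative noise level:

```python
import numpy as np

def k_wiener(s, t):
    """Wiener-process covariance k(s, t) = min(s, t)."""
    return np.minimum(s[:, None], t[None, :])

def avg_error(n, trials=200, noise=0.1, seed=1):
    """Average test error of the GP posterior mean over random data sets."""
    rng = np.random.default_rng(seed)
    t_star = np.linspace(0.01, 1, 50)     # test inputs
    errs = []
    for _ in range(trials):
        t = rng.uniform(0.01, 1, n)       # random training inputs
        pts = np.concatenate([t, t_star])
        K = k_wiener(pts, pts) + 1e-9 * np.eye(len(pts))
        # draw a target function from the Wiener prior at all points
        f = np.linalg.cholesky(K) @ rng.standard_normal(len(pts))
        y = f[:n] + noise * rng.standard_normal(n)
        Knn = k_wiener(t, t) + noise**2 * np.eye(n)
        mean = k_wiener(t_star, t) @ np.linalg.solve(Knn, y)  # posterior mean
        errs.append(np.mean((mean - f[n:]) ** 2))
    return np.mean(errs)

for n in (2, 8, 32):
    print(n, avg_error(n))
```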
Complexity | 2003
Dörthe Malzahn; Manfred Opper
We use the replica method of statistical physics to study the average case performance of learning systems. The new feature of our theory is that general distributions of data can be treated, which enables applications to real data. For a class of Bayesian prediction models which are based on Gaussian processes, we discuss Bootstrap estimates for learning curves.
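A minimal sketch of the bootstrap idea for learning curves: resample one fixed data set with replacement, refit the GP, and score the prediction on the out-of-bag points. The RBF kernel and data below are illustrative, not from the paper:

```python
import numpy as np

def rbf(s, t, ell=0.2):
    """Squared-exponential kernel matrix for scalar inputs."""
    return np.exp(-0.5 * (s[:, None] - t[None, :]) ** 2 / ell**2)

def gp_mean(t_train, y_train, t_test, noise=0.1):
    K = rbf(t_train, t_train) + noise**2 * np.eye(len(t_train))
    return rbf(t_test, t_train) @ np.linalg.solve(K, y_train)

rng = np.random.default_rng(2)
t = rng.uniform(0, 1, 100)                # one fixed data set
y = np.sin(6 * t) + 0.1 * rng.standard_normal(100)

for n in (10, 30, 90):                    # nominal sample sizes
    errs = []
    for _ in range(200):                  # bootstrap resamples
        idx = rng.integers(0, len(t), n)            # draw with replacement
        oob = np.setdiff1d(np.arange(len(t)), idx)  # out-of-bag points
        pred = gp_mean(t[idx], y[idx], t[oob])
        errs.append(np.mean((pred - y[oob]) ** 2))
    print(n, np.mean(errs))
```

The appeal, as the abstract notes, is that such estimates need only the training data, with no assumptions on the input distribution.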
International Conference on Artificial Neural Networks | 2001
Lehel Csató; Dan Cornford; Manfred Opper
We study online approximations to Gaussian process models for spatially distributed systems. We apply our method to the prediction of wind fields over the ocean surface from scatterometer data. Our approach combines a sequential update of a Gaussian approximation to the posterior with a sparse representation that allows us to treat problems with a large number of observations.
Physica A: Statistical Mechanics and its Applications | 1993
Michael Biehl; Manfred Opper
An algorithm for the training of a special multilayered feed-forward neural network is presented. The strategy is very similar to the well-known tiling algorithm, yet the resulting architecture is completely different. Neurons are added in one layer only. The output of the network is given by the product of its k hidden neurons, which, for ±1 units, is the result of the parity operation. The capacity α_c of a network trained according to the algorithm is estimated for the storage of randomly defined classifications. The asymptotic dependence is found to be α_c ≈ k ln k for k → ∞. This is in agreement with recent analytic results for the algorithm-independent storage capacity of a parity machine.
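A tiny sketch of the architecture described, with random (untrained) weights rather than the paper's tiling-like training procedure: k hidden ±1 perceptrons in a single layer whose product gives the parity-machine output:

```python
import numpy as np

def parity_machine(x, W):
    """Output of a parity machine: the product of k hidden ±1 units,
    each a perceptron sign(w_i . x)."""
    hidden = np.sign(W @ x)        # k hidden units in a single layer
    return int(np.prod(hidden))    # parity of the hidden activations

rng = np.random.default_rng(0)
k, d = 3, 10                       # hidden units, input dimension
W = rng.standard_normal((k, d))
x = rng.standard_normal(d)
print(parity_machine(x, W))        # prints +1 or -1
```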
Physica A: Statistical Mechanics and its Applications | 2001
Manfred Opper; Ole Winther
We demonstrate for the case of single-layer neural networks how an extension of the TAP mean-field approach of disorder physics can be applied to the computation of approximate averages in probabilistic models for real data.
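As background for the TAP approach, here is a sketch of the classical TAP fixed-point equations for an Ising model with SK-type couplings; it illustrates the Onsager reaction term central to TAP, not the paper's extension to real data:

```python
import numpy as np

def tap_magnetizations(J, h, iters=200, damping=0.5):
    """Damped fixed-point iteration of the classical TAP equations:
    m_i = tanh(h_i + sum_j J_ij m_j - m_i sum_j J_ij^2 (1 - m_j^2))."""
    m = np.zeros(len(h))
    for _ in range(iters):
        onsager = m * (J**2 @ (1 - m**2))   # Onsager reaction term
        m_new = np.tanh(h + J @ m - onsager)
        m = damping * m + (1 - damping) * m_new  # damping aids convergence
    return m

rng = np.random.default_rng(0)
N = 20
J = rng.standard_normal((N, N)) / np.sqrt(N)    # SK-type couplings
J = (J + J.T) / 2
np.fill_diagonal(J, 0.0)
h = 0.1 * rng.standard_normal(N)
print(tap_magnetizations(J, h))                  # approximate averages <s_i>
```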