
Publication


Featured research published by Robert Urbanczik.


Physical Review E | 2000

Online learning with ensembles.

Robert Urbanczik

Supervised online learning with an ensemble of students randomized by the choice of initial conditions is analyzed. For the case of the perceptron learning rule, asymptotically the same improvement in the generalization error of the ensemble compared to the performance of a single student is found as in Gibbs learning. For more optimized learning rules, however, using an ensemble yields no improvement. This is explained by showing that for any learning rule f a transformed rule f̃ exists, such that a single student using f̃ has the same generalization behavior as an ensemble of f students.
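The setup described in this abstract can be illustrated with a small simulation (a minimal sketch, not the paper's statistical-mechanics calculation): K perceptron students share the same online example stream, differ only in their random initial conditions, are trained with the perceptron rule, and are combined by majority vote. All variable names and parameter choices below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, steps = 50, 9, 20000          # input dim, ensemble size, online examples

teacher = rng.standard_normal(N)
teacher /= np.linalg.norm(teacher)
students = rng.standard_normal((K, N))   # randomized initial conditions

for _ in range(steps):
    x = rng.standard_normal(N)
    y = np.sign(teacher @ x)
    for k in range(K):
        # perceptron rule: update only when this student misclassifies
        if np.sign(students[k] @ x) != y:
            students[k] += y * x

# compare a single student against the majority-vote ensemble on test data
X_test = rng.standard_normal((5000, N))
y_test = np.sign(X_test @ teacher)
single_err = np.mean(np.sign(X_test @ students[0]) != y_test)
votes = np.sign(np.sign(X_test @ students.T).sum(axis=1))
ensemble_err = np.mean(votes != y_test)
```

Consistent with the abstract, the improvement from voting over students that differ only in their initial conditions is modest for the plain perceptron rule; the paper's point is that it matches the Gibbs-learning improvement asymptotically and vanishes for optimized rules.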


European Physical Journal B | 1999

Statistical physics and practical training of soft-committee machines

Martin Ahr; Michael Biehl; Robert Urbanczik

Equilibrium states of large layered neural networks with differentiable activation function and a single, linear output unit are investigated using the replica formalism. The quenched free energy of a student network with a very large number of hidden units learning a rule of perfectly matching complexity is calculated analytically. The system undergoes a first order phase transition from unspecialized to specialized student configurations at a critical size of the training set. Computer simulations of learning by stochastic gradient descent from a fixed training set demonstrate that the equilibrium results describe quantitatively the plateau states which occur in practical training procedures at sufficiently small but finite learning rates.


Physical Review E | 2003

Learning curves for mutual information maximization.

Robert Urbanczik

An unsupervised learning procedure based on maximizing the mutual information between the outputs of two networks receiving different but statistically dependent inputs is analyzed [S. Becker and G. Hinton, Nature (London) 355, 161 (1992)]. For a generic data model, I show that in the large sample limit the structure in the data is recognized by mutual information maximization. For a more restricted model, where the networks are similar to perceptrons, I calculate the learning curves for zero-temperature Gibbs learning. These show that convergence can be rather slow, and a way of regularizing the procedure is considered.
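The Becker–Hinton objective referenced here can be sketched for the simplest case (a hedged toy illustration, not the paper's Gibbs-learning analysis): two linear units receive noisy views of a common latent signal, and gradient ascent maximizes 0.5·log Var(a+b) − 0.5·log Var(a−b), which grows as the two outputs become statistically dependent. All names, the noise level, and the step-size schedule are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 20, 4000
u1, u2 = rng.standard_normal((2, N))
u1 /= np.linalg.norm(u1)
u2 /= np.linalg.norm(u2)

s = rng.standard_normal(P)                      # common latent signal
X1 = np.outer(s, u1) + 0.3 * rng.standard_normal((P, N))
X2 = np.outer(s, u2) + 0.3 * rng.standard_normal((P, N))

w1 = rng.standard_normal(N)
w2 = rng.standard_normal(N)
eta = 0.05
for _ in range(800):
    a, b = X1 @ w1, X2 @ w2
    Vp, Vm = np.mean((a + b) ** 2), np.mean((a - b) ** 2)
    # gradient of 0.5*log Vp - 0.5*log Vm with respect to each weight vector
    g1 = X1.T @ (a + b) / (P * Vp) - X1.T @ (a - b) / (P * Vm)
    g2 = X2.T @ (a + b) / (P * Vp) + X2.T @ (a - b) / (P * Vm)
    w1 += eta * g1
    w2 += eta * g2
    w1 /= np.linalg.norm(w1)                    # fix the scale; the ratio
    w2 /= np.linalg.norm(w2)                    # alone does not pin it down

a, b = X1 @ w1, X2 @ w2
corr = np.corrcoef(a, b)[0, 1]
align1 = abs(w1 @ u1)                           # overlap with the true direction
```

After training, the two outputs are strongly correlated and each weight vector has aligned with the hidden direction of its own input channel, i.e. the structure in the data has been recognized, as the large-sample result in the abstract asserts.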


Physica A: Statistical Mechanics and its Applications | 2001

Training multilayer perceptrons by principal component analysis

Michael Biehl; Christoph Bunzmann; Robert Urbanczik

We present a training algorithm for multilayer perceptrons which relates to the technique of principal component analysis. The latter is performed with respect to a correlation matrix which is computed from the example inputs and their target outputs. For large networks the novel procedure requires far fewer examples for good generalization than traditional on-line algorithms.
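The construction in this abstract can be sketched numerically (a toy illustration under stated assumptions, not the paper's algorithm verbatim): assuming a teacher with a quadratic hidden-unit activation, chosen because it makes the spectrum easy to verify, the correlation matrix C_ij = ⟨y x_i x_j⟩ built from example inputs and their target outputs has its top principal components spanning the teacher's hidden-weight space.

```python
import numpy as np

rng = np.random.default_rng(2)
N, K, P = 10, 2, 20000        # input dim, hidden units, examples

# teacher: K orthonormal hidden units with quadratic activation (assumption)
W, _ = np.linalg.qr(rng.standard_normal((N, K)))
X = rng.standard_normal((P, N))
y = ((X @ W) ** 2).sum(axis=1)            # target outputs

# correlation matrix from the example inputs and their target outputs
C = (X * y[:, None]).T @ X / P            # C_ij = <y x_i x_j>

# for this teacher C ≈ K·I + 2 Σ_k w_k w_k^T, so the K largest
# eigenvalues separate from the bulk and their eigenvectors span
# the teacher's hidden-weight subspace
vals, vecs = np.linalg.eigh(C)            # eigenvalues in ascending order
V = vecs[:, -K:]                          # top-K principal directions

# subspace overlap between recovered and true hidden weights (1 = perfect)
overlap = np.linalg.norm(V.T @ W, ord="fro") ** 2 / K
```

Because the principal components are estimated from a single pass over the data, the sample cost is governed by the eigenvalue gap rather than by the slow specialization dynamics of online gradient descent, which is the intuition behind the "far fewer examples" claim.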


Journal of Physics A | 1999

Noisy regression and classification with continuous multilayer networks

Martin Ahr; Michael Biehl; Robert Urbanczik

We investigate zero-temperature Gibbs learning for two classes of unrealizable rules which play an important role in practical applications of multilayer neural networks with differentiable activation functions: classification problems and noisy regression problems. Considering one step of replica symmetry breaking, we surprisingly find that for sufficiently large training sets the stable state is replica symmetric even though the target rule is unrealizable. Furthermore, the classification problem is shown to be formally equivalent to the noisy regression problem.


Physical Review Letters | 2001

Efficiently learning multilayer perceptrons.

Christoph Bunzmann; Michael Biehl; Robert Urbanczik


Physical Review E | 2005

Efficient training of multilayer perceptrons using principal component analysis

Christoph Bunzmann; Michael Biehl; Robert Urbanczik


European Symposium on Artificial Neural Networks | 2002

Supervised Learning in Committee Machines by PCA

Christoph Bunzmann; Michael Biehl; Robert Urbanczik



Journal of Physics A | 1998

Letter to the Editor: On-line learning in a discrete state space

Wolfgang Kinzel; Robert Urbanczik
