Sven Buchholz
University of Kiel
Publications
Featured research published by Sven Buchholz.
International Journal of Neural Systems | 2008
Sven Buchholz; Nicolas Le Bihan
For polarized signals, which arise in many application fields, a statistical framework in terms of quaternionic random processes is proposed. Based on it, the ability of real-, complex-, and quaternionic-valued multi-layer perceptrons (MLPs) to perform classification tasks for such signals is evaluated. For the multi-dimensional neural networks, the relevance of class label representations is discussed. For signal-to-noise separation, it is shown that the quaternionic MLP yields an optimal solution. Results on the classification of two different polarized signals are also reported.
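To give a concrete picture of what a quaternionic MLP computes at the neuron level, here is a minimal, hypothetical sketch (the names and the split logistic activation are assumptions, not taken from the paper): the weight acts on the input through the Hamilton product, and a real activation is applied to each of the four components.

```python
import numpy as np

def hamilton_product(p, q):
    """Hamilton product of two quaternions given as length-4 arrays (w, x, y, z)."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def quaternion_neuron(inputs, weights, bias):
    """One quaternion-valued neuron: quaternionic weighted sum plus bias, followed by
    a split (componentwise) logistic activation -- an illustrative choice only."""
    s = np.array(bias, dtype=float)
    for w, x in zip(weights, inputs):
        s += hamilton_product(w, x)
    return 1.0 / (1.0 + np.exp(-s))

# Example: one neuron with two quaternionic inputs.
x = [np.array([1.0, 0.0, 0.5, 0.0]), np.array([0.0, 1.0, 0.0, -0.5])]
w = [np.array([0.2, -0.1, 0.0, 0.3]), np.array([0.5, 0.0, 0.1, 0.0])]
b = np.array([0.1, 0.0, 0.0, 0.0])
print(quaternion_neuron(x, w, b))
```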
Neural Networks | 2008
Sven Buchholz; Gerald Sommer
We study the framework of Clifford algebra for the design of neural architectures capable of processing different geometric entities. The benefits of this model-based computation over standard real-valued networks are demonstrated. One particular example thereof is the new class of so-called Spinor Clifford neurons. The paper provides a sound theoretical basis for Clifford neural computation. For that purpose, the new concepts of isomorphic neurons and isomorphic representations are introduced. A unified training rule for Clifford MLPs is also provided. The topic of activation functions for Clifford MLPs is discussed in detail for all two-dimensional Clifford algebras for the first time.
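The "Spinor Clifford neuron" mentioned above can be contrasted with an ordinary Clifford neuron in one line of algebra; the notation below is illustrative and not necessarily the paper's, with the quaternion case given as the familiar example.

```latex
% One-sided Clifford neuron vs. two-sided (spinor) weight propagation -- illustrative notation.
\[
  y_{\mathrm{Clifford}} = g\bigl(w\,x + \theta\bigr),
  \qquad
  y_{\mathrm{spinor}}   = g\bigl(w\,x\,\tilde{w} + \theta\bigr).
\]
% Quaternion special case: for a unit quaternion w and a pure quaternion
% x = x_1 i + x_2 j + x_3 k, the sandwich product with the conjugate,
%   w\,x\,\bar{w},
% is again pure and equals x rotated in 3D -- the familiar quaternion rotation formula.
```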
international symposium on neural networks | 2000
Sven Buchholz; Gerald Sommer
We present a novel MLP-type neural network based on hyperbolic numbers, the hyperbolic multilayer perceptron (HMLP). The neurons of the HMLP compute 2D-hyperbolic orthogonal transformations as weight propagation functions. The HMLP can therefore be seen as the hyperbolic counterpart of the known complex MLP. The HMLP is proven to be a universal approximator. Furthermore, a suitable backpropagation algorithm for it is derived. It is shown by experiments that the HMLP can learn tasks with underlying hyperbolic properties much more accurately and efficiently than a complex MLP and an ordinary MLP.
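A minimal sketch of the hyperbolic (split-complex) arithmetic underlying the HMLP's weight propagation, assuming a split tanh activation for illustration (function names are hypothetical, not from the paper):

```python
import numpy as np

def hyperbolic_mul(a, b):
    """Product of two hyperbolic (split-complex) numbers a = a0 + a1*j and b = b0 + b1*j,
    where j*j = +1 (in contrast to the complex unit, i*i = -1)."""
    a0, a1 = a
    b0, b1 = b
    return np.array([a0 * b0 + a1 * b1, a0 * b1 + a1 * b0])

def hyperbolic_neuron(inputs, weights, bias):
    """A single HMLP-style neuron: hyperbolic weighted sum plus bias, followed by a
    split (componentwise) tanh -- an illustrative activation choice only."""
    s = np.array(bias, dtype=float)
    for w, x in zip(weights, inputs):
        s += hyperbolic_mul(w, x)
    return np.tanh(s)

# Multiplication by a weight of unit hyperbolic norm (w0^2 - w1^2 = 1) preserves the
# indefinite form x0^2 - x1^2, i.e. it acts as a 2D hyperbolic orthogonal map.
print(hyperbolic_neuron([np.array([1.0, 0.5])], [np.array([1.25, 0.75])], np.array([0.0, 0.0])))
```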
international symposium on neural networks | 2008
Tohru Nitta; Sven Buchholz
In this paper, the basic properties, especially the decision boundaries, of the hyperbolic neurons used in hyperbolic neural networks are investigated. In addition, a non-split hyperbolic sigmoid activation function is proposed.
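For orientation, the decision boundary of a single hyperbolic neuron can be written out under the usual split-activation assumption; the paper's non-split hyperbolic sigmoid is not reproduced here.

```latex
% A hyperbolic neuron with weight w = w_0 + w_1 j, bias \theta = \theta_0 + \theta_1 j,
% and input x = x_0 + x_1 j (with j^2 = +1) computes
\[
  w\,x + \theta
  = \underbrace{(w_0 x_0 + w_1 x_1 + \theta_0)}_{s_0}
  + \underbrace{(w_1 x_0 + w_0 x_1 + \theta_1)}_{s_1}\,j .
\]
% Thresholding each component separately, every decision boundary \{s_k = 0\} is a
% straight line in the (x_0, x_1) plane; the two normal vectors (w_0, w_1) and
% (w_1, w_0) are mirror images of each other across the diagonal x_0 = x_1.
```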
international conference on artificial neural networks | 2007
Sven Buchholz; Kanta Tachibana; Eckhard Hitzer
Neural computation in Clifford algebras, which include the familiar complex numbers and quaternions as special cases, has recently become an active research field. As always, neurons are the atoms of computation. The paper provides a general notion of the Hessian matrix of Clifford neurons of an arbitrary algebra. This new result on the dynamics of Clifford neurons then allows the computation of optimal learning rates. A thorough discussion of error surfaces together with simulation results for different neurons is also provided. The presented results should give rise to very efficient second-order training methods for Clifford multilayer perceptrons in the future.
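The connection between the Hessian and optimal learning rates can be recalled from the standard quadratic-error argument; the snippet below is generic gradient-descent reasoning, not the paper's Clifford-specific derivation.

```python
import numpy as np

def learning_rate_bounds(hessian):
    """For a locally quadratic error E(w) ~ 1/2 (w - w*)^T H (w - w*), plain gradient
    descent is stable for eta < 2 / lambda_max(H), and eta = 2 / (lambda_max + lambda_min)
    balances the fastest and slowest error modes.  Generic result, purely illustrative."""
    eigvals = np.linalg.eigvalsh(hessian)          # ascending eigenvalues of symmetric H
    lam_min, lam_max = eigvals[0], eigvals[-1]
    return 2.0 / lam_max, 2.0 / (lam_max + lam_min)

# Toy 2x2 Hessian with condition number 10.
H = np.array([[10.0, 0.0], [0.0, 1.0]])
print(learning_rate_bounds(H))                     # (0.2, 0.1818...)
```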
international symposium on neural networks | 1996
E. Bayro Corrochano; Sven Buchholz; Gerald Sommer
This paper presents a novel self-organizing RBF-type neural network and introduces geometric algebra into the neural computing field. Real-valued neural nets for function approximation require feature enhancement, dilation, and rotation operations and are limited by the Euclidean metric. This coordinate-free geometric framework allows patterns to be processed between layers in a particular dimension and with a desired metric, which is possible only due to the promising projective split. The potential of such nets working in a Clifford algebra C(V_{p,q}) is shown by a simple application of frame coordination in robotics.
IWMM'04/GIAE'04 Proceedings of the 6th international conference on Computer Algebra and Geometric Algebra with Applications | 2004
Sven Buchholz; Gerald Sommer
Averaging measured data is an important issue in computer vision and robotics. Integrating the pose of an object measured with multiple cameras into a single mean pose is one such example. In many applications the data does not belong to a vector space. Instead, it often lies on a non-linear group manifold, as is the case for orientation data and the group of three-dimensional rotations SO(3). Averaging on the manifold requires the use of the associated Riemannian metric, which results in a rather complicated task. Therefore, the Euclidean mean with best orthogonal projection is often used as an approximation. In SO(3) this can be done with rotation matrices or quaternions. Clifford algebra, as a generalization of quaternions, allows a general treatment of such approximated averaging for all classical groups. Results for the two-dimensional Lorentz group SO(1,2) and the related groups SL(2,ℝ) and SU(1,1) are presented. The advantage of the proposed Clifford framework lies in its compactness and ease of use.
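The "Euclidean mean with best orthogonal projection" can be illustrated for SO(3) with unit quaternions: average componentwise, then project back onto the unit sphere. The sketch below uses hypothetical names and approximates the idea; it is not code from the paper.

```python
import numpy as np

def approx_rotation_mean(quats):
    """Approximate mean of rotations given as unit quaternions (w, x, y, z): Euclidean
    componentwise mean followed by projection back onto the unit 3-sphere (normalization).
    The q ~ -q ambiguity is handled by aligning every sample with the first one."""
    quats = np.asarray(quats, dtype=float)
    ref = quats[0]
    aligned = np.where((quats @ ref)[:, None] < 0.0, -quats, quats)
    mean = aligned.mean(axis=0)
    return mean / np.linalg.norm(mean)

# Example: noisy unit-quaternion measurements of one rotation.
rng = np.random.default_rng(0)
base = np.array([0.9, 0.1, 0.3, 0.3])
base /= np.linalg.norm(base)
samples = base + 0.05 * rng.standard_normal((20, 4))
samples /= np.linalg.norm(samples, axis=1, keepdims=True)
print(approx_rotation_mean(samples))
```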
Geometric computing with Clifford algebras | 2001
Sven Buchholz; Gerald Sommer
This is the first of two chapters on neural computation in Clifford algebra. The name Clifford algebra refers to its inventor, William K. Clifford (1845–1879). We restrict ourselves to Clifford algebras generated by non-degenerate quadratic forms; thus, Clifford algebras are non-degenerate geometric algebras hereafter.
Lecture Notes in Computer Science | 1997
Eduardo Bayro-Corrochano; Sven Buchholz
The representation of the external world in biological creatures appears to be defined in terms of geometry. This suggests that researchers should look for suitable mathematical systems with powerful geometric and algebraic characteristics. In such a mathematical context, the design and implementation of neural networks will certainly be more advantageous. This paper presents the generalization of feedforward neural networks in the Clifford or geometric algebra framework. The efficiency of the geometric neural nets indicates a step forward in the design of algorithms for multidimensional artificial learning.
international symposium on neural networks | 2008
Minh Tuan Pham; Kanta Tachibana; Eckhard Hitzer; Sven Buchholz; Tomohiro Yoshikawa; Takeshi Furuhashi