Hans-Ulrich Bauer
Goethe University Frankfurt
Publications
Featured research published by Hans-Ulrich Bauer.
IEEE Transactions on Neural Networks | 1992
Hans-Ulrich Bauer; Klaus Pawelzik
It is shown that a topographic product P, first introduced in nonlinear dynamics, is an appropriate measure of the preservation or violation of neighborhood relations. It is sensitive to large-scale violations of the neighborhood ordering, but does not account for neighborhood ordering distortions caused by varying areal magnification factors. A vanishing value of the topographic product indicates perfect neighborhood preservation; negative (positive) values indicate that the output space dimensionality is too small (too large). In a simple example of maps from a 2D input space onto 1D, 2D, and 3D output spaces, it is demonstrated how the topographic product picks the correct output space dimensionality. In a second example, 19D speech data are mapped onto various output spaces, and it is found that a 3D output space (instead of 2D) seems to be optimally suited to the data. This is in agreement with a recent speech recognition experiment on the same data set.
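For readers who want to experiment with the measure, the sketch below computes the topographic product from a codebook and the corresponding grid positions, following the standard definition P = 1/(N(N-1)) Σ_j Σ_k log P3(j,k), where P3 is built from ratios of input-space and output-space distances to the k-th nearest neighbours. The function and variable names (`topographic_product`, `weights`, `grid`) and the use of SciPy are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.spatial.distance import cdist

def topographic_product(weights, grid):
    """Hedged sketch of the topographic product P.

    weights : (N, d_in) codebook vectors in input space V
    grid    : (N, d_out) neuron positions in output space A
    Returns P; P close to 0 suggests good neighbourhood preservation,
    P < 0 a too small and P > 0 a too large output space dimensionality.
    """
    N = len(weights)
    dV = cdist(weights, weights)            # input-space distances
    dA = cdist(grid, grid)                  # output-space distances
    # k-th nearest neighbours of each unit, excluding the unit itself
    nnV = np.argsort(dV, axis=1)[:, 1:]     # ordered by input-space distance
    nnA = np.argsort(dA, axis=1)[:, 1:]     # ordered by output-space distance
    log_sum = 0.0
    for j in range(N):
        q1 = dV[j, nnA[j]] / dV[j, nnV[j]]  # Q1(j, k)
        q2 = dA[j, nnA[j]] / dA[j, nnV[j]]  # Q2(j, k)
        log_q = np.log(q1) + np.log(q2)
        # P3(j, k) = (prod_{l<=k} Q1 * Q2)^(1/(2k)); accumulate its logarithm
        k = np.arange(1, N)
        log_p3 = np.cumsum(log_q) / (2.0 * k)
        log_sum += log_p3.sum()
    return log_sum / (N * (N - 1))
```

For a well-formed map of 2D data onto a 2D grid the value should be close to zero; forcing the same data onto a 1D chain typically drives it negative, in line with the sign convention stated in the abstract.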
International Symposium on Physical Design | 1993
Hans-Ulrich Bauer; Klaus Pawelzik
In recent neurophysiological experiments, stimulus-related neuronal oscillations were discovered in various species. The oscillations are not persistent during the whole time of stimulation, but instead seem to be restricted to rather short periods, interrupted by stochastic periods. In this contribution we argue that these observations can be explained by a bistability in the ensemble dynamics of coupled integrate-and-fire neurons. This dynamics can be cast in terms of a high-dimensional map for the time evolution of a phase density which represents the ensemble state. A numerical analysis of this map reveals the coexistence of two stationary states in a broad parameter regime when the synaptic transmission is nonlinear. One state corresponds to stochastic firing of individual neurons, the other describes a periodic activation. We demonstrate that under the influence of additional external noise the system can switch between these states, thereby reproducing the experimentally observed activity. We also investigate the connection between the nonlinearity of the synaptic transmission function and the bistability of the dynamics. For this purpose we heuristically reduce the high-dimensional assembly dynamics to a one-dimensional map, which in turn yields a simple explanation for the relation between nonlinearity and bistability in our system.
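The abstract does not spell out the reduced one-dimensional map, so the following toy iteration is only a hedged illustration of the mechanism it describes: with a sufficiently steep (nonlinear) synaptic transmission g, the map a_{t+1} = c·g(a_t) + I has two stable fixed points, and added noise makes the activity switch between a low, stochastic-like state and a high, periodically active state. The sigmoidal form of g and all parameter values are assumptions, not the authors' reduction.

```python
import numpy as np

def sigmoid(x, gain=8.0, theta=0.5):
    """Illustrative nonlinear 'synaptic transmission' function."""
    return 1.0 / (1.0 + np.exp(-gain * (x - theta)))

def iterate(a0=0.1, steps=2000, coupling=0.9, drive=0.08, noise=0.15, seed=0):
    """Toy 1-D map a_{t+1} = coupling * sigmoid(a_t) + drive + noise term.

    With a steep sigmoid the map has two stable fixed points (low and high
    activity); external noise makes the state switch between them, mimicking
    the alternation of stochastic and oscillatory episodes described above.
    """
    rng = np.random.default_rng(seed)
    a = np.empty(steps)
    a[0] = a0
    for t in range(steps - 1):
        a[t + 1] = coupling * sigmoid(a[t]) + drive + noise * rng.standard_normal()
        a[t + 1] = np.clip(a[t + 1], 0.0, 1.5)
    return a

activity = iterate()
print("fraction of time in the high-activity state:", np.mean(activity > 0.6))
```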
Network: Computation In Neural Systems | 1993
Josef Deppisch; Hans-Ulrich Bauer; Thomas B. Schillen; Peter König; Klaus Pawelzik; Theo Geisel
We focus on a phenomenon observed in cat visual cortex, namely the alternation of oscillatory and irregular neuronal activity. This aspect of the dynamics has been neglected in brain modelling, but it may be essential for the dynamic binding of different neuronal assemblies. We present a simple but physiologically plausible model network which exhibits such behaviour, in spite of its simplicity (e.g. dendritic dynamics is neglected), as an emergent network property. It comprises a number of spiking neurons which are interconnected in a mutually excitatory way. Each neuron is stimulated by several stochastic spike trains. The resulting large input variance is shown to be important for the response properties of the network, which we characterize in terms of two parameters of the autocorrelation function: the frequency and the modulation amplitude. We calculate these parameters as functions of the internal coupling strength, the external input strength and several input connectivity schemes and ...
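As a concrete illustration of the two autocorrelation parameters mentioned above, the sketch below estimates an oscillation frequency (from the lag of the first satellite peak) and a modulation amplitude (its relative height) from a binned spike train. The definitions, the helper name `autocorr_features` and the synthetic 40 Hz test signal are assumptions for illustration; the paper's exact normalisation may differ.

```python
import numpy as np

def autocorr_features(spike_counts, dt=0.001, max_lag=0.1, min_lag=0.005):
    """Estimate oscillation frequency and modulation amplitude from the
    autocorrelation of a binned spike train (illustrative definitions).

    spike_counts : 1-D array of spike counts per time bin of width dt (seconds)
    """
    x = spike_counts - spike_counts.mean()
    n_lags = int(max_lag / dt)
    ac = np.array([np.dot(x[:len(x) - k], x[k:]) for k in range(n_lags)])
    ac = ac / ac[0]                              # normalise so that AC(0) = 1
    k_min = int(min_lag / dt)                    # skip the central peak
    k0 = k_min + np.argmax(ac[k_min:])           # lag of the satellite peak
    frequency = 1.0 / (k0 * dt)                  # oscillation frequency in Hz
    modulation = ac[k0]                          # relative height of that peak
    return frequency, modulation

# usage with a synthetic, noisily oscillating spike train (~40 Hz)
rng = np.random.default_rng(1)
t = np.arange(0.0, 5.0, 0.001)
rate = 100.0 * (1.0 + 0.9 * np.sin(2 * np.pi * 40.0 * t))   # spikes per second
counts = rng.poisson(rate * 0.001)
print(autocorr_features(counts))
```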
Biological Cybernetics | 1994
Fred Wolf; Hans-Ulrich Bauer; Theo Geisel
The representations of visual hemifields in the extrastriate areas of various species exhibit field discontinuities and islands. We propose that these violations of retinotopy are a developmental consequence of the elongated shape of the respective cortical areas. To substantiate this claim, we investigated a model of activity-driven map formation. In agreement with observations, this model yields maps with field discontinuities if the cortical areas exceed a threshold elongation. Moreover, within the same model, island representations in the periphery and the area centralis can also be understood. A multistability of the solutions in the model gives a very simple explanation for the observed interindividual variability of maps in cats. The model leads to a prediction of the radial dependence of the areal magnification factor near field discontinuities, which could be accessible to a high-precision mapping experiment.
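A minimal way to explore the claim numerically is to train a standard self-organizing map from a circular 'hemifield' onto cortical sheets of different aspect ratios and compare the resulting receptive-field lattices. The sketch below does exactly that; it is a generic SOM, not the authors' specific model, and the parameters and sheet sizes are illustrative assumptions.

```python
import numpy as np

def som_retinotopy(n_rows, n_cols, n_steps=10000, sigma0=3.0, eta=0.1, seed=0):
    """Minimal SOM sketch of activity-driven retinotopic map formation.

    Maps a circular 2-D 'visual hemifield' onto an n_rows x n_cols cortical
    sheet; comparing weakly and strongly elongated sheets (e.g. 10x10 versus
    4x25) illustrates how elongation can force violations of retinotopy.
    """
    rng = np.random.default_rng(seed)
    gy, gx = np.mgrid[0:n_rows, 0:n_cols]
    grid = np.column_stack([gy.ravel(), gx.ravel()]).astype(float)
    w = rng.uniform(-0.1, 0.1, size=(n_rows * n_cols, 2))   # receptive-field centres
    for t in range(n_steps):
        # stimulus drawn uniformly from the unit disc (the 'hemifield')
        while True:
            v = rng.uniform(-1.0, 1.0, size=2)
            if v @ v <= 1.0:
                break
        bmu = np.argmin(((w - v) ** 2).sum(axis=1))          # best-matching unit
        sigma = sigma0 * (0.02 / sigma0) ** (t / n_steps)    # shrinking neighbourhood
        h = np.exp(-((grid - grid[bmu]) ** 2).sum(axis=1) / (2 * sigma ** 2))
        w += eta * h[:, None] * (v - w)                      # SOM learning step
    return w.reshape(n_rows, n_cols, 2)

compact   = som_retinotopy(10, 10)   # roughly continuous retinotopy expected
elongated = som_retinotopy(4, 25)    # strong elongation, candidate for discontinuities
```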
Physics Letters A | 1991
J. Deppisch; Hans-Ulrich Bauer; Theo Geisel
We present a new procedure for the hierarchical training of multilayer perceptrons to outputs of high precision. It achieves a dramatic increase in accuracy, e.g. by three orders of magnitude, and can reduce training time considerably. The method is applied to the prediction of chaotic systems, where we obtain the optimum error evolution for iterated predictions as well as a substantial reduction of the absolute prediction error.
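As far as it can be read from the abstract, the core idea is a staged refinement in which a later network is trained on what an earlier one gets wrong. A hedged sketch of that residual-refinement scheme, using scikit-learn's MLPRegressor on the logistic map as a stand-in for a chaotic system, might look as follows; the original procedure may differ in detail.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Sketch of hierarchical training: a second network is fitted to the residual
# error of the first, so the combined output reaches higher precision.
# (Illustrative only; not the exact procedure or data of the paper.)
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=(2000, 1))
y = 4.0 * x[:, 0] * (1.0 - x[:, 0])                 # logistic map x -> 4x(1-x)

net1 = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, tol=1e-10,
                    random_state=0).fit(x, y)
residual = y - net1.predict(x)

net2 = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, tol=1e-10,
                    random_state=1).fit(x, residual)  # learn the error of net1

combined = net1.predict(x) + net2.predict(x)
print("RMS error, single net :", np.sqrt(np.mean(residual ** 2)))
print("RMS error, hierarchy  :", np.sqrt(np.mean((y - combined) ** 2)))
```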
Biological Cybernetics | 1996
Maximilian Riesenhuber; Hans-Ulrich Bauer; Theo Geisel
The self-organizing map (SOM), a widely used algorithm for the unsupervised learning of neural maps, can be formulated in a low-dimensional ‘feature map’ variant which requires prespecified parameters (‘features’) for the description of receptive fields, or in a more general high-dimensional variant which allows self-organization of the structure of individual receptive fields as well as their arrangement in a map. We present here a new analytical method for deriving conditions for the emergence of structure in SOMs which is particularly suited for the as yet inaccessible high-dimensional SOM variant. Our approach is based on an evaluation of a map distortion function. It involves only an ansatz for the way stimuli are distributed among map neurons; the receptive fields of the map need not be known explicitly. Using this method we first calculate regions of stability for four possible states of SOMs projecting from a rectangular input space to a ring of neurons. We then analyze the transition from nonoriented to oriented receptive fields in a SOM-based model for the development of orientation maps. In both cases, the analytical results are well corroborated by the results of computer simulations.
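The distortion-function evaluation at the heart of the method can be sketched for a concrete codebook. The function below computes a distortion of the assumed form E = Σ_v Σ_r h_σ(r, s(v)) ||v − w_r||², with s(v) the winner for stimulus v and h_σ a Gaussian neighbourhood on the output grid; the paper's exact definition and normalisation are not given in the abstract, so treat this form as an assumption.

```python
import numpy as np

def map_distortion(stimuli, weights, grid, sigma):
    """Hedged sketch of a SOM map distortion function.

    stimuli : (M, d) input vectors
    weights : (N, d) receptive-field centres of the N map neurons
    grid    : (N, k) output-space positions of the N map neurons
    sigma   : neighbourhood width on the output grid
    """
    sq = ((stimuli[:, None, :] - weights[None, :, :]) ** 2).sum(-1)   # (M, N) squared errors
    winners = sq.argmin(axis=1)                                       # s(v): winning unit per stimulus
    gdist = ((grid[winners][:, None, :] - grid[None, :, :]) ** 2).sum(-1)
    h = np.exp(-gdist / (2.0 * sigma ** 2))                           # neighbourhood weights
    return (h * sq).sum() / len(stimuli)
```

Evaluating this quantity for competing map configurations (e.g. nonoriented versus oriented receptive fields) and comparing the values is the kind of stability comparison the abstract alludes to.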
International Symposium on Neural Networks | 1990
Hans-Ulrich Bauer; Theo Geisel
Neural networks for perceptual tasks such as speech recognition must provide even more invariances than nets dealing with static problems, e.g. invariance under presentation speed fluctuations. The authors show that multilayer perceptrons with feedback over several layers (FMLPs) can meet these requirements. FMLPs can be trained simply with the open-loop learning rule. An analytical criterion for the stability of the resulting feedback states is given. By optimizing the output pattern representation, the stability of the feedback states can be improved. In the same way, the basins of attraction of the stable states can be enlarged. The performance with respect to presentation speed fluctuations is demonstrated in an example using three coupled FMLPs. In a continuous input sequence of letters, online detection of words is achieved, even when the presentation speed fluctuates over a wide range.
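To make the open-loop training idea concrete, the sketch below feeds the previous target output back into the input during training (open loop) and the network's own previous output back during recognition (closed loop). The data, the dimensions and the use of a generic regressor are placeholders, not the architecture or word-detection task of the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Open-loop training of a feedback MLP (hedged sketch with placeholder data).
rng = np.random.default_rng(0)
T, d_in, d_out = 500, 6, 3
frames  = rng.uniform(-1.0, 1.0, size=(T, d_in))   # input sequence (e.g. speech frames)
targets = rng.uniform(0.0, 1.0, size=(T, d_out))   # desired output sequence

# open-loop training set: input = [current frame, previous *target* output]
X = np.hstack([frames[1:], targets[:-1]])
net = MLPRegressor(hidden_layer_sizes=(30,), max_iter=2000,
                   random_state=0).fit(X, targets[1:])

# closed-loop recognition: feed the network's own previous output back
out = np.zeros(d_out)
for t in range(1, T):
    out = net.predict(np.hstack([frames[t], out])[None, :])[0]
```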
Archive | 2000
Hans-Ulrich Bauer; Jochen Braun
The visual perception of objects is thought to occur in several steps. After the extraction of local features in a first stage, these features have to be reintegrated for a provisional parsing of the scene into several candidate objects (visual segmentation). The reintegration phase involves processes such as the perceptual grouping of local stimuli according to Gestalt laws, as well as the binding of local features across different feature channels. After segmentation of the scene, a single candidate object is selected for recognition, memorization, coordination of motor action, etc. (visual selection). Visual selection is inherently serial and is believed to be accomplished by shifts of visual attention.
International Conference on Artificial Neural Networks | 1992
J. Deppisch; Hans-Ulrich Bauer; Thomas B. Schillen; Peter König; Klaus Pawelzik; Theo Geisel
Switching between oscillatory and stochastic states in single-electrode signals from cat visual cortex is an experimental phenomenon which had not been included in recent models of cortical oscillations. Here, we present a model network of spiking neurons which exhibits such alternating responses as an emergent network property. Simulations of this model reveal a detailed agreement between numerical results and experimental observations.
Archive | 1990
Hans-Ulrich Bauer; Theo Geisel
Neural networks for speech recognition must cope with several invariance requirements. These include invariance with respect to presentation speed fluctuations as well as tolerance towards the coarticulation of patterns. We show that multilayer perceptrons with feedback from the output to the input layer meet these requirements. They can be trained simply, their feedback states can be made stable, and they are automatically robust with respect to presentation speed fluctuations. The coarticulation performance is expressed by the detection probability for gradual transitions between patterns. If this performance is not yet satisfactory, it can be improved by extending the learning set. These properties make feedback multilayer perceptrons promising candidates for speech recognition.