Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Heinrich Braun is active.

Publication


Featured research published by Heinrich Braun.


International Symposium on Neural Networks | 1993

A direct adaptive method for faster backpropagation learning: the RPROP algorithm

Martin A. Riedmiller; Heinrich Braun

A learning algorithm for multilayer feedforward networks, RPROP (resilient propagation), is proposed. To overcome the inherent disadvantages of pure gradient descent, RPROP performs a local adaptation of the weight updates according to the behavior of the error function. In contrast to other adaptive techniques, the effect of the RPROP adaptation process is not blurred by the unforeseeable influence of the size of the derivative but depends only on the temporal behavior of its sign. This leads to an efficient and transparent adaptation process. The capabilities of RPROP are shown in comparison to other adaptive techniques.
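
To make the sign-based rule concrete, here is a minimal NumPy sketch of an RPROP-style update. Parameter names and defaults (eta_plus = 1.2, eta_minus = 0.5) follow common conventions rather than the paper verbatim, and the weight-backtracking step of the full algorithm is omitted.

```python
import numpy as np

def rprop_step(w, grad, prev_grad, step, eta_plus=1.2, eta_minus=0.5,
               step_min=1e-6, step_max=50.0):
    """One RPROP-style update: each weight has its own step size, adapted
    from the sign of its gradient only, never from its magnitude."""
    sign_change = grad * prev_grad
    # Same sign as in the previous step: grow the step size; sign flip: shrink it.
    step = np.where(sign_change > 0, np.minimum(step * eta_plus, step_max), step)
    step = np.where(sign_change < 0, np.maximum(step * eta_minus, step_min), step)
    # Move each weight against the sign of its gradient by its own step size.
    w = w - np.sign(grad) * step
    # After a sign flip, zero the stored gradient so the next call neither
    # grows nor shrinks that step (the usual RPROP convention).
    grad = np.where(sign_change < 0, 0.0, grad)
    return w, grad, step
```

The caller computes grad over the whole training set each epoch and feeds the returned grad and step back into the next call.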


Parallel Problem Solving from Nature | 1990

On Solving Travelling Salesman Problems by Genetic Algorithms

Heinrich Braun

We present a genetic algorithm that solves traveling salesman problems with up to 442 cities to optimality. Mühlenbein et al. [MGK 88], [MK 89] proposed a genetic algorithm for the traveling salesman problem which generates very good, but not optimal, solutions for instances with 442 and 531 cities. We improve on this approach by strengthening all of its basic components. For our experimental investigations we used the traveling salesman problems TSP(i) with i cities, for i = 137, 202, 229, 318, 431, 442, 666, which were solved to optimality in [CP 80], [GH 89].
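
As a rough illustration of the overall scheme, and emphatically not of the operators actually used in the paper or in [MGK 88], a bare-bones genetic algorithm for the TSP might look like the sketch below; dist is a symmetric distance matrix, and all parameters are illustrative.

```python
import random

def tour_length(tour, dist):
    """Total length of a closed tour over the distance matrix dist."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def order_crossover(p1, p2):
    """Copy a random slice from parent 1, fill the rest in parent-2 order."""
    n = len(p1)
    a, b = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[a:b] = p1[a:b]
    rest = [c for c in p2 if c not in child[a:b]]
    child[:a] = rest[:a]
    child[b:] = rest[a:]
    return child

def evolve(dist, pop_size=50, generations=200):
    n = len(dist)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: tour_length(t, dist))
        parents = pop[:pop_size // 2]  # truncation selection
        children = [order_crossover(random.choice(parents), random.choice(parents))
                    for _ in range(pop_size - len(parents))]
        for c in children:             # inversion mutation
            if random.random() < 0.3:
                i, j = sorted(random.sample(range(n), 2))
                c[i:j] = reversed(c[i:j])
        pop = parents + children
    return min(pop, key=lambda t: tour_length(t, dist))
```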


Archive | 1993

Evolving Neural Feedforward Networks

Heinrich Braun; Joachim Weisbrod

For many practical problem domains the use of neural networks has led to very satisfactory results. Nevertheless, the choice of an appropriate, problem-specific network architecture remains a poorly understood task. Given an actual problem, one can choose a few different architectures, train each of them a few times, and finally select the architecture with the best behaviour. But, of course, totally different and much better suited topologies may exist. In this paper we present a genetic-algorithm-driven network generator that evolves neural feedforward network architectures for specific problems. Our system ENZO optimizes both the network topology and the connection weights at the same time, thereby saving an order of magnitude in necessary learning time. Together with our new concept for solving the crucial neural-network problem of permuted internal representations, this approach provides an efficient and successful crossover operator, which makes ENZO well suited to managing the large networks needed in application-oriented domains. In experiments with three different applications our system generated very successful networks; the generated topologies show marked improvements in network size, learning time, and generalization ability.
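
A hedged sketch of the encoding idea: representing a feedforward network as weights plus a boolean connection mask lets mutation edit the topology while offspring inherit the parental weights, so training resumes instead of restarting. All names are illustrative; ENZO's actual operators, in particular its permutation-aware crossover, are more involved.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_parent(n_in, n_hidden, n_out):
    """A feedforward net: boolean masks select which connections exist."""
    return {
        "w1": rng.normal(0, 0.5, (n_in, n_hidden)),
        "m1": np.ones((n_in, n_hidden), dtype=bool),
        "w2": rng.normal(0, 0.5, (n_hidden, n_out)),
        "m2": np.ones((n_hidden, n_out), dtype=bool),
    }

def mutate(parent, p_del=0.05, p_add=0.01):
    """Offspring keeps the parental weights while the masks, i.e. the
    topology, are perturbed by deleting and re-adding connections."""
    child = {k: v.copy() for k, v in parent.items()}
    for m in ("m1", "m2"):
        mask = child[m]
        mask &= rng.random(mask.shape) > p_del   # delete some connections
        mask |= rng.random(mask.shape) < p_add   # occasionally add some back
    return child

def forward(net, x):
    """Masked forward pass: pruned connections contribute nothing."""
    h = np.tanh(x @ (net["w1"] * net["m1"]))
    return h @ (net["w2"] * net["m2"])
```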


Parallel Problem Solving from Nature | 1994

ENZO-M - A Hybrid Approach for Optimizing Neural Networks by Evolution and Learning

Heinrich Braun; Peter Zagorski

ENZO-M combines two successful search techniques operating on two different timescales: learning (gradient descent) for the fine-tuning of each offspring, and evolution for coarse optimization steps on the network topology. Our evolutionary algorithm is therefore a metaheuristic built on top of the best available local heuristic. By training each offspring with fast gradient methods, the search space of the evolutionary algorithm is effectively reduced to the set of local optima.
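
The two-timescale loop can be summarized in a few lines. `mutate_topology`, `train_by_gradient_descent`, and `fitness` (lower is better here) are placeholders standing in for ENZO-M's actual components.

```python
def evolve_and_learn(population, fitness, mutate_topology,
                     train_by_gradient_descent, generations=50):
    """Evolution makes coarse topology steps; gradient descent fine-tunes
    every offspring before it is evaluated."""
    for _ in range(generations):
        # Coarse timescale: perturb the topology of each parent.
        offspring = [mutate_topology(parent) for parent in population]
        # Fine timescale: train each offspring to a local optimum, so
        # selection effectively searches only the set of local optima.
        offspring = [train_by_gradient_descent(net) for net in offspring]
        # Keep the best networks from parents and trained offspring.
        population = sorted(population + offspring, key=fitness)[:len(population)]
    return population[0]
```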


Archive | 1996

Interactive Activation and Competition

Heinrich Braun; Johannes Feulner; Rainer Malaka

The network model Interactive Activation and Competition (IAC) goes back to McClelland and Rumelhart. A prominent application of IAC is modeling the human ability to recognize written words faster than single letters, and familiar words faster than unfamiliar ones. IAC networks, however, cannot learn from examples: the network topology as well as the connection weights between the neurons must be set "by hand".
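
For reference, the classical IAC activation update (after McClelland and Rumelhart) can be sketched as follows; parameter names and defaults follow the usual textbook formulation rather than this chapter.

```python
import numpy as np

def iac_step(a, W, ext, a_max=1.0, a_min=-0.2, rest=-0.1, decay=0.1, dt=0.1):
    """One synchronous update of all unit activations in an IAC network."""
    # Only units with positive activation send output (standard IAC rule).
    net = W @ np.clip(a, 0.0, None) + ext
    # Excitation pushes toward the ceiling, inhibition toward the floor.
    delta = np.where(net > 0, (a_max - a) * net, (a - a_min) * net)
    delta -= decay * (a - rest)  # decay back toward the resting level
    return np.clip(a + dt * delta, a_min, a_max)
```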


Archive | 1995

Massively Parallel Training of Multi Layer Perceptrons With Irregular Topologies

D. Koll; Martin A. Riedmiller; Heinrich Braun

In this paper we present an approach to the training of feedforward neural networks on massively parallel SIMD architectures. In order to cover a wide field of applications, we focus our attention on the flexibility of the load-balancing routines. Our approach is characterized by three important properties: (1) all four types of parallelism inherent in the training phase are exploited; (2) in a preprocessing step, neural networks are transformed into equivalent topologies better suited to parallel computation; (3) each learning task can be parallelized in a number of different ways, the best of which is chosen according to estimates of computing efficiency.
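
Of the four kinds of parallelism, the simplest to illustrate is parallelism over training patterns. In this hedged NumPy sketch the batch dimension plays the role of the SIMD processor array; the paper's load-balancing and topology-transformation machinery is not reproduced.

```python
import numpy as np

def batched_forward(W1, W2, X):
    """Forward pass for a two-layer MLP over a whole batch of patterns:
    each row of X is one training pattern, so the matrix products spread
    the work over all patterns at once."""
    H = np.tanh(X @ W1)      # hidden activations for every pattern
    return np.tanh(H @ W2)   # output activations, still batched
```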


World Congress on Computational Intelligence | 1994

ENZO-II-a powerful design tool to evolve multilayer feed forward networks

Heinrich Braun; Peter Zagorski

ENZO-II combines two successful search techniques: gradient descent for efficient local weight optimization, and evolution for global topology optimization. It thus takes full advantage of the efficiently computable gradient information without being trapped in local minima. Through knowledge transfer by inheriting parental weights, learning is sped up by one to two orders of magnitude, and the expected fitness of an offspring is far above the average for its network topology. Moreover, ENZO-II impressively thins out the topology through the cooperation of a discrete mutation operator with a continuous weight-decay method. In particular, ENZO-II also tries to cut the connections to possibly redundant input units; it therefore not only supports the user in the network design but also recognizes redundant input units.
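
The cooperation between continuous weight decay and discrete pruning might be sketched like this; the threshold and all names are illustrative, not taken from the paper.

```python
import numpy as np

def decay_and_prune(w_in, mask, decay=1e-3, threshold=1e-2):
    """w_in: input-to-hidden weight matrix; mask: surviving connections.
    Weight decay (continuous) shrinks superfluous weights; pruning
    (discrete) then removes those below the threshold."""
    w_in = (1.0 - decay) * w_in
    mask = mask & (np.abs(w_in) > threshold)
    # An input unit whose outgoing connections are all pruned is redundant.
    redundant_inputs = ~mask.any(axis=1)
    return w_in * mask, mask, redundant_inputs
```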


Archive | 2000

Evolution — A Paradigm for Constructing Intelligent Agents

Heinrich Braun

Attractive titles run the risk of promising more than the text can deliver. To clarify my intentions, I begin the introduction by describing what this text will not cover. First of all, I will not claim that any artificial evolution can indeed construct intelligent agents. Of course, there seems to be a constructive proof that, in biology, evolution can construct intelligent agents, e.g., human beings. But the situation is comparable to the state of the art in the paradigm of neural networks: there, too, a constructive proof seems to exist that natural neural networks can implement intelligence, yet no artificial neural network has gained common acknowledgment as being truly intelligent. We therefore set the term "intelligent" aside in the following considerations, although it clearly remains our ultimate goal to construct intelligent agents.


Archive | 1996

Das symmetrische Hopfield-Modell

Heinrich Braun; Johannes Feulner; Rainer Malaka

In 1982, J. J. Hopfield proposed a model of a neural associative memory that has since become very popular. The task to be solved is the following: p given patterns (binary vectors of length N over {+1, -1}) are to be stored in a network and retrieved again. This should happen in a fault-tolerant way, i.e., a noisy or incomplete pattern should always be recognized as well as possible.
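
A minimal sketch of the model in the translated abstract, using the standard Hebbian storage rule and asynchronous retrieval; this is the textbook formulation, not code from the chapter. Here patterns is a (p, N) array of ±1 vectors.

```python
import numpy as np

def store(patterns):
    """Hebbian rule: W = (1/N) * sum of outer products over all patterns,
    with a zeroed diagonal (no self-coupling)."""
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, steps=200, seed=0):
    """Asynchronous updates: flip one randomly chosen unit at a time until
    the state settles (ideally) into the nearest stored pattern."""
    rng = np.random.default_rng(seed)
    x = x.astype(float).copy()
    for _ in range(steps):
        i = rng.integers(len(x))
        x[i] = 1.0 if W[i] @ x >= 0 else -1.0
    return x
```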


Archive | 1996

Optimieren mit neuronalen Netzen

Heinrich Braun; Johannes Feulner; Rainer Malaka

When optimizing with neural networks, problems are encoded into "energy" functions in such a way that their minimum corresponds exactly to an optimal solution. We have already encountered such energy functions in the symmetric Hopfield model. The aim now is to place the solutions of optimization problems, rather than stored patterns, into the local minima of a neural network's energy function. The neuron model used here is a simplified Hopfield-Tank model; the original model was proposed by J. J. Hopfield and D. W. Tank as an analog electronic network.
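
The encoding idea can be shown on a toy problem: choose exactly k of n items at minimum total cost, written as an energy function whose global minima are exactly the optimal feasible selections. The brute-force minimization below merely verifies the encoding; the chapter's approach would instead let the simplified Hopfield-Tank dynamics descend this energy. The penalty weight A and all names are illustrative.

```python
import numpy as np
from itertools import product

def energy(x, cost, k, A=10.0):
    """x in {0,1}^n; the penalty term vanishes exactly on feasible states
    (those selecting k items), so minima of E are optimal solutions."""
    return cost @ x + A * (x.sum() - k) ** 2

cost = np.array([3.0, 1.0, 4.0, 1.0, 5.0])
best = min((np.array(x) for x in product([0, 1], repeat=5)),
           key=lambda x: energy(x, cost, k=2))
print(best)  # picks the two cheapest items: [0 1 0 1 0]
```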

Collaboration


Dive into Heinrich Braun's collaborations.

Top Co-Authors

Johannes Feulner
Karlsruhe Institute of Technology

Rainer Malaka
Karlsruhe Institute of Technology

Peter Zagorski
Karlsruhe Institute of Technology

D. Koll
Karlsruhe Institute of Technology

Joachim Weisbrod
Karlsruhe Institute of Technology