R. Erichsen
Universidade Federal do Rio Grande do Sul
Publications
Featured research published by R. Erichsen.
International Conference on Artificial Neural Networks | 2002
David Dominguez; Elka Korutcheva; W. K. Theumann; R. Erichsen
The macroscopic dynamics of an extremely diluted three-state neural network based on mutual information and mean-field theory arguments is studied in order to establish the stability of the stationary states. Results are presented in terms of the pattern-recognition overlap, the neural activity, and the activity-overlap. It is shown that the presence of synaptic noise is essential for the stability of states that recognize only the active patterns when the full structure of the patterns is not recognizable. Basins of attraction of considerable size are obtained in all cases, provided the storage ratio of patterns is not too large.
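As a rough illustration of the three order parameters named above, here is a minimal numpy sketch (not from the paper) that measures the overlap, the activity, and the activity-overlap of a noisy three-state configuration against a stored pattern; the activity a and the 90% noise level are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, a = 10_000, 0.5   # network size and pattern activity (illustrative values)

# Three-state pattern: +1/-1 each with probability a/2, 0 otherwise (an assumption)
xi = rng.choice([1, -1, 0], size=N, p=[a / 2, a / 2, 1 - a])
# A noisy network state: 90% of sites copy the pattern, the rest are random
sigma = np.where(rng.random(N) < 0.9, xi, rng.choice([1, -1, 0], size=N))

m = (sigma * xi).sum() / (a * N)             # pattern-recognition overlap
q = (sigma ** 2).sum() / N                   # neural activity
n = (sigma ** 2 * xi ** 2).sum() / (a * N)   # activity-overlap: agreement on which sites are active
print(f"m = {m:.3f}  q = {q:.3f}  n = {n:.3f}")
```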
Physica A: Statistical Mechanics and its Applications | 2002
M. S. Mainieri; R. Erichsen
In this paper we discuss the dynamical properties of extremely diluted, non-monotonic neural networks. Assuming parallel updating and the Hebb prescription for the synaptic connections, a flow equation for the macroscopic overlap is derived. A rich dynamical phase diagram is obtained, showing a stable retrieval phase as well as two-cycle and chaotic behavior. Numerical simulations were performed, showing good agreement with the analytical results. Furthermore, the simulations give additional insight into the microscopic dynamical behavior during the chaotic phase. It is shown that the freezing of individual neuron states is related to the structure of the chaotic attractors.
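A hedged sketch of the kind of flow equation involved: in the extremely diluted limit with parallel updates, the overlap obeys a one-dimensional map that can be estimated by averaging the neuron response over Gaussian crosstalk noise. The specific non-monotonic transfer function (sign reversed beyond a threshold b) and all parameter values below are assumptions for illustration, not the paper's.

```python
import numpy as np

def overlap_map(m, alpha, b, samples=200_000, rng=np.random.default_rng(1)):
    """One parallel-update step of the macroscopic overlap (a sketch).
    The local field is the signal m plus Gaussian crosstalk of variance alpha;
    the neuron responds sign(h), reversed wherever |h| exceeds b."""
    h = m + np.sqrt(alpha) * rng.normal(size=samples)
    s = np.sign(h)
    s[np.abs(h) > b] *= -1.0   # non-monotonic response
    return s.mean()

m, alpha, b = 0.8, 0.1, 1.0    # illustrative storage noise and threshold
for t in range(20):
    m = overlap_map(m, alpha, b)
    print(t, round(m, 4))
```

Depending on alpha and b, iterating this map settles on a fixed point, alternates between two values, or wanders, which is the macroscopic counterpart of the retrieval, two-cycle and chaotic phases described above.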
Physica A: Statistical Mechanics and its Applications | 2004
W. K. Theumann; R. Erichsen
The dynamics and the stationary states of an exactly solvable three-state layered feed-forward neural network model with asymmetric synaptic connections, finite dilution and low pattern activity are studied, extending a recent work on a recurrent network. Detailed phase diagrams are obtained for the stationary states and for the time evolution of the retrieval overlap with a single pattern. It is shown that, in spite of instabilities at low thresholds, there is a gradual improvement in network performance with increasing threshold, up to an optimal stage. The robustness to synaptic noise is checked, and the effects of dilution and of a variable threshold on the information content of the network are also established.
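A minimal sketch of the layer-to-layer overlap recursion for three-state threshold units, under the simplifying assumption that crosstalk acts as Gaussian noise on the local field; the threshold values and noise level are illustrative, not the paper's.

```python
import numpy as np

def layer_map(m, noise, theta, samples=200_000, rng=np.random.default_rng(2)):
    """Overlap carried from one layer to the next (a sketch). Three-state
    units fire sign(h) when the local field beats the threshold theta and
    stay silent (0) otherwise."""
    h = m + np.sqrt(noise) * rng.normal(size=samples)
    s = np.where(np.abs(h) > theta, np.sign(h), 0.0)
    return s.mean()

m, noise = 0.9, 0.05
for theta in (0.1, 0.4, 0.7):          # sweeping the firing threshold
    m_l = m
    for _ in range(10):                # ten layers deep
        m_l = layer_map(m_l, noise, theta)
    print(f"theta = {theta}: overlap after 10 layers = {m_l:.3f}")
```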
PHYSICS, COMPUTATION, AND THE MIND - ADVANCES AND CHALLENGES AT INTERFACES: Proceedings of the 12th Granada Seminar on Computational and Statistical Physics | 2013
Beatriz E. P. Mizusaki; Everton J. Agnes; Leonardo Gregory Brunnet; R. Erichsen
The synaptic plasticity rules that sculpt a neural network architecture are key elements for understanding cortical processing, as they may explain the emergence of stable, functional activity while avoiding runaway excitation. For an associative memory framework, they should be built so as to enable the network to reproduce a robust spatio-temporal trajectory in response to an external stimulus. Still, how these rules may be implemented in recurrent networks, and how they relate to the capacity for pattern recognition, remains unclear. We studied the effects of three phenomenological unsupervised rules in sparsely connected recurrent networks for associative memory: spike-timing-dependent plasticity, short-term plasticity and a homeostatic scaling. The system stability is monitored during the learning process of the network, as the mean firing rate converges to a value determined by the homeostatic scaling. Afterwards, it is possible to measure the recovery efficiency of the activity following eac...
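As a concrete, if generic, reference point for the first of the three rules, here is a minimal pair-based STDP sketch using exponential spike traces; the amplitudes, time constants, and Poisson-like spike trains are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

# Illustrative STDP constants (assumed values, not from the paper)
A_plus, A_minus = 0.01, 0.012     # potentiation / depression amplitudes
tau_pre, tau_post = 20.0, 20.0    # trace time constants (ms)
dt, T = 1.0, 200                  # time step (ms) and number of steps

rng = np.random.default_rng(3)
w = 0.5                           # one excitatory synapse, pre -> post
x_pre = x_post = 0.0              # exponential spike traces

for t in range(T):
    pre = rng.random() < 0.02     # Poisson-like spike trains (2% per ms)
    post = rng.random() < 0.02
    x_pre += -dt * x_pre / tau_pre + (1.0 if pre else 0.0)
    x_post += -dt * x_post / tau_post + (1.0 if post else 0.0)
    if post:                      # causal pairing: pre before post -> potentiate
        w += A_plus * x_pre
    if pre:                       # anti-causal pairing: post before pre -> depress
        w -= A_minus * x_post
    w = min(max(w, 0.0), 1.0)     # keep the weight bounded

print(f"final weight: {w:.3f}")
```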
International Conference on Artificial Neural Networks | 2012
Everton J. Agnes; R. Erichsen; Leonardo Gregory Brunnet
A synaptic architecture featuring both excitatory and inhibitory neurons is assembled, aiming to build an associative memory system. The connections follow a Hebbian-like rule. The network activity is analyzed using a dimensionality-reduction method, Principal Component Analysis (PCA), applied to the neuron firing rates. The patterns are discriminated and recognized by well-defined paths that emerge within PCA subspaces, one for each pattern. Detailed comparisons among these subspaces are used to evaluate the network storage capacity. We show a transition from a retrieval to a non-retrieval regime as the number of stored patterns increases. When gap junctions are implemented together with the chemical synapses, this transition is shifted and a larger number of memories is associated with the network.
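The PCA step itself can be sketched directly; the firing-rate matrix below is synthetic stand-in data, whereas in the paper it would come from the simulated network.

```python
import numpy as np

rng = np.random.default_rng(4)
T, N = 500, 100                        # time bins x neurons (synthetic sizes)
rates = rng.poisson(5.0, size=(T, N)).astype(float)   # stand-in firing rates

X = rates - rates.mean(axis=0)         # center each neuron's rate
C = X.T @ X / (T - 1)                  # covariance across neurons
eigval, eigvec = np.linalg.eigh(C)     # eigenvalues in ascending order
pcs = eigvec[:, ::-1][:, :3]           # top-3 principal directions
trajectory = X @ pcs                   # (T, 3) activity path in the PCA subspace

explained = eigval[::-1][:3] / eigval.sum()
print("variance explained by PC1-3:", np.round(explained, 3))
print("trajectory shape:", trajectory.shape)
```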
Journal of Statistical Mechanics: Theory and Experiment | 2015
Fabio Schittler Neves; Benno Martim Schubert; R. Erichsen
Layered neural networks are feedforward structures that yield robust parallel and distributed pattern recognition. Even though much attention has been paid to pattern retrieval properties in such systems, many aspects of their dynamics are not yet well characterized or understood. In this work we study, at different temperatures, the memory activity and information flows through layered networks in which the elements implement the simplest binary odd non-monotonic function. Our results show that, under a standard Hebbian learning approach, the network information content always has its maximum at the monotonic limit, even though the maximum memory capacity can be found at non-monotonic values for small enough temperatures. Furthermore, we show that such systems exhibit rich macroscopic dynamics, including not only fixed-point solutions of the iterative map, but also cyclic and chaotic attractors that likewise carry information.
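A hedged microscopic sketch of such a layered network: independent Hebbian patterns couple consecutive layers, and each unit applies the sign function reversed beyond a threshold b, an assumed form of "the simplest binary odd non-monotonic function"; the sizes and the threshold are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
N, P, L, b = 1000, 50, 10, 1.2   # units per layer, patterns, layers, threshold

xi = rng.choice([-1, 1], size=(L, P, N))   # independent patterns on each layer

def nonmono(h, b):
    """sign(h), reversed beyond |h| > b (assumed non-monotonic response)."""
    s = np.sign(h)
    s[np.abs(h) > b] *= -1.0
    return s

sigma = np.where(rng.random(N) < 0.9, xi[0, 0], -xi[0, 0])   # noisy pattern 0
for l in range(L - 1):
    J = xi[l + 1].T @ xi[l] / N           # Hebbian couplings, layer l -> l+1
    sigma = nonmono(J @ sigma, b)
    print(f"layer {l + 1}: overlap = {(sigma * xi[l + 1, 0]).mean():.3f}")
```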
PHYSICS, COMPUTATION, AND THE MIND - ADVANCES AND CHALLENGES AT INTERFACES: Proceedings of the 12th Granada Seminar on Computational and Statistical Physics | 2013
Leonardo Gregory Brunnet; Everton J. Agnes; Beatriz E. P. Mizusaki; R. Erichsen
Different areas of the brain are involved in specific aspects of the information being processed, both in learning and in memory formation. For example, the hippocampus is important in the consolidation of information from short-term to long-term memory, while emotional memory seems to be handled by the amygdala. On the microscopic scale the underlying structures in these areas differ in the kind of neurons involved, in their connectivity, or in their clustering degree, but at this level learning and memory are attributed to neuronal synapses mediated by long-term potentiation and long-term depression. In this work we explore the properties of a short-range synaptic connection network, a nearest-neighbor lattice composed mostly of excitatory neurons and a fraction of inhibitory ones. The mechanism of synaptic modification responsible for the emergence of memory is Spike-Timing-Dependent Plasticity (STDP), a Hebbian-like rule, where potentiation/depression is acquired when causal/non-causal spike pairs occur at a synapse connecting two neurons. The system is intended to store and recognize memories associated with spatial external inputs presented as simple geometrical forms. The synaptic modifications are continuously applied to excitatory connections, combining a homeostasis rule with STDP. In this work we explore the different scenarios under which a network with short-range connections can accomplish the task of storing and recognizing simple connected patterns.
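To make the homeostasis ingredient concrete, here is a minimal sketch of multiplicative synaptic scaling toward a target firing rate; the target rate, time constant, and the crude rate response are illustrative assumptions, not the paper's rule.

```python
import numpy as np

rng = np.random.default_rng(6)
N = 64                                  # e.g. an 8x8 nearest-neighbor patch
w = rng.uniform(0.2, 0.8, size=(N, N))  # incoming excitatory weights
rate = rng.uniform(1.0, 15.0, size=N)   # measured firing rates (Hz)
r_target, tau_h, dt = 5.0, 1000.0, 1.0  # target rate and slow time constant

for _ in range(5000):
    # each neuron scales all of its incoming weights up when it fires too
    # little and down when it fires too much
    scale = 1.0 + (dt / tau_h) * (r_target - rate) / r_target
    w *= scale[:, None]
    rate = rate * scale                 # crude stand-in for the rate response

print("mean rate after scaling:", rate.mean().round(3))
```

The slow time constant is the point of the design: homeostasis must act on a much longer timescale than STDP, otherwise the two rules fight over the same weights.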
COOPERATIVE BEHAVIOR IN NEURAL SYSTEMS: Ninth Granada Lectures | 2007
R. Erichsen; M. S. Mainieri; Leonardo Gregory Brunnet
The Hindmarsh-Rose (HR) model describes the essentials of the spiking activity of biological neurons. In this work we present an exploratory numerical study of the time activities of two HR neurons interacting through electrical synapses. The knowledge of this simple system is a first step towards understanding the cooperative behavior of large neural assemblies. Several periodic and chaotic attractors were identified as the coupling strength is increased from zero up to the perfect-synchronization regime. In addition to the known phase-locking synchronization at weak coupling, electrical synapses also allow for both in-phase and anti-phase synchronization from moderate to strong coupling. A regime where the system changes apparently randomly between in-phase and anti-phase locking evolves to a bistability regime, where both in-phase and anti-phase periodic attractors are locally stable. In the strong-coupling regime in-phase chaotic evolution dominates, but windows with complex periodic behavior are also present.
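The HR equations themselves are standard; what follows is a minimal sketch of two gap-junction-coupled HR neurons under simple Euler integration, using commonly quoted parameter values. The coupling strength g, step size and initial conditions are illustrative choices, not the paper's.

```python
import numpy as np

# Standard Hindmarsh-Rose parameters (chaotic bursting regime)
a, b, c, d = 1.0, 3.0, 1.0, 5.0
r, s, xR, I = 0.006, 4.0, -1.6, 3.0
g, dt, steps = 0.2, 0.01, 200_000     # illustrative coupling and integration setup

state = np.array([[0.1, 0.0, 0.0],    # neuron 1: (x, y, z)
                  [-0.3, 0.0, 0.0]])  # neuron 2, slightly different start

for _ in range(steps):
    x, y, z = state[:, 0], state[:, 1], state[:, 2]
    coupling = g * (x[::-1] - x)      # electrical synapse: voltage difference
    dx = y - a * x**3 + b * x**2 - z + I + coupling
    dy = c - d * x**2 - y
    dz = r * (s * (x - xR) - z)
    state += dt * np.column_stack([dx, dy, dz])

print("final membrane potentials:", state[:, 0].round(3))
# sweeping g from 0 upward probes the route toward full synchronization
```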
Physica A: Statistical Mechanics and its Applications | 2006
Désiré Bollé; R. Erichsen; Toni Verbeiren
The Q=3-state Ising neural network with synchronous updating and variable dilution is discussed starting from the appropriate Hamiltonians. The thermodynamic and retrieval properties are examined using replica mean-field theory. The appearance and properties of two-cycles are discussed. Capacity–temperature phase diagrams are derived for several values of the pattern activity and different gradations of dilution, and the information content is calculated. It is found that the asymptotic behaviour is rather similar to that for sequential updating. The retrieval region is enhanced marginally, but the spin-glass region is visibly enlarged. Only the presence of self-coupling can enlarge the retrieval region substantially. The dynamics of the network is studied for general Q and both synchronous and sequential updating using an extension of the generating function technique. The differences with the signal-to-noise approach are outlined. Typical flow diagrams for the Q=3 overlap order parameter are presented.
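A hedged sketch of synchronous updating in a Q = 3 network, showing how fixed points and two-cycles are told apart at zero temperature; the Hebbian couplings, threshold and sizes are illustrative choices standing in for the paper's replica calculation.

```python
import numpy as np

rng = np.random.default_rng(7)
N, P, a, theta = 500, 10, 0.8, 0.3    # illustrative sizes, activity, threshold
xi = rng.choice([1, -1, 0], size=(P, N), p=[a / 2, a / 2, 1 - a])
J = xi.T @ xi / N                     # Hebbian couplings, self-coupling kept

def update(sigma):
    """Zero-temperature parallel update: fire sign(h) when |h| beats theta."""
    h = J @ sigma
    return np.where(np.abs(h) > theta, np.sign(h), 0.0)

sigma, history = xi[0].astype(float), [xi[0].astype(float)]
for t in range(1, 50):
    sigma = update(sigma)
    if np.array_equal(sigma, history[-1]):
        print(f"fixed point reached at step {t}")
        break
    if len(history) > 1 and np.array_equal(sigma, history[-2]):
        print(f"two-cycle detected at step {t}")
        break
    history.append(sigma)
```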
Physica A: Statistical Mechanics and its Applications | 2004
Désiré Bollé; R. Erichsen; W. K. Theumann
The time evolution of an exactly solvable layered feedforward neural network with three-state neurons that optimizes the mutual information is studied for arbitrary synaptic noise (temperature). Detailed stationary temperature-capacity and capacity-activity phase diagrams are obtained. The model exhibits pattern retrieval, pattern-fluctuation retrieval and spin-glass phases. It is found that there is an improved performance, in the form of both a larger critical capacity and a larger information content, compared with three-state Ising-type layered network models. Flow diagrams reveal that saddle-point solutions associated with fluctuation overlaps considerably slow down the flow of the network states towards the stable fixed points.
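The information-theoretic yardstick can be made concrete with a small sketch: estimate the mutual information per neuron between a stored three-state pattern and a noisy network state from their empirical joint distribution. The activity and noise level below are illustrative; the paper's analytic treatment expresses this quantity through order parameters instead.

```python
import numpy as np

rng = np.random.default_rng(8)
N, a = 100_000, 0.6                   # sites and pattern activity (illustrative)
xi = rng.choice([1, -1, 0], size=N, p=[a / 2, a / 2, 1 - a])
sigma = np.where(rng.random(N) < 0.85, xi, rng.choice([1, -1, 0], size=N))

states = (-1, 0, 1)
joint = np.array([[np.mean((xi == u) & (sigma == v)) for v in states]
                  for u in states])   # empirical joint distribution p(xi, sigma)
px = joint.sum(axis=1, keepdims=True)
ps = joint.sum(axis=0, keepdims=True)
nz = joint > 0                        # avoid log(0) on empty cells
info = (joint[nz] * np.log2(joint[nz] / (px @ ps)[nz])).sum()
print(f"I(sigma; xi) ~ {info:.3f} bits per neuron")
```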