Publications

Featured research published by Anna Levina.


Nature Physics | 2011

Impact of Single Links in Competitive Percolation -- How complex networks grow under competition

Jan Nagler; Anna Levina; Marc Timme

1 Bernstein Center for Computational Neuroscience, 37073 Göttingen, Germany
2 Department of Solar Energy, Institute for Solid State Physics, ISFH / University of Hannover, 30167 Hannover, Germany
3 Network Dynamics Group, Max Planck Institute for Dynamics & Self-Organization, 37073 Göttingen, Germany
4 Faculty of Physics, University of Göttingen, 37077 Göttingen, Germany

The nature of the percolation transition—how links add to a system until it is extensively connected—crucially underlies the structure and function of virtually all growing complex networks. Percolation transitions have long been thought to be continuous, but recent numerical work suggests that certain percolating systems exhibit discontinuous phase transitions. This study explains the key microscopic mechanisms underlying such ‘explosive percolation’.


PLOS Computational Biology | 2015

Self-organization in Balanced State Networks by STDP and Homeostatic Plasticity

Felix Effenberger; Jürgen Jost; Anna Levina

Structural inhomogeneities in synaptic efficacies have a strong impact on population response dynamics of cortical networks and are believed to play an important role in their functioning. However, little is known about how such inhomogeneities could evolve by means of synaptic plasticity. Here we present an adaptive model of a balanced neuronal network that combines two different types of plasticity, STDP and synaptic scaling. The plasticity rules yield both long-tailed distributions of synaptic weights and firing rates. Simultaneously, a highly connected subnetwork of driver neurons with strong synapses emerges. Coincident spiking activity of several driver cells can evoke population bursts and driver cells have similar dynamical properties as leader neurons found experimentally. Our model allows us to observe the delicate interplay between structural and dynamical properties of the emergent inhomogeneities. It is simple, robust to parameter changes and able to explain a multitude of different experimental findings in one basic network.
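The combination of the two plasticity rules described above can be illustrated with a minimal sketch. This is not the paper's actual model: the network size, the pair-based additive STDP rule, and all parameter values are illustrative assumptions. It shows the generic mechanics only: STDP modifies individual weights based on relative spike timing, and synaptic scaling then multiplicatively renormalizes each neuron's total input, the homeostatic step.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100                                  # neurons (illustrative size)
W = rng.exponential(0.1, (N, N))         # weights W[post, pre]
np.fill_diagonal(W, 0.0)                 # no self-connections
target_in = W.sum(axis=1).copy()         # homeostatic target: initial total input

A_plus, A_minus, tau = 0.01, 0.012, 20.0  # STDP parameters (assumed), tau in ms

def stdp_update(W, t_pre, t_post):
    """Additive pair-based STDP for one pre/post spike-time pair per neuron:
    potentiation when post fires after pre, depression otherwise."""
    dt = t_post[:, None] - t_pre[None, :]            # post minus pre, ms
    dW = np.where(dt > 0,
                  A_plus * np.exp(-dt / tau),        # causal pair: potentiate
                  -A_minus * np.exp(dt / tau))       # acausal pair: depress
    return np.clip(W + dW, 0.0, None)                # keep weights non-negative

def synaptic_scaling(W, target):
    """Multiplicatively rescale each neuron's incoming weights so that
    its total input returns to the homeostatic target."""
    totals = W.sum(axis=1)
    totals[totals == 0] = 1.0                        # avoid division by zero
    return W * (target / totals)[:, None]

# One plasticity step: random spike times, STDP, then scaling.
t_pre = rng.uniform(0, 100, N)
t_post = rng.uniform(0, 100, N)
W = stdp_update(W, t_pre, t_post)
W = synaptic_scaling(W, target_in)
```

Note the division of labor: STDP alone would let individual weights grow or shrink without bound, while the scaling step preserves each neuron's total input, so only the relative distribution of weights, and hence the network structure, changes over time.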


Frontiers in Computational Neuroscience | 2013

Critical dynamics in associative memory networks

Maximilian Uhlig; Anna Levina; Theo Geisel; J. Michael Herrmann

Critical behavior in neural networks is characterized by scale-free avalanche size distributions and can be explained by self-regulatory mechanisms. Theoretical and experimental evidence indicates that information storage capacity reaches its maximum in the critical regime. We study the effect of structural connectivity formed by Hebbian learning on the criticality of network dynamics. A network endowed with Hebbian learning alone does not allow for simultaneous information storage and criticality. However, the critical regime can be stabilized by short-term synaptic dynamics in the form of synaptic depression and facilitation or, alternatively, by homeostatic adaptation of the synaptic weights. We show that a heterogeneous distribution of maximal synaptic strengths does not preclude criticality if the Hebbian learning is alternated with periods of critical dynamics recovery. We discuss the relevance of these findings for the flexibility of memory in aging and with respect to the recent theory of synaptic plasticity.
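The scale-free avalanche size distributions mentioned above can be reproduced with a textbook toy model rather than the paper's network: a Galton-Watson branching process, where each active unit triggers a Poisson-distributed number of units in the next step. At the critical branching ratio sigma = 1 the avalanche sizes follow a power law (P(s) ~ s^{-3/2}), while any subcritical sigma produces an exponential cutoff. The branching ratio, sample counts, and size cap below are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def avalanche_size(sigma, max_size=10_000):
    """Size of one avalanche in a branching process with branching ratio sigma.
    Each active unit activates Poisson(sigma) units in the next generation;
    the avalanche ends when no units remain active (capped at max_size)."""
    active, size = 1, 1
    while active > 0 and size < max_size:
        active = rng.poisson(sigma * active)
        size += active
    return size

# Critical (sigma = 1) avalanches are heavy-tailed; subcritical ones stay small.
sizes_critical = [avalanche_size(1.0) for _ in range(2000)]
sizes_subcritical = [avalanche_size(0.5) for _ in range(2000)]
```

Comparing the two samples (e.g. with a log-log histogram) makes the qualitative difference visible: the subcritical sizes concentrate near their mean of 1/(1 - sigma) = 2, whereas the critical sample contains rare avalanches spanning orders of magnitude, the signature that self-regulatory mechanisms in the paper are tuned to preserve.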


Stochastics and Dynamics | 2014

The Abelian distribution

Anna Levina; J. Michael Herrmann

We define the Abelian distribution and study its basic properties. Abelian distributions arise in the context of neural modeling and describe the size of neural avalanches in fully-connected integrate-and-fire models of self-organized criticality in neural systems.


BMC Neuroscience | 2013

Self-organized criticality in structured neural networks

Maximilian Uhlig; Anna Levina; Theo Geisel; Michael Herrmann

Critical dynamics in neural networks is an experimentally and conceptually established phenomenon that has been shown to be important for information processing in the brain. Critical neural networks have been shown to have optimal computational capabilities, information transmission and capacity [1,2]. At the same time, the theoretical understanding of neural avalanches has developed from sandpile-like systems and homogeneous networks towards structured networks. The network connectivity, however, has usually been chosen so as to support or even enable criticality. There are, nevertheless, many influences that shape the connectivity structure and weighting. Most prominently, these include Hebbian learning and homeostatic effects, but also pathological changes. We study how such structural changes affect the presence of criticality in the networks. While homeostatic plasticity may well have a regulatory effect that supports criticality, the same cannot be said about Hebbian learning, which essentially imprints structure from internally or externally caused activation patterns into the synaptic weighting of the network, thus increasing the probability that previous patterns reoccur. Unless the patterns are carefully chosen to produce critical behavior, these effects tend to counteract criticality, e.g. by introducing a particular scale that corrupts the power-law distributions characteristic of critical behavior. Little is known, in particular, about the influence of criticality on associative memory neural networks. We find that the critical regime can be stabilized by short-term synaptic dynamics in the form of synaptic depression and facilitation, which were already shown to play an important role in the self-organization of critical neural dynamics [3], or, alternatively, by homeostatic adaptation of the synaptic weights.
We show that a heterogeneous distribution of maximal synaptic strengths does not preclude criticality if the Hebbian learning is alternated with periods of critical dynamics recovery.

Figure 1 A: Retrieval performance for networks with dynamical synapses and subject to Hebbian learning for different load parameters α. Shown is the average overlap between stored patterns and the corresponding retrieved patterns. Dashed lines indicate ...


BMC Neuroscience | 2011

Neural dynamics and network topology interact to form critical avalanches.

Anna Levina; J. Michael Herrmann; Theo Geisel

Self-organized criticality (SOC) is one of the key concepts for describing the emergence of complexity in nature. In neural systems, the critical state is believed to optimize memory capacity, sensitivity to stimuli and information transmission. Critical avalanches were found in cortical cultures and slices [1] and in the motor cortex of awake monkeys [2]. Computational models of SOC often include an explicit regulatory mechanism that guides the state of the network toward criticality. We have shown previously [3,4] that synaptic facilitation and depression are sufficient to explain the self-organization of critical behavior in a network of non-leaky neurons. This model led to the prediction of an activity-dependent switching mechanism for up and down state dynamics in prefrontal cortex [4]. Models of neural avalanches may include mechanisms at the neural, synaptic, or network level. In the present contribution we propose a generalized model that combines short-term synaptic dynamics with homeostatic effects controlled at the neural level, as well as long-term plasticity that changes the network structure. We show that the interaction of these effects is indeed constructive in the sense that the critical state of the network is stably maintained. We also studied how criticality influences learning in neural networks and, vice versa, how the network can maintain criticality in the face of a changing topology. While, e.g., strong random dilution of the connectivity may initially induce subcritical behavior, criticality is quickly restored by a local learning rule that, in this case, essentially affects only the synaptic rescaling. The learning rule is homeostatic: it aims to stabilize the postsynaptic response to the spiking activity of the neuron. More complex network structures are either fully compatible with SOC (such as the small-world topologies found in critical network reconstruction [5]) or require a substantial reorganization of the network.
The latter is observed, e.g., in networks with nearest-neighbor connections, which do not attain criticality for any synaptic strength but are transformed into a critical network by the growth of a small number of additional connections. The adaptation rules that bring about SOC are also compatible with other types of learning, e.g. for the formation of memories. Here again, homeostatic learning compensates for the structural effects of learning in the system. This also allows us to study how STDP shapes the critical network by self-organization. We have thus provided a framework that represents a mechanism of SOC in a general sense and that can be used for testing the impact of various neurotransmitters, for the integration of sensorimotor loops and for the consolidation of memories. The essential interaction in multi-level learning extends past results that were based on the functional independence of neural and network dynamics.


BMC Neuroscience | 2009

Are age-related cognitive effects caused by optimization?

Hecke Schrobsdorff; Matthias Ihrke; Jörg Behrendt; J. Michael Herrmann; Theo Geisel; Anna Levina

Introduction

Cognitive aging seems to be a process of global degradation. Performance in psychological tests of fluid intelligence, such as Raven's Advanced Progressive Matrices, tends to decrease with age [1]. These results are strongly contrasted by performance improvements in everyday situations [2]. We therefore hypothesize that the observed aging deficits are partly caused by the optimization of cognitive functions due to learning.


BMC Neuroscience | 2013

On the influence of inhibitory STDP on balanced state random networks

Felix Effenberger; Anna Levina; Jürgen Jost

The distribution of synaptic efficacies in neural networks has a fundamental influence on their dynamics, and the modification of synaptic strengths forms the foundation of learning and memory. A prominent plasticity rule that has been observed in vitro is spike-timing-dependent plasticity (STDP). While first studied in glutamatergic synapses, STDP of GABAergic synapses has recently come into the focus of experimental and theoretical research [1]. We study random balanced state networks of leaky integrate-and-fire neurons in the asynchronous irregular (AI) regime [2], which is believed to be a good theoretical fit to the activity of cortical networks in vivo. We consider driven networks that receive Poisson input as well as networks in a self-sustained state of activity. To assess the influence of excitatory and inhibitory STDP on the network dynamics, we introduce these two plasticity rules independently, observing network dynamics and weight distributions after a transient phase. Note that both additive and multiplicative STDP rules yield the same network dynamics as described below. When excitatory STDP is introduced alone, parameters involving the maximal weight have to be fine-tuned to keep the network activity stably in the AI regime [3]. For almost all parameter values the network activity becomes unstable, leaving the AI regime and settling in a pathological, highly synchronized state with saturated firing rates of most cells, see Figure 1A. We also observed that even without STDP, a few strong excitatory connections can substantially destabilize network dynamics, yielding pathological states. Interestingly, this destabilization does not happen when, in addition to excitatory STDP, we also introduce STDP for inhibitory synapses projecting onto excitatory cells. The latter setup results in a network that stably rests in the AI regime, see Figure 1A.
Both STDP rules yield near-Gaussian distributions of synaptic weights, see Figure 1B. Inhibitory STDP even manages to stabilize a network that was brought to a pathological state by excitatory STDP, see Figure 1A. This clearly shows that inhibitory STDP has a stabilizing effect on network dynamics, and we expect that especially in combination with synaptic scaling and in the context of clustered networks [4] further non-trivial dynamical effects will become visible.

Figure 1 A. Raster plot of 30 randomly sampled cells showing network activity. Red line: activation of excitatory STDP, green line: activation of inhibitory STDP. B. Weight distributions of plastic synapses converging onto 100 randomly sampled excitatory neurons. ...


Nature Physics | 2007

Dynamical synapses causing self-organized criticality in neural networks

Anna Levina; J.M. Herrmann; Theo Geisel


Physical Review Letters | 2009

Phase Transitions towards Criticality in a Neural System with Adaptive Interactions

Anna Levina; J. Michael Herrmann; Theo Geisel
