Publication


Featured research published by Anders Lansner.


Journal of Computational Neuroscience | 2007

Simulation of networks of spiking neurons: A review of tools and strategies

Romain Brette; Michelle Rudolph; Ted Carnevale; Michael L. Hines; David Beeman; James M. Bower; Markus Diesmann; Abigail Morrison; Philip H. Goodman; Frederick C. Harris; Milind Zirpe; Thomas Natschläger; Dejan Pecevski; Bard Ermentrout; Mikael Djurfeldt; Anders Lansner; Olivier Rochel; Thierry Viéville; Eilif Muller; Andrew P. Davison; Sami El Boustani; Alain Destexhe

We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented. We next review the precision of those simulation strategies, in particular in cases where plasticity depends on the exact timing of the spikes. We overview different simulators and simulation environments presently available (restricted to those freely available, open source and documented). For each simulation tool, its advantages and pitfalls are reviewed, with an aim to allow the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin–Huxley type and integrate-and-fire models, interacting with current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models is implemented on the different simulators, and the codes are made available. The ultimate goal of this review is to provide a resource to facilitate identifying the appropriate integration strategy and simulation tool to use for a given modeling problem related to spiking neural networks.
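The clock-driven integration strategy discussed in the review can be illustrated with a minimal leaky integrate-and-fire simulation. This is a sketch of the general technique, not code from the paper's benchmarks; all parameter values are illustrative:

```python
import numpy as np

def simulate_lif(i_ext, dt=0.1, tau_m=20.0, v_rest=-70.0,
                 v_thresh=-54.0, v_reset=-60.0, r_m=10.0):
    """Clock-driven leaky integrate-and-fire neuron.

    i_ext : array of input currents (nA), one value per time step of dt ms.
    Returns the membrane potential trace and the spike-time indices.
    """
    v = v_rest
    trace, spikes = [], []
    for step, i in enumerate(i_ext):
        # Forward-Euler update of the membrane equation:
        #   tau_m * dV/dt = -(V - v_rest) + R_m * I
        v += dt / tau_m * (-(v - v_rest) + r_m * i)
        if v >= v_thresh:          # threshold crossing -> emit a spike
            spikes.append(step)
            v = v_reset            # instantaneous reset
        trace.append(v)
    return np.array(trace), spikes

# Constant 2 nA input for 500 ms produces regular spiking
trace, spikes = simulate_lif(np.full(5000, 2.0))
```

In a clock-driven scheme like this, the state of every neuron is updated at each tick and threshold crossings are detected on the grid, which is what makes spike-timing precision depend on the step size dt.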


European Journal of Clinical Pharmacology | 1998

A Bayesian neural network method for adverse drug reaction signal generation

Andrew Bate; Marie Lindquist; Ivor Ralph Edwards; Sten Olsson; Roland Orre; Anders Lansner; R.M. De Freitas

Objective: The database of adverse drug reactions (ADRs) held by the Uppsala Monitoring Centre on behalf of the 47 countries of the World Health Organization (WHO) Collaborating Programme for International Drug Monitoring contains nearly two million reports. It is the largest database of this sort in the world, and about 35 000 new reports are added quarterly. The task of trying to find new drug–ADR signals has been carried out by an expert panel, but with such a large volume of material the task is daunting. We have developed a flexible, automated procedure to find new signals with known probability difference from the background data.
Method: Data mining, using various computational approaches, has been applied in a variety of disciplines. A Bayesian confidence propagation neural network (BCPNN) has been developed which can manage large data sets, is robust in handling incomplete data, and may be used with complex variables. Using information theory, such a tool is ideal for finding drug–ADR combinations with other variables that are highly associated compared to the generality of the stored data, or a section of the stored data. The method is transparent for easy checking and flexible for different kinds of search.
Results: Using the BCPNN, some time scan examples are given which show the power of the technique to find signals early (captopril–coughing) and to avoid false positives where a common drug and ADRs occur in the database (digoxin–acne; digoxin–rash). A routine application of the BCPNN to a quarterly update is also tested, showing that 1004 suspected drug–ADR combinations reached the 97.5% confidence level of difference from the generality. Of these, 307 were potentially serious ADRs, and of these 53 related to new drugs. Twelve of the latter were not recorded in the CD editions of the Physicians' Desk Reference or Martindale's Extra Pharmacopoeia and did not appear in Reactions Weekly online.
Conclusion: The results indicate that the BCPNN can be used in the detection of significant signals from the data set of the WHO Programme on International Drug Monitoring. The BCPNN will be an extremely useful adjunct to the expert assessment of very large numbers of spontaneously reported ADRs.
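The signal statistic at the heart of this approach is an information component comparing the observed frequency of a drug–ADR pair against the frequency expected if drug and reaction were reported independently. The sketch below shows only this core ratio; the published BCPNN additionally places Bayesian priors on the counts and derives the confidence intervals used for the 97.5% criterion. The counts are illustrative, not taken from the WHO database:

```python
import math

def information_component(c_xy, c_x, c_y, c_total):
    """Simplified information component for a drug-ADR pair.

    c_xy    : reports mentioning both the drug and the reaction
    c_x     : reports mentioning the drug
    c_y     : reports mentioning the reaction
    c_total : all reports in the database

    IC = log2( P(x, y) / (P(x) * P(y)) ); a positive value means the
    pair is reported more often than expected under independence.
    """
    p_xy = c_xy / c_total
    p_x = c_x / c_total
    p_y = c_y / c_total
    return math.log2(p_xy / (p_x * p_y))

# Illustrative counts: the pair occurs 20x more often than expected,
# so the IC is log2(20), roughly 4.3
ic = information_component(c_xy=100, c_x=1000, c_y=5000, c_total=1_000_000)
```

An IC near zero (pair frequency matching the independence expectation) is what lets the method avoid false positives such as the digoxin–acne example above.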


Trends in Neurosciences | 1995

Neural networks that co-ordinate locomotion and body orientation in lamprey

Sten Grillner; T. Deliagina; A. El Manira; Russell H. Hill; G. N. Orlovsky; Peter Wallén; Örjan Ekeberg; Anders Lansner

The networks of the brainstem and spinal cord that co-ordinate locomotion and body orientation in lamprey are described. The cycle-to-cycle pattern generation of these networks is produced by interacting glutamatergic and glycinergic neurones, with NMDA receptor-channels playing an important role at lower rates of locomotion. The fine tuning of the networks produced by 5-HT, dopamine and GABA systems involves a modulation of Ca2+-dependent K+ channels, high- and low-threshold voltage-activated Ca2+ channels and presynaptic inhibitory mechanisms. Mathematical modelling has been used to explore the capacity of these biological networks. The vestibular control of the body orientation during swimming is exerted via reticulospinal neurones located in different reticular nuclei. These neurones become activated maximally at different angles of tilt.


Brain Research Reviews | 1998

Intrinsic function of a neuronal network — a vertebrate central pattern generator

Sten Grillner; Örjan Ekeberg; Abdeljabbar El Manira; Anders Lansner; David Parker; Jesper Tegnér; Peter Wallén

The cellular bases of vertebrate locomotor behaviour are reviewed using the lamprey as a model system. Forebrain and brainstem cell populations initiate locomotor activity via reticulospinal fibers activating a spinal network composed of glutamatergic and glycinergic interneurons. The focus is on the role of different subtypes of Ca2+ channels, Ca2+-dependent K+ channels and voltage-dependent NMDA channels at the neuronal and network level, as well as on the effects of different metabotropic, aminergic and peptidergic modulators that target these ion channels. This is one of the few vertebrate networks that are understood at a cellular level.


Biological Cybernetics | 1991

A computer based model for realistic simulations of neural networks

Örjan Ekeberg; Peter Wallén; Anders Lansner; Hans Tråvén; Lennart Brodin; Sten Grillner

The use of computer simulations as a neurophysiological tool creates new possibilities to understand complex systems and to test whether a given model can explain experimental findings. Simulations, however, require a detailed specification of the model, including the nerve cell action potential and synaptic transmission. We describe a neuron model of intermediate complexity, with a small number of compartments representing the soma and the dendritic tree, and equipped with Na+, K+, Ca2+, and Ca2+-dependent K+ channels. Conductance changes in the different compartments are used to model conventional excitatory and inhibitory synaptic interactions. Voltage-dependent NMDA-receptor channels are also included, and influence both the electrical conductance and the inflow of Ca2+ ions. This neuron model has been designed for the analysis of neural networks and specifically for the simulation of the network generating locomotion in a simple vertebrate, the lamprey. By assigning experimentally established properties to the simulated cells and their synapses, it has been possible to verify that these properties are sufficient to account for a number of experimental findings on the network in operation. The model is, however, sufficiently general to be useful for realistic simulation of other neural systems as well.
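The passive skeleton of such a compartmental model, two compartments (soma plus lumped dendrite) coupled by an axial conductance, can be sketched as below. The full model layers Na+, K+, Ca2+ and Ca2+-dependent K+ channels on top of this; all values here are illustrative, not the published parameters:

```python
def step_two_compartments(v_soma, v_dend, i_dend, dt=0.05,
                          c_m=1.0, g_leak=0.1, e_leak=-70.0, g_axial=0.3):
    """One forward-Euler step of a passive two-compartment neuron.

    Each compartment obeys
        C_m * dV/dt = g_leak * (E_leak - V) + g_axial * (V_other - V)
    with an extra injected current term in the dendritic compartment.
    """
    dv_s = (g_leak * (e_leak - v_soma) + g_axial * (v_dend - v_soma)) / c_m
    dv_d = (g_leak * (e_leak - v_dend) + g_axial * (v_soma - v_dend)
            + i_dend) / c_m
    return v_soma + dt * dv_s, v_dend + dt * dv_d

# Inject current into the dendrite and watch it depolarize the soma
v_s, v_d = -70.0, -70.0
for _ in range(2000):
    v_s, v_d = step_two_compartments(v_s, v_d, i_dend=1.0)
```

Because the compartments are coupled only through g_axial, current injected in the dendrite depolarizes the soma with an attenuated steady-state amplitude, which is the basic behavior the multi-compartment structure is meant to capture.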


Biological Cybernetics | 1992

Computer simulation of the segmental neural network generating locomotion in lamprey by using populations of network interneurons

Jeanette Hellgren; Sten Grillner; Anders Lansner

Realistic computer simulations of the experimentally established local spinal cord neural network generating swimming in the lamprey have been performed. Populations of network interneurons were used in which cellular properties, such as cell size and membrane conductance, including voltage-dependent ion channels, were randomly distributed around experimentally obtained mean values, as were synaptic conductances (kainate/AMPA, NMDA, glycine) and delays. This population model displayed more robust burst activity over a wider frequency range than the simpler subsample model used previously, and the pattern of interneuronal activity was appropriate. The strength of the reciprocal inhibition played a very important role in the regulation of burst frequency, and simply by changing the inhibitory bias the entire physiological range could be covered. At the lower end of the bursting frequency range the segmental excitatory interneurons provide stability, as does the activation of voltage-dependent NMDA receptors. Spike frequency adaptation, by means of summation of the afterhyperpolarization (AHP), serves as a major burst-terminating factor, as do, at lower rates, the membrane properties conferred by NMDA receptor activation. The lateral interneurons were not of critical importance for burst termination. They may, however, be of particular importance for inducing a rapid burst termination during, for instance, steering and righting reactions. Several cellular factors combine to provide a secure and stable motor pattern over the entire frequency range.


Neural Networks | 2007

Towards cortex sized artificial neural systems

Christopher Johansson; Anders Lansner

We propose, implement, and discuss an abstract model of the mammalian neocortex. This model is instantiated with a sparse recurrently connected neural network that has spiking leaky integrator units and continuous Hebbian learning. First we study the structure, modularization, and size of neocortex, and then we describe a generic computational model of the cortical circuitry. A characterizing feature of the model is that it is based on the modularization of neocortex into hypercolumns and minicolumns. Both floating- and fixed-point arithmetic implementations of the model are presented along with simulation results. We conclude that an implementation on a cluster computer is bounded not by communication but by computation. Mouse and rat cortex sized versions of our model execute in 44% and 23% of real time, respectively. Further, an instance of the model with 1.6 × 10^6 units and 2 × 10^11 connections performed noise reduction and pattern completion. These implementations represent the current frontier of large-scale abstract neural network simulations in terms of network size and running speed.
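The hypercolumn/minicolumn modularization mentioned above can be illustrated as local competitive normalization: the minicolumns within a hypercolumn compete so that their activities sum to one, a soft winner-take-all. This sketch is one common reading of that modular structure, not code from the paper:

```python
import numpy as np

def hypercolumn_activity(support, gain=1.0):
    """Soft winner-take-all among the minicolumns of one hypercolumn.

    support : input activations of the minicolumns in the hypercolumn
    Returns activities normalized to sum to one (a softmax), so the
    hypercolumn behaves like a discrete attribute whose values
    (minicolumns) are mutually exclusive.
    """
    e = np.exp(gain * (support - support.max()))  # shift for numerical stability
    return e / e.sum()

# The minicolumn with the strongest support dominates the hypercolumn
act = hypercolumn_activity(np.array([0.1, 2.0, 0.3]))
```

Raising the gain sharpens the competition toward a hard winner-take-all, which is one way such a modular network can perform the noise reduction and pattern completion reported above.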


Frontiers in Neuroinformatics | 2008

Large-Scale Modeling – a Tool for Conquering the Complexity of the Brain

Mikael Djurfeldt; Örjan Ekeberg; Anders Lansner

Is there any hope of achieving a thorough understanding of higher functions such as perception, memory, thought and emotion, or is the stunning complexity of the brain a barrier that will limit such efforts for the foreseeable future? In this perspective we discuss methods to handle complexity and approaches to model building, and point to detailed large-scale models as a new contribution to the toolbox of the computational neuroscientist. We elucidate some aspects that distinguish large-scale models and some of the technological challenges they entail.


Neuron | 2015

Neurocognitive Architecture of Working Memory

Johan Eriksson; Edward K. Vogel; Anders Lansner; Fredrik Bergström; Lars Nyberg

A crucial role for working memory in temporary information processing and guidance of complex behavior has been recognized for many decades. There is emerging consensus that working-memory maintenance results from the interactions among long-term memory representations and basic processes, including attention, that are instantiated as reentrant loops between frontal and posterior cortical areas, as well as sub-cortical structures. The nature of such interactions can account for capacity limitations, lifespan changes, and restricted transfer after working-memory training. Recent data and models indicate that working memory may also be based on synaptic plasticity and that working memory can operate on non-consciously perceived information.


Ibm Journal of Research and Development | 2008

Brain-scale simulation of the neocortex on the IBM Blue Gene/L supercomputer

Mikael Djurfeldt; Mikael Lundqvist; Christopher Johansson; Martin Rehn; Örjan Ekeberg; Anders Lansner

Biologically detailed large-scale models of the brain can now be simulated thanks to increasingly powerful massively parallel supercomputers. We present an overview, for the general technical reader, of a neuronal network model of layers II/III of the neocortex built with biophysical model neurons. These simulations, carried out on an IBM Blue Gene/L™ supercomputer, comprise up to 22 million neurons and 11 billion synapses, which makes them the largest simulations of this type ever performed. Such model sizes correspond to the cortex of a small mammal. The SPLIT library, used for these simulations, runs on single-processor as well as massively parallel machines. Performance measurements show good scaling behavior on the Blue Gene/L supercomputer up to 8,192 processors. Several key phenomena seen in the living brain appear as emergent phenomena in the simulations. We discuss the role of this kind of model in neuroscience and note that full-scale models may be necessary to preserve natural dynamics. We also discuss the need for software tools for the specification of models as well as for analysis and visualization of output data. Combining models that range from abstract connectionist type to biophysically detailed will help us unravel the basic principles underlying neocortical function.

Collaboration

Anders Lansner's top co-authors:

Örjan Ekeberg (Royal Institute of Technology)
Christopher Johansson (Royal Institute of Technology)
Pawel Herman (Royal Institute of Technology)
Mikael Lundqvist (Royal Institute of Technology)
Simon Benjaminsson (Royal Institute of Technology)
Erik Fransén (Royal Institute of Technology)