Publication


Featured research published by David Willshaw.


Proceedings of the Royal Society of London. Series B, Biological sciences | 1976

How patterned neural connections can be set up by self-organization

David Willshaw; C. von der Malsburg

An important problem in biology is to explain how patterned neural connections are set up during ontogenesis. Topographically ordered mappings, found widely in nervous systems, are those in which neighbouring elements in one sheet of cells project to neighbouring elements in a second sheet. Exploiting this neighbourhood property leads to a new theory for the establishment of topographical mappings, in which the distance between two cells is expressed in terms of their similarity with respect to certain physical properties assigned to them. This topographical code can be realized in a model employing either synchronization of nervous activity or exchange of specific molecules between neighbouring cells. By means of modifiable synapses the code is used to set up a topographical mapping between two sheets with the same internal structure. We have investigated the neural activity version. Without needing to make any elaborate assumptions about its structure or about the operations its elements are to carry out we have shown that the mappings are set up in a system-to-system rather than a cell-to-cell fashion. The pattern of connections develops in a step-by-step and orderly fashion, the orientation of the mappings being laid down in the earliest stages of development.


Neural Computation | 1990

Optimal plasticity from matrix memories: What goes up must come down

David Willshaw; Peter Dayan

A recent article (Stanton and Sejnowski 1989) on long-term synaptic depression in the hippocampus has reopened the issue of the computational efficiency of particular synaptic learning rules (Hebb 1949; Palm 1988a; Morris and Willshaw 1989): homosynaptic versus heterosynaptic and monotonic versus nonmonotonic changes in synaptic efficacy. We have addressed these questions by calculating and maximizing the signal-to-noise ratio, a measure of the potential fidelity of recall, in a class of associative matrix memories. Up to a multiplicative constant, there are three optimal rules, each providing for synaptic depression such that positive and negative changes in synaptic efficacy balance out. For one rule, which is found to be the Stent-Singer rule (Stent 1973; Rauschecker and Singer 1979), the depression is purely heterosynaptic; for another (Stanton and Sejnowski 1989), the depression is purely homosynaptic; for the third, which is a generalization of the first two, and has a higher signal-to-noise ratio, it is both heterosynaptic and homosynaptic. The third rule takes the form of a covariance rule (Sejnowski 1977a,b) and includes, as a special case, the prescription due to Hopfield (1982) and others (Willshaw 1971; Kohonen 1972).
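The balance property described here, depression offsetting potentiation, can be illustrated with a toy simulation. The sketch below applies a generic covariance rule to random pattern pairs; the sizes, firing probabilities and zero recall threshold are illustrative assumptions, not the paper's own analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy covariance-rule matrix memory (an illustrative sketch, not the
# paper's own analysis). Sizes and firing probabilities are arbitrary.
N, M, P = 100, 100, 10      # input units, output units, stored pairs
p = q = 0.5                 # firing probabilities of input and output units

X = (rng.random((P, N)) < p).astype(float)   # random input patterns
Y = (rng.random((P, M)) < q).astype(float)   # random output patterns

# Covariance rule: dW_ji = (y_j - q)(x_i - p). Positive and negative
# changes in synaptic efficacy balance out on average.
W = np.zeros((M, N))
for x, y in zip(X, Y):
    W += np.outer(y - q, x - p)

# Recall a stored pair by thresholding the dendritic sums at zero.
recalled = (W @ (X[0] - p)) > 0
accuracy = (recalled == Y[0].astype(bool)).mean()
```

At this low loading the thresholded recall recovers the stored output almost exactly; the paper's signal-to-noise analysis quantifies why balanced rules behave this way.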


Proceedings of the Royal Society of London. Series B, Biological sciences | 1975

On a role for competition in the formation of patterned neural connexions.

M. C. Prestige; David Willshaw

There exist in certain nervous systems topological mappings of one set of spatially ordered nerve cells onto another, whose formation has not yet been satisfactorily explained. In this paper the role that competition may play in the development of such spatially patterned nervous connexions is discussed, and simple models incorporating competition and graded affinity between axons and postsynaptic sites have been tested by computer simulation. It is shown that the conditions for map making by this mechanism are that there is not only competition between axons for postsynaptic sites but also between postsynaptic sites for axons; and that this can be effected by imposing saturation conditions limiting the number of postsynaptic sites an axon may contact simultaneously and similarly the number of axon branches that can contact a postsynaptic cell. The possibility was investigated that competition models might lead to spreading or compression of the pattern of connexions in spatial mismatch experiments, such as have been done on the visual system of lower vertebrates. Two classes of these experiments exist: (a) those in which no size disparity is present between the set of presynaptic branches and the set of postsynaptic sites (for example, the Xenopus compound eye experiments); competition models predict an obligatory spreading of connexions without postulating regulation; (b) those in which a size disparity does exist (for example, a complete retina being made to regenerate into a half tectum, or vice versa); competition models predict spreading-compression only if the number of possible presynaptic branches available for contact formation is once more made equal to the number of available sites. The interpretation of other experimental designs is discussed. The importance of the establishment of transient or temporary contacts during development is emphasized.


Adaptive Behavior | 1999

Evolving Swimming Controllers for a Simulated Lamprey with Inspiration from Neurobiology

Auke Jan Ijspeert; John Hallam; David Willshaw

This paper presents how neural swimming controllers for a simulated lamprey can be developed using evolutionary algorithms. A genetic algorithm is used for evolving the architecture of a connectionist model which determines the muscular activity of a simulated body in interaction with water. This work is inspired by the biological model developed by Ekeberg which reproduces the central pattern generator observed in the real lamprey (Ekeberg, 1993). In evolving artificial controllers, we demonstrate that a genetic algorithm can be an interesting design technique for neural controllers and that there exist alternative solutions to the biological connectivity. A variety of neural controllers are evolved which can produce the pattern of oscillations necessary for swimming. These patterns can be modulated through the external excitation applied to the network in order to vary the speed and the direction of swimming. The best evolved controllers cover larger ranges of frequencies, phase lags and speeds of swimming than Ekeberg's model. We also show that the same techniques for evolving artificial solutions can be interesting tools for developing neurobiological models. In particular, biologically plausible controllers can be developed with ranges of oscillation frequency much closer to those observed in the real lamprey than Ekeberg's hand-crafted model.
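The evolutionary loop behind such work can be sketched generically. In the minimal genetic algorithm below, a toy bit-string fitness stands in for the paper's simulated swimming performance, and all parameters (population size, mutation rate, selection scheme) are arbitrary illustrative choices, not the paper's settings.

```python
import random

random.seed(1)

# Minimal genetic-algorithm loop. A toy bit-string fitness stands in for
# simulated swimming performance; all parameters are illustrative.
GENES, POP, GENERATIONS, MUT = 32, 40, 60, 0.02

def fitness(genome):            # toy stand-in for "swimming quality"
    return sum(genome)

pop = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]                    # truncation selection
    children = []
    while len(children) < POP - len(parents):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, GENES)        # one-point crossover
        child = a[:cut] + b[cut:]
        child = [1 - g if random.random() < MUT else g for g in child]
        children.append(child)
    pop = parents + children                    # elitist replacement

best = max(pop, key=fitness)
```

In the paper the genome instead encodes the connectivity and weights of the lamprey central pattern generator, and fitness is evaluated by simulating the body in water, but the selection-crossover-mutation cycle has the same shape.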


Network: Computation In Neural Systems | 1990

Application of the elastic net algorithm to the formation of ocular dominance stripes

Geoffrey J. Goodhill; David Willshaw

The elastic net algorithm, an iterative technique for the solution of combinatorial optimisation problems that have a geometric interpretation, was applied to the problem of explaining the development of ocular dominance stripes in the vertebrate visual system. Simulations show that this algorithm produces stripes under certain conditions. Analysis is presented that predicts the moment at which stripes form and an expression is derived for how stripe width depends on the parameters of the system. In contrast to most other models for stripe formation, the elastic net algorithm provides a common explanatory framework for the development of stripes and of retinotopically ordered projections.
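The elastic net iteration itself is compact enough to sketch. The toy below applies it to a tiny travelling-salesman-style instance rather than to ocular dominance; the geometry and the alpha/beta/K parameter values are illustrative assumptions following the usual presentation of the algorithm, not the paper's model of the visual system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Elastic net sketch on a tiny geometric instance: a ring of "beads" is
# pulled toward target cities while an elastic term keeps neighbouring
# beads together. All parameter values are illustrative.
cities = rng.random((8, 2))
M = 24                                        # beads on the ring
theta = np.linspace(0, 2 * np.pi, M, endpoint=False)
y = 0.5 + 0.2 * np.column_stack((np.cos(theta), np.sin(theta)))

alpha, beta, K = 0.2, 2.0, 0.2                # step sizes and length scale
for _ in range(200):
    diff = cities[:, None, :] - y[None, :, :]           # (city, bead, xy)
    w = np.exp(-(diff ** 2).sum(-1) / (2 * K * K))
    w /= w.sum(1, keepdims=True) + 1e-12                # each city's pull
    city_force = (w[:, :, None] * diff).sum(0)
    tension = np.roll(y, 1, 0) + np.roll(y, -1, 0) - 2 * y
    y += alpha * city_force + beta * K * tension
    K *= 0.99                                           # anneal the scale

# After annealing, every city should sit close to some bead on the ring.
max_dist = np.sqrt(((cities[:, None, :] - y[None, :, :]) ** 2).sum(-1)).min(1).max()
```

Annealing the length scale K is what drives the transition from a smooth global solution to a locally committed one; in the stripe-formation application, the analogous moment is when the stripes appear.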


Biological Cybernetics | 1991

Optimising synaptic learning rules in linear associative memories

Peter Dayan; David Willshaw

Associative matrix memories with real-valued synapses have been studied in many incarnations. We consider how the signal/noise ratio for associations depends on the form of the learning rule, and we show that a covariance rule is optimal. Two other rules, which have been suggested in the neurobiology literature, are asymptotically optimal in the limit of sparse coding. The results appear to contradict a line of reasoning particularly prevalent in the physics community. It turns out that the apparent conflict is due to the adoption of different underlying models. Ironically, they perform identically at their coincident optima. We give details of the mathematical results, and discuss some other possible derivations and definitions of the signal/noise ratio.


Journal of Integrative Neuroscience | 2002

NEUROINFORMATICS: THE INTEGRATION OF SHARED DATABASES AND TOOLS TOWARDS INTEGRATIVE NEUROSCIENCE

Shun-Ichi Amari; Francesco Beltrame; Jan G. Bjaalie; Turgay Dalkara; Erik De Schutter; Gary F. Egan; Nigel Goddard; Carmen Gonzalez; Sten Grillner; Andreas V. M. Herz; Peter Hoffmann; Iiro Jaaskelainen; Stephen H. Koslow; Soo-Young Lee; Perry L. Miller; Fernando Mira da Silva; Mirko Novak; Viji Ravindranath; Raphael Ritz; Ulla Ruotsalainen; Shankar Subramaniam; Yiyuan Tang; Arthur W. Toga; Shiro Usui; Jaap van Pelt; Paul F. M. J. Verschure; David Willshaw; Andrzej Wróbel

There is significant interest amongst neuroscientists in sharing neuroscience data and analytical tools. The exchange of neuroscience data and tools between groups affords the opportunity to re-analyze previously collected data in new ways, encourages new neuroscience interpretations, fosters otherwise uninitiated collaborations, and provides a framework for the further development of theoretically based models of brain function. Data sharing will ultimately reduce experimental and analytical error. Many small Internet-accessible database initiatives have been developed, and specialized analytical software and modeling tools are distributed within different fields of neuroscience. In addition, however, large-scale international collaborations are required, which involve new mechanisms of coordination and funding. Provided sufficient government support is given to such international initiatives, sharing of neuroscience data and tools can play a pivotal role in human brain research and lead to innovations in neuroscience, informatics and the treatment of brain disorders. These innovations will enable the application of theoretical modeling techniques to enhance our understanding of the integrative aspects of neuroscience. This article, authored by a multinational working group on neuroinformatics established by the Organisation for Economic Co-operation and Development (OECD), articulates some of the challenges and lessons learned to date in efforts to achieve international collaborative neuroscience.


Archive | 2001

Emergent Neural Computational Architectures based on Neuroscience

Stefan Wermter; Jim Austin; David Willshaw

This book is the result of a series of International Workshops organised by the EmerNet project on Emergent Neural Computational Architectures based on Neuroscience, sponsored by the Engineering and Physical Sciences Research Council (EPSRC). The overall aim of the book is to present a broad spectrum of current research into biologically inspired computational systems and hence encourage the emergence of new computational approaches based on neuroscience. It is generally understood that the present approaches for computing do not have the performance, flexibility and reliability of biological information processing systems. Although there is a massive body of knowledge regarding how processing occurs in the brain and central nervous system, this has had little impact on mainstream computing so far.

The process of developing biologically inspired computerised systems involves the examination of the functionality and architecture of the brain, with an emphasis on its information processing activities. Biologically inspired computerised systems address neural computation from the position of both neuroscience and computing, using experimental evidence to create general neuroscience-inspired systems.

The book focuses on the main research areas of modular organisation and robustness, timing and synchronisation, and learning and memory storage. The issues considered as part of these include: How can the modularity in the brain be used to produce large-scale computational architectures? How does the human memory manage to continue to operate despite failure of its components? How does the brain synchronise its processing? How does the brain compute with relatively slow computing elements but still achieve rapid and real-time performance? How can we build computational models of these processes and architectures? How can we design incremental learning algorithms and dynamic memory architectures? How can natural information processing systems be exploited for artificial computational methods?

We hope that this book stimulates and encourages new research in this area. We would like to thank all contributors to this book and the few hundred participants of the various workshops. Especially, we would like to express our thanks to Mark Elshaw, network assistant in the EmerNet network, who put in tremendous effort during the process of publishing this book. Finally, we would like to thank EPSRC and James Fleming for their support, and Alfred Hofmann and his staff at Springer for their continuing assistance.

March 2001
Stefan Wermter
Jim Austin
David Willshaw


Network: Computation In Neural Systems | 1993

On setting unit thresholds in an incompletely connected associative net

Jay T. Buckingham; David Willshaw

The associative net is a matrix model of associative memory which has an efficiency of recall approaching that of a random access memory with no associative capability. This is a fully connected network, which makes it possible to use a simple strategy for setting the thresholds of the units in recall. However, most brain structures that are thought to underlie learning and memory have only partial interconnectivity. We describe five different strategies for setting the thresholds of units in partially connected nets. The simplest is a mechanism of the winners-take-all type. The most sophisticated strategy employs information about the density of modified synapses and their distribution on each output unit. The action of this mechanism is shown to be equivalent to minimization of output error but without requiring numerical solution of a set of equations, which would be biologically implausible. Simulation results demonstrate the superiority of this mechanism in a typical case. A parameter sensitivity analysis is also presented.
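The fully connected threshold strategy, and the simple winners-take-all variant for partial connectivity, can be sketched as follows. Sizes, sparseness and the 50% connectivity level are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Binary associative net sketch (sizes, sparseness and connectivity are
# illustrative assumptions, not the paper's parameters).
N, M, P, K = 256, 256, 20, 8   # in/out units, stored pairs, active units

def pattern(n):
    v = np.zeros(n, bool)
    v[rng.choice(n, K, replace=False)] = True
    return v

pairs = [(pattern(N), pattern(M)) for _ in range(P)]

# Clipped Hebbian storage: a synapse is switched on once and stays on.
W = np.zeros((M, N), bool)
for x, y in pairs:
    W |= np.outer(y, x)

# Fully connected recall with the simple threshold: a unit fires iff its
# dendritic sum equals the number of active input units.
x0, y0 = pairs[0]
recalled = W.astype(int) @ x0.astype(int) == x0.sum()

# Winners-take-all variant for a partially connected net (50% connectivity):
# fire the K units with the largest dendritic sums instead.
C = rng.random((M, N)) < 0.5   # random connectivity mask
Wp = np.zeros((M, N), bool)
for x, y in pairs:
    Wp |= np.outer(y, x) & C
sums_p = Wp.astype(int) @ x0.astype(int)
wta = np.zeros(M, bool)
wta[np.argsort(sums_p)[-K:]] = True
```

With full connectivity the simple threshold recovers the stored output exactly at this loading; under partial connectivity the winners-take-all recall is noisier, which is the problem the paper's more sophisticated threshold strategies address.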


The Journal of Neuroscience | 2009

Early-Stage Waves in the Retinal Network Emerge Close to a Critical State Transition between Local and Global Functional Connectivity

Matthias H. Hennig; Christopher Adams; David Willshaw; Evelyne Sernagor

A novel, biophysically realistic model for early-stage, acetylcholine-mediated retinal waves is presented. In this model, neural excitability is regulated through a slow after-hyperpolarization (sAHP) operating on two different temporal scales. As a result, the simulated network exhibits competition between a desynchronizing effect of spontaneous, cell-intrinsic bursts, and the synchronizing effect of synaptic transmission during retinal waves. Cell-intrinsic bursts decouple the retinal network through activation of the sAHP current, and we show that the network is capable of operating at a transition point between purely local and global functional connectedness, which corresponds to a percolation phase transition. Multielectrode array recordings show that, at this point, the properties of retinal waves are reliably predicted by the model. These results indicate that early spontaneous activity in the developing retina is regulated according to a very specific principle, which maximizes randomness and variability in the resulting activity patterns.
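The percolation phase transition invoked here can be illustrated independently of the retinal model. The toy 2-D site-percolation demo below has no biophysics in it; the grid size and occupation probabilities are arbitrary, chosen only to show the jump from local to global connectedness as the occupation probability crosses the threshold.

```python
import random

random.seed(0)

# Toy 2-D site-percolation demo (no biophysics; grid size and occupation
# probabilities are arbitrary). Each cell is occupied with probability p;
# we measure the largest 4-connected cluster as a fraction of all cells.
def largest_cluster_fraction(p, n=60):
    occ = [[random.random() < p for _ in range(n)] for _ in range(n)]
    seen = [[False] * n for _ in range(n)]
    best = 0
    for i in range(n):
        for j in range(n):
            if not occ[i][j] or seen[i][j]:
                continue
            stack, size = [(i, j)], 0
            seen[i][j] = True
            while stack:                      # iterative flood fill
                a, b = stack.pop()
                size += 1
                for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    r, c = a + da, b + db
                    if 0 <= r < n and 0 <= c < n and occ[r][c] and not seen[r][c]:
                        seen[r][c] = True
                        stack.append((r, c))
            best = max(best, size)
    return best / (n * n)

below = largest_cluster_fraction(0.45)   # below the ~0.593 site threshold
above = largest_cluster_fraction(0.75)   # above it: a giant cluster emerges
```

Below the threshold only small local clusters exist; above it a single cluster spans the grid. The paper's claim is that the developing retina sits near the analogous transition in its functional connectivity.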

Collaboration


Dive into David Willshaw's collaborations.

Top Co-Authors

John Hallam

University of Southern Denmark


Volker Steuber

University of Hertfordshire
