Lee Calcraft
University of Hertfordshire
Publications
Featured research published by Lee Calcraft.
Connection Science | 2006
Neil Davey; Lee Calcraft; Rod Adams
Models of associative memory usually have full connectivity or, if diluted, random symmetric connectivity. In contrast, biological neural systems have predominantly local, non-symmetric connectivity. Here we investigate sparse networks of threshold units, trained with the perceptron learning rule. The units are given positions and are arranged in a ring. The connectivity graph varies from local to random via a small-world regime, with short path lengths between any two neurons. The connectivity may be symmetric or non-symmetric. The results show that it is the small-world networks with non-symmetric weights and non-symmetric connectivity that perform best as associative memories. It is also shown that in highly diluted networks, small-world architectures produce efficiently wired associative memories that still exhibit good pattern-completion abilities.
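The connectivity regime described here, varying from local through small-world to random as connections are rewired, can be sketched as a Watts-Strogatz-style construction on a ring. The function name, parameters, and rewiring scheme below are illustrative assumptions, not the paper's exact procedure; note that the resulting connectivity is non-symmetric, as in the study.

```python
import random

def ring_small_world(n, k, p, seed=0):
    """Build directed connectivity for n units placed on a ring.

    Each unit starts with connections to its k nearest neighbours
    (k/2 on each side); each connection is then rewired to a random
    unit with probability p.  p = 0 gives local connectivity,
    small p a small-world regime, and p = 1 random connectivity.
    Since only outgoing connections are rewired, i -> j does not
    imply j -> i (non-symmetric connectivity).
    """
    rng = random.Random(seed)
    connects = []
    for i in range(n):
        # the k nearest neighbours on the ring
        nbrs = set()
        for offset in range(1, k // 2 + 1):
            nbrs.add((i + offset) % n)
            nbrs.add((i - offset) % n)
        # rewire each connection with probability p, preserving degree k
        for j in sorted(nbrs):
            if rng.random() < p:
                nbrs.discard(j)
                new = rng.randrange(n)
                while new == i or new in nbrs:
                    new = rng.randrange(n)
                nbrs.add(new)
        connects.append(nbrs)
    return connects
```

Each unit keeps exactly k incoming connections at every rewiring probability, so wiring density is held constant while the architecture moves between the local and random extremes.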
Connection Science | 2007
Lee Calcraft; Rod Adams; Neil Davey
In physical implementations of associative memory, wiring costs play a significant role in shaping patterns of connectivity. In this study of sparsely connected associative memory, a range of architectures is explored in search of optimal connection strategies that maximize pattern-completion performance while minimizing wiring costs. It is found that architectures in which the probability of connection between any two nodes is based on relatively tight Gaussian and exponential distributions perform well, and that for optimum performance the width of the Gaussian distribution should be made proportional to the number of connections per node. It is also established, from a study of other connection strategies, that distal connections are not necessary for good pattern-completion performance. Convergence times and network scalability are also addressed in this study.
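The Gaussian connection strategy, with width proportional to the number of connections per node, can be sketched as below. The function name and the constant of proportionality (`sigma_per_k`) are illustrative assumptions; the paper determines the optimal width empirically.

```python
import math
import random

def gaussian_connectivity(n, k, sigma_per_k=0.5, seed=0):
    """Connect each of n ring units to k distinct others, choosing
    targets with probability falling off as a Gaussian of ring
    distance.  Following the abstract, the Gaussian width is made
    proportional to the connections per node: sigma = sigma_per_k * k.
    """
    rng = random.Random(seed)
    sigma = sigma_per_k * k
    connects = []
    for i in range(n):
        targets = [j for j in range(n) if j != i]
        # Gaussian weight on the shorter-way-round ring distance
        weights = [
            math.exp(-min(abs(i - j), n - abs(i - j)) ** 2 / (2 * sigma ** 2))
            for j in targets
        ]
        chosen = set()
        while len(chosen) < k:
            chosen.add(rng.choices(targets, weights=weights)[0])
        connects.append(chosen)
    return connects
```

A tight width concentrates connections near each node, keeping mean wiring length low while a non-zero tail still permits the occasional longer connection.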
Frontiers in Computational Neuroscience | 2011
Weiliang Chen; Reinoud Maex; Rod Adams; Volker Steuber; Lee Calcraft; Neil Davey
The problem we address in this paper is that of finding effective and parsimonious patterns of connectivity in sparse associative memories. This problem must also be solved by real neuronal systems, so results in artificial systems could throw light on real ones. We show that there are efficient patterns of connectivity and that these patterns are effective in models with either spiking or non-spiking neurons. This suggests that there may be some underlying general principles governing good connectivity in such networks. We also show that the clustering of the network, measured by the Clustering Coefficient, has a strong negative linear correlation with the performance of the associative memory. This result is important since a purely static measure of network connectivity appears to determine an important dynamic property of the network.
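The static measure used here, the Clustering Coefficient, is for each node the fraction of pairs of its neighbours that are themselves connected, averaged over the network. A minimal sketch for an undirected graph given as adjacency sets (the function name is an assumption):

```python
def clustering_coefficient(adj):
    """Mean local clustering coefficient of an undirected graph.

    adj[i] is the set of neighbours of node i.  For each node with
    degree d >= 2, count the links among its neighbours and divide
    by the d*(d-1)/2 possible links; average over all nodes.
    """
    total = 0.0
    n = len(adj)
    for i in range(n):
        nbrs = list(adj[i])
        d = len(nbrs)
        if d < 2:
            continue  # clustering is undefined; count it as 0
        links = sum(
            1
            for a in range(d)
            for b in range(a + 1, d)
            if nbrs[b] in adj[nbrs[a]]
        )
        total += 2.0 * links / (d * (d - 1))
    return total / n
```

A fully local ring network has high clustering, while random rewiring drives the coefficient towards zero, which is what makes it a useful static predictor of the regimes compared in these papers.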
Neurocomputing | 2009
Rod Adams; Lee Calcraft; Neil Davey
We investigate sparse networks of threshold units, trained with the perceptron learning rule to act as associative memories. The units have position and are placed in a ring so that the wiring cost is a meaningful measure. A genetic algorithm is used to evolve networks that have efficient wiring, but also good functionality. It is shown that this is possible, and that the connection strategy used by the networks appears to maintain some connectivity at all distances, but with the probability of a connection decreasing rapidly with distance.
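Because the units have positions on a ring, wiring cost can be measured as mean connection length. A sketch of that measure, as assumed here (the papers' exact cost function may differ):

```python
def mean_wiring_length(connects, n):
    """Mean ring distance over all connections for n units placed
    evenly on a ring, where connects[i] is the set of units that
    unit i connects to.  Distance is taken the shorter way round.
    """
    total = 0
    count = 0
    for i, targets in enumerate(connects):
        for j in targets:
            total += min(abs(i - j), n - abs(i - j))
            count += 1
    return total / count
```

This is the kind of quantity a genetic algorithm can minimize alongside a pattern-completion fitness term when evolving efficiently wired networks.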
international symposium on neural networks | 2008
Weiliang Chen; Rod Adams; Lee Calcraft; Volker Steuber; Neil Davey
This paper investigates the relationship between network connectivity and associative memory performance using high capacity associative memory models with different types of sparse networks. We found that the clustering of the network, measured by the Clustering Coefficient and Local Efficiency, has a strong linear correlation with its performance as an associative memory. This result is important since a purely static measure of network connectivity appears to determine an important dynamic property of the network.
Information Systems | 2006
Lee Calcraft; Rod Adams; Neil Davey
The performance of sparsely connected associative memories built from sets of perceptrons configured in a ring structure is investigated using different patterns of connectivity. Architectures based on uniform and linear distributions of restricted maximum connection length are compared to those based on Gaussian distributions and to networks created by progressively rewiring a locally connected network. It is found that while all four architectures are capable of good pattern-completion performance in sparse networks, the Gaussian, restricted-linear and restricted-uniform architectures require lower mean wiring lengths to achieve the same results. It is shown that in order to achieve good pattern completion at low wiring costs, connectivity should be localized, though not completely local, and that distal connections are not necessary.
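The training scheme shared by these papers treats each unit as an independent perceptron: unit i learns to reproduce its own bit of each stored pattern from the bits of the units it receives connections from. The sketch below assumes ±1 patterns and a standard perceptron update; the function name, learning rate, and stopping rule are illustrative, not the papers' exact settings.

```python
def train_perceptron_units(patterns, connects, epochs=100, lr=0.1):
    """Train each unit of a sparsely connected associative memory
    with the perceptron learning rule.

    patterns   -- list of stored patterns, each a list of +/-1 bits
    connects   -- connects[i] is the list of units feeding unit i
    Returns per-unit weight vectors and thresholds such that, on
    convergence, every stored pattern is a fixed point.
    """
    n = len(patterns[0])
    weights = [[0.0] * len(connects[i]) for i in range(n)]
    thresholds = [0.0] * n
    for _ in range(epochs):
        stable = True
        for p in patterns:
            for i in range(n):
                act = sum(w * p[j] for w, j in zip(weights[i], connects[i]))
                out = 1 if act - thresholds[i] >= 0 else -1
                if out != p[i]:  # perceptron update on error
                    stable = False
                    for c, j in enumerate(connects[i]):
                        weights[i][c] += lr * p[i] * p[j]
                    thresholds[i] -= lr * p[i]
        if stable:
            break  # all stored patterns are fixed points
    return weights, thresholds
```

Pattern completion is then tested by clamping a corrupted version of a stored pattern onto the units and iterating the threshold dynamics until the network settles.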
international conference on adaptive and natural computing algorithms | 2009
Weiliang Chen; Reinoud Maex; Rod Adams; Volker Steuber; Lee Calcraft; Neil Davey
The problem we address in this paper is that of finding effective and parsimonious patterns of connectivity in sparse associative memories. This problem must be addressed in real neuronal systems, so results in artificial systems could throw light on real systems. We show that there are efficient patterns of connectivity and that these patterns are effective in models with either spiking or non-spiking neurons. This suggests that there may be some underlying general principles governing good connectivity in such networks.
BioSystems | 2008
Lee Calcraft; Rod Adams; Neil Davey
This study examines the performance of sparsely connected associative memory models built using a number of different connection strategies, applied to one- and two-dimensional topologies. Efficient patterns of connectivity are identified which yield high performance at relatively low wiring costs in both topologies. Networks with displaced connectivity are seen to perform particularly well. It is found that two-dimensional models are more tolerant of variations in connection strategy than their one-dimensional counterparts; though networks built with both topologies become less so as their connection density is decreased.
international symposium on neural networks | 2005
Rod Adams; Lee Calcraft; Neil Davey
We investigate sparse networks of threshold units, trained with the perceptron learning rule to act as associative memories. The units have position and are placed in a ring so that the wiring cost is a meaningful measure. A genetic algorithm is used to evolve networks that have efficient wiring, but also good functionality. It is shown that this is possible, and that the connection strategy used by the networks appears to maintain connectivity at all distances, but with the probability of a connection decreasing linearly with distance.
Archive | 2005
Neil Davey; Lee Calcraft; Bruce Christianson; Rod Adams
In this paper we report experiments designed to find the relationship between the different parameters of sparsely connected networks of perceptrons with small world connectivity patterns, acting as associative memories.