Michael J. Denham
Plymouth University
Publications
Featured research published by Michael J. Denham.
Hippocampus | 2000
Michael J. Denham; Roman Borisyuk
Recent experimental observations have disclosed the existence of a septal‐hippocampal feedback circuit, composed of medial septum diagonal band of Broca (ms‐dbB) GABAergic projections to the inhibitory interneurons of the hippocampus, and hippocampal GABAergic projections to the ms‐dbB, the major targets of which are the GABAergic septo‐hippocampal projection cells. We propose that this feedback circuit provides the mechanism for the rhythmic suppression of interneuronal activity in the hippocampus, which is observed as low‐level GABAergic‐mediated theta activity. We also propose that this circuit may be the mechanism by which ascending brain stem pathways to the ms‐dbB, in particular from the reticular formation, can influence hippocampal information processing in response to particular behavioral states, by exercising control over the level and frequency of theta activity in the hippocampus. In support of these proposals, we describe a minimal computational model of the feedback circuit which uses a set of four coupled differential equations describing the average dynamic activity of the populations of excitatory and inhibitory cells involved in the circuit. We demonstrate through simulations the inherently robust 4–6‐Hz oscillatory dynamics of the circuit, and show that manipulation of internal connection strengths and external modulatory influences on this circuit changes the dynamics in a way which closely mimics corresponding manipulations in recent neurophysiological experiments investigating theta activity. Hippocampus 2000;10:698–716.
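The four-population structure described can be sketched as a set of coupled rate equations. The connection signs follow the circuit in the abstract, but the parameter values and the sigmoidal form below are illustrative placeholders, not those of the published model:

```python
import numpy as np

def simulate_loop(T=4.0, dt=0.001):
    """Euler-integrate four coupled rate equations for the septo-hippocampal
    loop (illustrative parameters, not the published model's): hippocampal
    pyramidal cells (e), hippocampal interneurons (i), hippocampo-septal
    GABAergic cells (h), and ms-dbB septo-hippocampal GABAergic cells (s)."""
    def f(x):                                   # sigmoidal population response
        return 1.0 / (1.0 + np.exp(-x))
    tau = 0.02                                  # 20 ms population time constant
    e = i = s = h = 0.1
    trace = []
    for _ in range(int(round(T / dt))):
        de = (-e + f(1.5 - 8.0 * i)) / tau      # pyramidal cells, inhibited by interneurons
        di = (-i + f(4.0 * e - 6.0 * s)) / tau  # interneurons, suppressed by septal GABAergic input
        dh = (-h + f(4.0 * e - 1.0)) / tau      # hippocampo-septal cells, driven by pyramidal activity
        ds = (-s + f(3.0 - 5.0 * h)) / tau      # septal cells, inhibited by hippocampal GABAergic feedback
        e += dt * de; i += dt * di; h += dt * dh; s += dt * ds
        trace.append(e)
    return np.array(trace)
```

Whether this sketch oscillates in the theta band depends entirely on the parameter choices; the paper's analysis of connection strengths and external modulation is what establishes the robust 4–6 Hz regime.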
Information Processing and Management | 2006
Rohana K. Rajapakse; Michael J. Denham
This paper reports our experimental investigation into the use of more realistic concepts, as opposed to simple keywords, for document retrieval, and into reinforcement learning for improving document representations to help the retrieval of useful documents for relevant queries. The framework used for achieving this was based on the theory of Formal Concept Analysis (FCA) and lattice theory. Features or concepts of each document (and query), formulated according to FCA, are represented in a separate concept lattice and are weighted separately with respect to the individual documents they represent. The document retrieval process is viewed as a continuous conversation between queries and documents, during which documents are allowed to learn a set of significant concepts to help their retrieval. The learning strategy used was based on relevance feedback information, which strengthens the similarity of relevant documents and weakens that of non-relevant documents. Test results obtained on the Cranfield collection show a significant increase in average precision as the system learns from experience.
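The relevance-feedback step can be illustrated with a minimal weight-update sketch. The function and data structures below are hypothetical simplifications; the actual system weights FCA concepts within per-document concept lattices:

```python
def update_concept_weights(weights, relevant, nonrelevant, lr=0.1):
    """Hypothetical relevance-feedback update in the spirit described:
    concepts of documents judged relevant for a query are reinforced,
    concepts of non-relevant documents are weakened.
    `weights` maps (doc, concept) pairs to floats;
    `relevant` / `nonrelevant` map doc -> set of concepts."""
    new = dict(weights)                      # leave the input weights untouched
    for doc, concepts in relevant.items():
        for c in concepts:
            new[(doc, c)] = new.get((doc, c), 0.0) + lr
    for doc, concepts in nonrelevant.items():
        for c in concepts:
            # weaken, but never drive a weight negative
            new[(doc, c)] = max(0.0, new.get((doc, c), 0.0) - lr)
    return new
```

Iterating such an update over successive query sessions is what lets each document accumulate the "significant concepts" that aid its future retrieval.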
Network: Computation In Neural Systems | 2001
Roman Borisyuk; Michael J. Denham; Frank C. Hoppensteadt; Yakov B. Kazanovich; Olga I. Vinogradova
A model of novelty detection is developed which is based on an oscillatory mechanism of memory formation and information processing. The frequency encoding of the input information and adaptation of natural frequencies of network oscillators to the frequency of the input signal are used as the mechanism of information storage. The resonance amplification of network activity is used as a recognition principle for familiar stimuli. Application of the model to novelty detection in the hippocampus is discussed.
BioSystems | 2000
Roman Borisyuk; Michael J. Denham; Frank C. Hoppensteadt; Yakov B. Kazanovich; Olga I. Vinogradova
A model of sparse distributed memory is developed that is based on phase relations between the incoming signals and an oscillatory mechanism for information processing. This includes phase-frequency encoding of input information, natural frequency adaptation among the network oscillators for storage of input signals, and a resonance amplification mechanism that responds to familiar stimuli. Simulations of this model show different types of dynamics in response to new and familiar stimuli. The application of the model to hippocampal working memory is discussed.
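The two ingredients named here, natural-frequency adaptation and resonance amplification, can be sketched under simplifying assumptions: a first-order adaptation rule and a driven damped oscillator as the resonance model. Neither is taken from the paper; they only illustrate the mechanism:

```python
import math

def adapt_frequencies(natural, stimulus_freq, steps=200, rate=0.05):
    """Illustrative adaptation rule (a sketch, not the paper's equations):
    each oscillator pulls its natural frequency toward the frequency of
    the input signal, so a later presentation of the same stimulus finds
    the network at resonance."""
    freqs = list(natural)
    for _ in range(steps):
        freqs = [f + rate * (stimulus_freq - f) for f in freqs]
    return freqs

def resonance_gain(natural_freq, input_freq, damping=0.1):
    """Steady-state amplitude of a driven damped oscillator: large when
    the input frequency matches the (adapted) natural frequency, which is
    the recognition signature for a familiar stimulus."""
    return 1.0 / math.sqrt((natural_freq**2 - input_freq**2)**2
                           + (damping * input_freq)**2)
```

A familiar stimulus (matching an adapted frequency) is thus amplified, while a novel one produces only a weak response.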
Biological Cybernetics | 2007
Nada Yousif; Michael J. Denham
The influence of cortical feedback on thalamic visual responses has been a source of much discussion in recent years. In this study we examine the possible role of cortical feedback in shaping the spatiotemporal receptive field (STRF) responses of thalamocortical (TC) cells in the lateral geniculate nucleus (LGN) of the thalamus. A population-based computational model of the thalamocortical network is used to generate a representation of the STRF response of LGN TC cells within the corticothalamic feedback circuit. The cortical feedback is shown to have little influence on the spatial response properties of the STRF organization. However, the model suggests that cortical feedback may play a key role in modifying the experimentally observed biphasic temporal response property of the STRF, that is, the reversal over time of the polarity of ON and OFF responses of the centre and surround of the receptive field, in particular accounting for the experimentally observed mismatch between retinal cells and TC cells in respect of the magnitude of the second (rebound) phase of the temporal response. The model results also show that this mismatch may result from an anti-phase corticothalamic feedback mechanism.
Neural Computation | 2005
Rohana K. Rajapakse; Michael J. Denham
Bidirectional associative memories (BAMs) have been shown by Radim Bělohlávek to be capable of precisely learning concept lattice structures. The focus of this letter is to show that the BAM, when set up with a concept lattice by assigning connection weights according to the rule proposed by Bělohlávek, always returns the most specific or most generic concept containing a given set of objects or attributes when that set is presented as input to the object or attribute layer. A proof of this property is given here, together with an example, and a brief application of the property is provided.
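The property the letter proves for the BAM can be illustrated directly with the closure operators of Formal Concept Analysis, bypassing the network itself (a sketch; Bělohlávek's weight construction is not reproduced here):

```python
def most_specific_concept(context, objects):
    """Formal Concept Analysis closure: given a set of objects, the intent
    is every attribute shared by all of them, and the extent of the most
    specific concept containing those objects is every object possessing
    all the attributes of that intent.
    `context` maps object -> set of attributes."""
    if objects:
        intent = set.intersection(*(context[o] for o in objects))
    else:
        # the empty object set yields the top intent: all attributes
        intent = set.union(*context.values())
    extent = {o for o, attrs in context.items() if intent <= attrs}
    return extent, intent
```

Presenting `{"doc1"}` below returns a strictly larger extent, because every object carrying all of doc1's attributes belongs to the most specific concept containing doc1.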
European Journal of Neuroscience | 2005
Nada Yousif; Michael J. Denham
The thalamocortical network is modelled using the Wilson–Cowan equations for neuronal population activity. We show that this population model with biologically derived parameters possesses intrinsic nonlinear oscillatory dynamics, and that the frequency of oscillation lies within the spindle range. Spindle oscillations are an early sleep oscillation characterized by high‐frequency bursts of action potentials followed by a period of quiescence, at a frequency of 7–14 Hz. Spindles are generally regarded as being generated by intrathalamic circuitry, as decorticated thalamic slices and the isolated thalamic reticular nucleus exhibit spindles. However, the role of cortical feedback has been shown to regulate and synchronize the oscillation. Previous modelling studies have mainly used conductance‐based models and hence the mechanism relied upon the inclusion of ionic currents, particularly the T‐type calcium current. Here we demonstrate that spindle‐frequency oscillatory activity can also arise from the nonlinear dynamics of the thalamocortical circuit, and we use bifurcation analysis to examine the robustness of this oscillation in terms of the functional range of the parameters used in the model. The results suggest that the thalamocortical circuit has intrinsic nonlinear population dynamics which are capable of providing robust support for oscillatory activity within the frequency range of spindle oscillations.
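The Wilson–Cowan equations themselves can be sketched as follows, using a common textbook parameter set rather than the biologically derived parameters of the paper; the amplitude probe is a crude brute-force counterpart to the bifurcation analysis described:

```python
import numpy as np

def wilson_cowan(P=1.25, T=30.0, dt=0.01):
    """Wilson-Cowan rate equations with a standard textbook parameter set
    (used purely for illustration; the published model's biologically
    derived parameters differ). Returns the excitatory trace from a
    forward-Euler integration."""
    def S(x, a, th):
        # shifted sigmoid so that S(0) = 0, as in the original formulation
        return 1.0 / (1.0 + np.exp(-a * (x - th))) - 1.0 / (1.0 + np.exp(a * th))
    E, I = 0.1, 0.05
    trace = []
    for _ in range(int(round(T / dt))):
        dE = -E + (1 - E) * S(16 * E - 12 * I + P, 1.3, 4.0)
        dI = -I + (1 - I) * S(15 * E - 3 * I, 2.0, 3.7)
        E += dt * dE
        I += dt * dI
        trace.append(E)
    return np.array(trace)

def oscillation_amplitude(P):
    """Peak-to-peak amplitude of the late part of the trace: a crude probe
    for the onset of the limit cycle as the external drive P is varied."""
    tr = wilson_cowan(P=P)
    tail = tr[len(tr) // 2:]
    return float(tail.max() - tail.min())
```

Sweeping `P` (or a coupling strength) and plotting `oscillation_amplitude` traces out where sustained oscillation appears, which is the question the paper answers rigorously with bifurcation analysis.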
Reviews in The Neurosciences | 1999
Roman Borisyuk; Michael J. Denham; Susan L. Denham; Frank C. Hoppensteadt
We discuss the role of the hippocampus in information processing in the brain and hypothesise that the hippocampus monitors the stability of sensory cues it receives from the external world, using the current context to predict the next sensory event in the episodic sequence by learning from experience, and memorising these sequences of sensory events. Two computational models are presented here. The predictive theory and model are closely related to experimental evidence and use dynamic synapses with an asymmetric learning rule to develop predictive neural activity of a leaky integrate-and-fire model of a pyramidal CA3 cell. The oscillatory model of the hippocampus for memorising sequences of sensory events is developed as a chain of interacting neural oscillators forced by oscillatory inputs from the entorhinal cortex and from the medial septum.
Biological Cybernetics | 2009
Kameliya Dimova; Michael J. Denham
In this study, we describe a model of motion integration in smooth eye pursuit based on a recursive Bayesian estimation process, which displays a dynamic behaviour qualitatively similar to the dynamics of the motion integration process observed experimentally, both psychophysically in humans and monkeys, and physiologically in monkeys. By formulating the model as an approximate version of a Kalman filter algorithm, we have been able to show that it can be put into a neurally plausible, distributed recurrent form which coarsely corresponds to the recurrent circuitry of visual cortical areas V1 and MT. The model thus provides further support for the notion that the motion integration process is based on a form of Bayesian estimation, as has been suggested by many psychophysical studies, and moreover suggests that the observed dynamic properties of this process are the result of the recursive nature of the motion estimation.
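As a minimal stand-in for the recursive Bayesian estimator described, consider a scalar Kalman filter tracking a single motion component (the published model is a neurally plausible, distributed approximation of this kind of recursion, not this textbook form):

```python
def kalman_velocity(measurements, q=0.01, r=0.25):
    """Scalar Kalman filter for one motion component.
    q: process noise variance, r: measurement noise variance.
    Each step predicts, then corrects with the new measurement,
    weighting it by the Kalman gain."""
    v, p = 0.0, 1.0              # state estimate and its variance
    estimates = []
    for z in measurements:
        p = p + q                # predict: constant-velocity model, uncertainty grows
        k = p / (p + r)          # Kalman gain: trust in the new measurement
        v = v + k * (z - v)      # correct toward the measurement
        p = (1 - k) * p          # posterior variance shrinks
        estimates.append(v)
    return estimates
```

The gradual convergence of the estimate toward a sustained input is the kind of dynamic behaviour the model relates to the time course of motion integration in pursuit.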
Lecture Notes in Computer Science | 2001
Michael J. Denham
In biological neural networks, synaptic connections and their modification by Hebbian forms of associative learning have been shown in recent years to have quite complex dynamic characteristics. As yet, these dynamic forms of connection and learning have had little impact on the design of computational neural networks. It is clear, however, that for the processing of various forms of information in which the temporal nature of the data is important, e.g. in temporal sequence learning and in contextual learning, such dynamic characteristics may play an important role. In this paper we review the neuroscientific evidence for the dynamic characteristics of learning and memory, and propose a novel computational associative learning rule which takes account of this evidence. We show that the application of this learning rule allows us to mimic, in a computationally simple way, certain characteristics of the biological learning process. In particular, we show that the learning rule displays temporal asymmetry effects similar to those which result in either long-term potentiation or depression at the biological synapse.
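A widely used temporally asymmetric (STDP-like) rule illustrates the kind of asymmetry discussed; the exponential form and the constants below are generic illustrations, not the learning rule proposed in the paper:

```python
import math

def weight_change(dt_ms, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Temporally asymmetric plasticity: presynaptic spiking shortly
    before postsynaptic (dt_ms > 0) potentiates the synapse, while the
    reverse order (dt_ms < 0) depresses it; both effects decay
    exponentially with the spike-timing interval."""
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau)    # LTP side
    return -a_minus * math.exp(dt_ms / tau)       # LTD side
```

The sign flip at zero interval is the temporal asymmetry that distinguishes such rules from classical symmetric Hebbian learning.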