Neil Davey
University of Hertfordshire
Publication
Featured research published by Neil Davey.
Journal of Intelligent and Robotic Systems | 2001
Ray J. Frank; Neil Davey; Stephen P. Hunt
Neural network approaches to time series prediction are briefly discussed, and the need to find the appropriate sample rate and an appropriately sized input window is identified. Relevant theoretical results from dynamic systems theory are briefly introduced, and heuristics for finding the appropriate sampling rate and embedding dimension, and thence window size, are discussed. The method is applied to several time series and the resulting generalisation performance of the trained feed-forward neural network predictors is analysed. It is shown that the heuristics can provide useful information in defining the appropriate network architecture.
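The windowing step described above can be sketched as follows. This is a minimal illustration, not code from the paper: the window size and the toy sine series are assumptions, whereas the paper would choose the window via the embedding-dimension heuristics it discusses.

```python
import numpy as np

def make_windows(series, window, step=1):
    """Slide a fixed-size input window over the series; the value
    immediately after each window is the prediction target."""
    X, y = [], []
    for start in range(0, len(series) - window, step):
        X.append(series[start:start + window])
        y.append(series[start + window])
    return np.array(X), np.array(y)

# Toy series standing in for real data; a feed-forward network would
# then be trained on the (X, y) pairs.
series = np.sin(np.linspace(0, 8 * np.pi, 200))
X, y = make_windows(series, window=10)
```

Each row of `X` is one input window and the corresponding entry of `y` is the next sample, which is the supervised pair a feed-forward predictor trains on.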
Annals of Neurology | 2015
Lieke Kros; Oscar H.J. Eelkman Rooda; Jochen K. Spanke; Parimala Alva; Marijn N. van Dongen; Athanasios Karapatis; Else A. Tolner; Christos Strydis; Neil Davey; Beerend H. J. Winkelman; Mario Negrello; Wouter A. Serdijn; Volker Steuber; Arn M. J. M. van den Maagdenberg; Chris I. De Zeeuw; Freek E. Hoebeek
Disrupting thalamocortical activity patterns has proven to be a promising approach to stop generalized spike‐and‐wave discharges (GSWDs) characteristic of absence seizures. Here, we investigated to what extent modulation of neuronal firing in cerebellar nuclei (CN), which are anatomically in an advantageous position to disrupt cortical oscillations through their innervation of a wide variety of thalamic nuclei, is effective in controlling absence seizures.
International Conference on Engineering Applications of Neural Networks | 2009
David Gray; David Bowes; Neil Davey; Yi Sun; Bruce Christianson
The automated detection of defective modules within software systems could lead to reduced development costs and more reliable software. In this work the static code metrics for a collection of modules contained within eleven NASA data sets are used with a Support Vector Machine classifier. A rigorous sequence of pre-processing steps were applied to the data prior to classification, including the balancing of both classes (defective or otherwise) and the removal of a large number of repeating instances. The Support Vector Machine in this experiment yields an average accuracy of 70% on previously unseen data.
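The pre-processing pipeline described in this abstract can be sketched as below. The data here are synthetic stand-ins (the metric values and the defect rule are assumptions, not the NASA MDP data), but the steps match the description: remove repeated instances, balance the two classes by undersampling, then train a Support Vector Machine.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic "static code metrics" and a synthetic defect label.
X = rng.integers(0, 5, size=(400, 6)).astype(float)
y = (X.sum(axis=1) + rng.normal(0, 1, 400) > 15).astype(int)

# 1. Remove repeating instances (identical metric vectors).
X, idx = np.unique(X, axis=0, return_index=True)
y = y[idx]

# 2. Balance the classes by undersampling the majority class.
minority = min(np.bincount(y))
keep = np.concatenate([rng.choice(np.where(y == c)[0], minority, replace=False)
                       for c in (0, 1)])
X, y = X[keep], y[keep]

# 3. Split, train an RBF SVM, and score on unseen data.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

With balanced classes, accuracy on the held-out split is directly comparable to the chance level of 50%, which is why the balancing step matters.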
IET Software | 2012
David Gray; David Bowes; Neil Davey; Yi Sun; Bruce Christianson
Background: The NASA metrics data program (MDP) data sets have been heavily used in software defect prediction research. Aim: To highlight the data quality issues present in these data sets, and the problems that can arise when they are used in a binary classification context. Method: A thorough exploration of all 13 original NASA data sets, followed by various experiments demonstrating the potential impact of duplicate data points when data mining. Conclusions: Firstly, researchers need to analyse the data that forms the basis of their findings in the context of how it will be used. Secondly, the bulk of defect prediction experiments based on the NASA MDP data sets may have led to erroneous findings. This is mainly because of repeated/duplicate data points potentially causing substantial amounts of training and testing data to be identical.
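The core problem the paper identifies, duplicate rows leaking across a random train/test split, is easy to demonstrate. A minimal sketch on deliberately duplicated synthetic data (the array below is an assumption standing in for an MDP data set):

```python
import numpy as np

rng = np.random.default_rng(1)
# 50 distinct rows, resampled to 200: heavy duplication by design.
base = rng.integers(0, 3, size=(50, 4))
data = base[rng.integers(0, 50, size=200)]

# How many data points are exact repeats?
n_unique = np.unique(data, axis=0).shape[0]
n_duplicates = data.shape[0] - n_unique

# After a random split, how many test rows also appear in training?
perm = rng.permutation(len(data))
train, test = data[perm[:150]], data[perm[150:]]
train_set = {tuple(row) for row in train}
overlap = sum(tuple(row) in train_set for row in test)
```

Any non-zero `overlap` means the classifier is partly being tested on data it has already seen, which inflates apparent performance exactly as the paper warns.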
Neurocomputing | 2004
Neil Davey; Stephen P. Hunt; Rod Adams
Various algorithms for constructing weight matrices for Hopfield-type associative memories are reviewed, including ones with much higher capacity than the basic model. These alternative algorithms either iteratively approximate the projection weight matrix or use simple perceptron learning. An experimental investigation of the performance of networks trained by these algorithms is presented, including measurements of capacity, training time and their ability to correct corrupted versions of the training patterns.
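The contrast between the basic model and perceptron learning can be sketched in pure NumPy. This is an illustrative toy, not the paper's experimental code: the network size, pattern count and learning rate are assumptions. It builds a one-shot Hebbian weight matrix, then trains each unit's incoming weights with the perceptron rule so that every stored pattern becomes a fixed point.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 32, 4
patterns = rng.choice([-1, 1], size=(P, N))   # bipolar training patterns

# One-shot Hebbian rule: sum of outer products, no self-connections.
W_hebb = patterns.T @ patterns / N
np.fill_diagonal(W_hebb, 0)

# Perceptron learning: treat each unit i as a perceptron and update its
# incoming weights whenever its local field disagrees with the stored state.
W = np.zeros((N, N))
for _ in range(100):                    # training epochs
    updated = False
    for p in patterns:
        h = W @ p                       # local fields for this pattern
        for i in range(N):
            if h[i] * p[i] <= 0:        # unit i is not aligned with pattern
                W[i] += 0.1 * p[i] * p
                W[i, i] = 0             # keep the diagonal at zero
                updated = True
    if not updated:                     # all patterns are fixed points
        break

stable = all(np.array_equal(np.sign(W @ p), p) for p in patterns)
```

After convergence every stored pattern is a stable state of the perceptron-trained network, the property the higher-capacity algorithms are built around.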
Neural Networks | 1999
Rod Adams; Kate Butchart; Neil Davey
A new, dynamic, tree-structured network, the Competitive Evolutionary Neural Tree (CENT), is introduced. The network is able to provide a hierarchical classification of unlabelled data sets. The main advantage that the CENT offers over other hierarchical competitive networks is its ability to determine for itself the number, and structure, of the competitive nodes in the network, without the need for externally set parameters. The network produces stable classificatory structures by halting its growth using locally calculated heuristics. The results of network simulations are presented over a range of data sets, including Anderson's Iris data set. The CENT network demonstrates its ability to produce a representative hierarchical structure to classify a broad range of data sets.
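The general idea, a competitive tree that grows node-by-node and halts growth with a locally computed heuristic, can be sketched as below. This is NOT the CENT algorithm itself, whose growth and halting rules are more elaborate; every parameter and rule here is an illustrative assumption.

```python
import numpy as np

def grow(data, depth=0, max_depth=4, min_size=10, spread=0.5, rng=None):
    """Top-down competitive tree: each node splits its data between two
    competitive prototypes; growth halts via a local compactness heuristic."""
    if rng is None:
        rng = np.random.default_rng(0)
    node = {"centre": data.mean(axis=0), "children": []}
    # Local halting heuristic: stop when the cluster is small or compact.
    if depth >= max_depth or len(data) < min_size or \
       np.mean(np.linalg.norm(data - node["centre"], axis=1)) < spread:
        return node
    # Two prototypes refined by simple winner-takes-all competition.
    protos = data[rng.choice(len(data), 2, replace=False)].astype(float)
    for x in data:
        w = np.argmin(np.linalg.norm(protos - x, axis=1))
        protos[w] += 0.1 * (x - protos[w])        # move the winner towards x
    labels = np.argmin(np.linalg.norm(data[:, None] - protos[None], axis=-1),
                       axis=1)
    for c in (0, 1):
        if (labels == c).any():
            node["children"].append(grow(data[labels == c], depth + 1,
                                         max_depth, min_size, spread, rng))
    return node

rng = np.random.default_rng(6)
data = np.vstack([rng.normal(-3, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
tree = grow(data, rng=rng)
```

On this two-cluster toy data the tree splits once at the root and then halts, since each child cluster is already compact under the local heuristic.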
Connection Science | 2004
Neil Davey; Rod Adams
High capacity associative neural networks can be built from networks of perceptrons, trained using simple perceptron training. Such networks perform much better than those trained using the standard Hopfield one-shot Hebbian learning. An experimental investigation into how such networks perform when the connection weights are not free to take any value is reported. The three restrictions investigated are: a symmetry constraint, a sign constraint and a dilution constraint. The selection of these constraints is motivated by both engineering and biological considerations.
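The three weight restrictions can be illustrated directly on a weight matrix. A minimal sketch, with random values standing in for trained weights (the matrix size, dilution level and sign template are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 8
W = rng.normal(size=(N, N))        # stand-in for a trained weight matrix
np.fill_diagonal(W, 0)

# Symmetry constraint: force w_ij == w_ji.
W_sym = 0.5 * (W + W.T)

# Sign constraint: all weights from a given presynaptic unit share one
# fixed sign (a Dale's-law-like restriction, one of the biological
# motivations mentioned above).
sign_template = np.sign(rng.normal(size=N))
W_sign = np.abs(W) * sign_template

# Dilution constraint: remove a random half of the connections.
mask = rng.random((N, N)) < 0.5
W_dil = W * mask
```

Each restriction removes degrees of freedom from the learning problem, which is why the experimental question of how much capacity survives each constraint is non-trivial.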
Journal of Pharmacy and Pharmacology | 2011
Gary P. Moss; Yi Sun; Simon Wilkinson; Neil Davey; Rod Adams; Gary P. Martin; M. Prapopopolou; Marc B. Brown
Objectives Predicting the rate of percutaneous absorption of a drug is an important issue with the increasing use of the skin as a means of moderating and controlling drug delivery. One key feature of this problem domain is that human skin permeability (as Kp) has been shown to be inherently non‐linear when mathematically related to the physicochemical parameters of penetrants. As such, the aims of this study were to apply and evaluate Gaussian process (GP) regression methods to datasets for membranes other than human skin, and to explore how the nature of the dataset may influence its analysis.
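Gaussian process regression of the kind applied in the study can be sketched with scikit-learn. The data below are synthetic stand-ins (two toy descriptors and a made-up non-linear response, not permeability measurements), chosen only to show the fit-and-predict pattern with posterior uncertainty:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
# Two toy descriptors standing in for physicochemical parameters.
X = rng.uniform(-1, 1, size=(60, 2))
# A non-linear response plus noise, standing in for measured log Kp.
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + rng.normal(0, 0.05, 60)

# RBF kernel for the non-linear trend, WhiteKernel for observation noise.
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), random_state=0)
gp.fit(X, y)
mean, std = gp.predict(X, return_std=True)   # posterior mean and uncertainty
```

The per-point `std` is a key attraction of GP methods for this domain: predictions come with a principled uncertainty estimate rather than a bare number.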
Connection Science | 2006
Neil Davey; Lee Calcraft; Rod Adams
Models of associative memory usually have full connectivity or, if diluted, random symmetric connectivity. In contrast, biological neural systems have predominantly local, non-symmetric connectivity. Here we investigate sparse networks of threshold units, trained with the perceptron learning rule. The units are given position and are arranged in a ring. The connectivity graph varies between being local to random via a small world regime, with short path lengths between any two neurons. The connectivity may be symmetric or non-symmetric. The results show that it is the small world networks with non-symmetric weights and non-symmetric connectivity that perform best as associative memories. It is also shown that in highly dilute networks small world architectures will produce efficiently wired associative memories, which still exhibit good pattern completion abilities.
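The connectivity regimes described above can be generated with a ring lattice plus random rewiring, in the style of a Watts-Strogatz construction. A minimal sketch with illustrative parameter values (not the paper's settings); note the rewiring produces a non-symmetric connection matrix, the regime the results favour:

```python
import numpy as np

rng = np.random.default_rng(4)
N, k, p_rewire = 100, 6, 0.1          # units on a ring, neighbours, rewire prob

# Local connectivity: each unit connects to its k nearest ring neighbours.
conn = np.zeros((N, N), dtype=bool)
for i in range(N):
    for offset in range(1, k // 2 + 1):
        conn[i, (i + offset) % N] = True
        conn[i, (i - offset) % N] = True

# Rewire each connection with probability p_rewire to a random new target,
# sweeping the graph from local towards random through a small-world regime.
for i in range(N):
    for j in np.where(conn[i])[0]:
        if rng.random() < p_rewire:
            conn[i, j] = False
            candidates = np.where(~conn[i] & (np.arange(N) != i))[0]
            conn[i, rng.choice(candidates)] = True

degree = conn.sum(axis=1)
```

Because each unit keeps exactly `k` outgoing connections, total wiring cost is held fixed while the path-length structure of the graph changes, which is what makes the efficiency comparison across regimes meaningful.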
Information Systems | 2004
Samarasena Buchala; Neil Davey; Ray J. Frank; Tim M. Gale
Data in most real world applications are high dimensional, and learning algorithms like neural networks have problems in handling high dimensional data. However, the intrinsic dimension is often much less than the original dimension of the data. We use a fractal based method to estimate the intrinsic dimension and show that a nonlinear projection method called curvilinear component analysis can effectively reduce the original dimension to the intrinsic dimension. We apply this approach to reduce the dimensionality of face image data and use neural network classifiers for gender classification.
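A fractal (correlation-dimension) estimate of intrinsic dimension can be sketched as below. The data are a toy helix, a 1-D curve embedded in 3-D, rather than face images, and the two radii are illustrative assumptions; the paper pairs such an estimate with curvilinear component analysis for the actual projection.

```python
import numpy as np

rng = np.random.default_rng(5)
# Points on a helix: 3-D coordinates, but intrinsic dimension 1.
t = rng.uniform(0, 4 * np.pi, 1000)
X = np.column_stack([np.cos(t), np.sin(t), 0.1 * t])

# Correlation integral C(r): fraction of point pairs closer than r.
d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
C = lambda r: np.mean(d < r)

# The correlation dimension is the local slope of log C(r) vs log r.
r1, r2 = 0.1, 0.4
intrinsic_dim = np.log(C(r2) / C(r1)) / np.log(r2 / r1)
```

The estimate comes out close to 1 despite the 3-D embedding, which is the point of the approach: the target dimension for the projection is read off the data, not guessed.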