Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Steve R. Gunn is active.

Publication


Featured research published by Steve R. Gunn.


Journal of Cerebral Blood Flow and Metabolism | 2001

Positron Emission Tomography Compartmental Models

Roger N. Gunn; Steve R. Gunn; Vincent J. Cunningham

The current article presents theory for compartmental models used in positron emission tomography (PET). Both plasma input models and reference tissue input models are considered. General theory is derived and the systems are characterized in terms of their impulse response functions. The theory shows that the macro parameters of the system may be determined simply from the coefficients of the impulse response functions. These results are discussed in the context of radioligand binding studies. It is shown that binding potential is simply related to the integral of the impulse response functions for all plasma and reference tissue input models currently used in PET. This article also introduces a general compartmental description for the behavior of the tracer in blood, which then allows for the blood volume-induced bias in reference tissue input models to be assessed.
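
To make the macro-parameter result concrete, the standard one-tissue plasma input model (conventional PET notation, not quoted from the paper) states the point in one line: the tissue curve is the plasma input convolved with the impulse response, and the total volume of distribution is simply the integral of that response. Reference tissue models follow the same pattern, with the binding potential recovered from the corresponding integral.

```latex
% One-tissue plasma input model: impulse response and its integral
% (conventional notation; K_1 and k_2 are the model's rate constants).
\[
  C_T(t) = h(t) \otimes C_p(t), \qquad
  h(t) = K_1 e^{-k_2 t}, \qquad
  V_T = \int_0^{\infty} h(t)\,dt = \frac{K_1}{k_2}.
\]
```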


Journal of Cerebral Blood Flow and Metabolism | 2002

Positron emission tomography compartmental models: a basis pursuit strategy for kinetic modeling.

Roger N. Gunn; Steve R. Gunn; Federico Turkheimer; John A. D. Aston; Vincent J. Cunningham

A kinetic modeling approach for the quantification of in vivo tracer studies with dynamic positron emission tomography (PET) is presented. The approach is based on a general compartmental description of the tracer's fate in vivo and determines a parsimonious model consistent with the measured data. The technique involves the determination of a sparse selection of kinetic basis functions from an overcomplete dictionary using the method of basis pursuit denoising. This enables the characterization of the system's impulse response function, from which values of the system's macro parameters can be estimated. These parameter estimates can be obtained from a region of interest analysis or as parametric images from a voxel-based analysis. In addition, model order estimates are returned that correspond to the number of compartments in the estimated compartmental model. Validation studies evaluate the method's performance against two preexisting data-led techniques, namely, graphical analysis and spectral analysis. Application of this technique to measured PET data is demonstrated using [11C]diprenorphine (opiate receptor) and [11C]WAY-100635 (5-HT1A receptor). Although the method is presented in the context of PET neuroreceptor binding studies, it has general applicability to the quantification of PET/SPECT radiotracer studies in neurology, oncology, and cardiology.
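
A rough sketch of the idea under stated assumptions: build an overcomplete dictionary of decaying exponentials convolved with the plasma input and select a sparse set of them by L1-regularised regression (scikit-learn's Lasso here, standing in for basis pursuit denoising). All curves, decay rates and the regularisation weight below are synthetic and illustrative, not taken from the paper.

```python
# Sketch: sparse kinetic modelling with an overcomplete exponential dictionary.
import numpy as np
from sklearn.linear_model import Lasso

t = np.linspace(0, 60, 120)             # frame mid-times (minutes)
dt = t[1] - t[0]
cp = t * np.exp(-t / 4.0)               # synthetic plasma input curve

# Dictionary columns: exp(-theta * t) convolved with the plasma input.
thetas = np.logspace(-3, 1, 100)        # candidate decay rates (1/min)
basis = np.stack([np.convolve(np.exp(-th * t), cp)[:len(t)] * dt
                  for th in thetas], axis=1)

# Synthetic tissue curve from a two-exponential impulse response, plus noise.
rng = np.random.default_rng(0)
clean = 0.6 * basis[:, 30] + 0.2 * basis[:, 70]
y = clean + rng.normal(0.0, 0.02 * clean.max(), size=len(t))

# Sparse selection; the number of non-zero coefficients plays the role of
# the model-order (number of compartments) estimate.
fit = Lasso(alpha=1e-3, positive=True, max_iter=50000).fit(basis, y)
active = np.flatnonzero(fit.coef_)
print("selected basis functions:", active)
# Integral of sum_i c_i * exp(-theta_i * t) over [0, inf) is sum_i c_i / theta_i,
# i.e. a macro parameter read off directly from the fitted coefficients.
print("integral of estimated impulse response:", np.sum(fit.coef_ / thetas))
```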


Ecological Modelling | 1999

Support vector machines for optimal classification and spectral unmixing

Martin Brown; Steve R. Gunn; Hugh G. Lewis

Mixture modelling is becoming an increasingly important tool in the remote sensing community as researchers attempt to resolve the sub-pixel mixture information that arises from the overlapping land cover types within the pixel's instantaneous field of view. This paper describes an approach based on a relatively new technique, support vector machines (SVMs), and contrasts this with more established algorithms such as linear spectral mixture models (LSMM) and artificial neural networks (ANN). In the simplest case, it is shown that the mixture regions formed by the linear support vector machine and the linear spectral mixture model are equivalent; however, the support vector machine automatically selects the relevant pure pixels. When non-linear algorithms are considered, it can be shown that the non-linear support vector machines have model spaces which contain many of the conventional neural networks, multi-layer perceptrons and radial basis functions. However, the non-linear support vector machines automatically determine the relevant set of basis functions (nodes) from the performance constraints specified via the loss function and, in doing so, select only the data points which are important for making a decision. In practice, it has been found that only about 5% of the training exemplars are used to form the decision boundary region, which represents a considerable compression of the data and also means that validation effort can be concentrated on just those important data points.
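
A hedged illustration of the compression claim at the end of the abstract: on synthetic two-band "spectra" for two cover types, a linear SVM typically retains only a small fraction of the training exemplars as support vectors, and those are the pixels that actually define the decision region. All values below, including the band reflectances, are invented for the sketch.

```python
# Sketch: fraction of training pixels retained as support vectors by a linear SVM.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n = 500
vegetation = rng.normal([0.2, 0.6], 0.05, size=(n, 2))   # band-1, band-2 reflectance
soil = rng.normal([0.5, 0.3], 0.05, size=(n, 2))
X = np.vstack([vegetation, soil])
y = np.array([0] * n + [1] * n)

svm = SVC(kernel="linear", C=1.0).fit(X, y)
frac = len(svm.support_) / len(X)
print(f"support vectors: {len(svm.support_)} of {len(X)} ({frac:.1%})")
```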


Machine Learning | 2002

A Probabilistic Framework for SVM Regression and Error Bar Estimation

Junbin Gao; Steve R. Gunn; Chris J. Harris; Martin Brown

In this paper, we elaborate on the well-known relationship between Gaussian Processes (GP) and Support Vector Machines (SVM) under some convex assumptions for the loss functions. This paper concentrates on the derivation of the evidence and error bar approximation for regression problems. An error bar formula is derived based on the ε-insensitive loss function.
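
For reference, the ε-insensitive loss in question, and the standard pseudo-likelihood reading under which the SVM regression solution becomes a MAP estimate for a Gaussian process prior, are given below; the specific evidence and error-bar expressions are derived in the paper itself.

```latex
% Epsilon-insensitive loss and its interpretation as a (pseudo-)likelihood.
\[
  L_\varepsilon\bigl(y, f(\mathbf{x})\bigr) = \max\bigl(0,\ |y - f(\mathbf{x})| - \varepsilon\bigr),
  \qquad
  p\bigl(y \mid f(\mathbf{x})\bigr) \propto \exp\bigl(-C\, L_\varepsilon(y, f(\mathbf{x}))\bigr).
\]
```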


Machine Learning | 2002

Structural Modelling with Sparse Kernels

Steve R. Gunn; J.S. Kandola

A widely acknowledged drawback of many statistical modelling techniques, commonly used in machine learning, is that the resulting model is extremely difficult to interpret. A number of new concepts and algorithms have been introduced by researchers to address this problem. They focus primarily on determining which inputs are relevant in predicting the output. This work describes a transparent, advanced non-linear modelling approach that enables the constructed predictive models to be visualised, allowing model validation and assisting in interpretation. The technique combines the representational advantage of a sparse ANOVA decomposition, with the good generalisation ability of a kernel machine. It achieves this by employing two forms of regularisation: a 1-norm based structural regulariser to enforce transparency, and a 2-norm based regulariser to control smoothness. The resulting model structure can be visualised showing the overall effects of different inputs, their interactions, and the strength of the interactions. The robustness of the technique is illustrated using a range of both artificial and “real world” datasets. The performance is compared to other modelling techniques, and it is shown to exhibit competitive generalisation performance together with improved interpretability.
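
The ANOVA decomposition referred to above is the standard functional one: the prediction is split into a constant, univariate terms, bivariate interaction terms and so on, and the 1-norm structural regulariser drives most of these components to zero so that only the low-order, visualisable terms survive. The notation below is conventional rather than the paper's.

```latex
% Functional ANOVA decomposition of a d-dimensional predictor.
\[
  f(\mathbf{x}) = f_0 + \sum_{i} f_i(x_i) + \sum_{i<j} f_{ij}(x_i, x_j) + \cdots + f_{1,2,\dots,d}(x_1, \dots, x_d).
\]
```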


Intelligent Data Analysis | 1997

Network Performance Assessment for Neurofuzzy Data Modelling

Steve R. Gunn; Martin Brown; K.M. Bossley

This paper evaluates the performance of ten significance measures applied to the problem of determining an appropriate network structure for data modelling with neurofuzzy systems. The advantages of neurofuzzy systems are demonstrated through application to both real and synthetic data interpretation problems.


Pattern Recognition | 1999

On the discrete representation of the Laplacian of Gaussian

Steve R. Gunn

The Laplacian of Gaussian (LoG) is commonly employed as a second-order edge detector in image processing, and it is popular because of its attractive scaling properties. However, its application within a finite sampled domain is non-trivial due to its infinite extent. Heuristics are often employed to determine the required mask size and they may lead to poor edge detection and location. We derive an explicit relationship between the size of the LoG mask and the probability of edge detection error introduced by its approximation, providing a strong basis for its stable implementation. In addition, we demonstrate the need for bias correction, to correct the offset error introduced by truncation, and derive strict bounds on the scales that may be employed by consideration of the aliasing error introduced by sampling. To characterise edges, a zero-crossing detector is proposed which uses a bilinear surface to guarantee detection and closure of edges. These issues are confirmed by experimental results, which particularly emphasise the importance of bias correction. As such, we give a new basis for implementation of the LoG edge detector and show the advantages that such analysis can confer.
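
A small numerical check of the truncation point (not the paper's derivation): the continuous LoG integrates to zero, but a finitely sampled mask does not, and that residual sum is exactly the offset removed by bias correction. The sketch below builds sampled masks of a few sizes and compares the sum before and after mean subtraction; σ and the mask half-widths are arbitrary.

```python
# Sketch: truncation offset of a sampled LoG mask and simple mean-subtraction correction.
import numpy as np

def log_mask(sigma, half_width):
    """Sampled Laplacian of Gaussian on a (2*half_width + 1)^2 grid."""
    ax = np.arange(-half_width, half_width + 1)
    x, y = np.meshgrid(ax, ax)
    r2 = x**2 + y**2
    g = np.exp(-r2 / (2 * sigma**2)) / (2 * np.pi * sigma**2)
    return (r2 - 2 * sigma**2) / sigma**4 * g      # LoG = ((x^2+y^2-2*sigma^2)/sigma^4) * G

sigma = 2.0
for hw in (3, 6, 12):                               # mask half-widths in pixels
    m = log_mask(sigma, hw)
    print(f"half-width {hw:2d}: sum before correction {m.sum():+.4e}, "
          f"after mean subtraction {(m - m.mean()).sum():+.1e}")
```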


IEEE Transactions on Pattern Analysis and Machine Intelligence | 2003

Handwritten Chinese radical recognition using nonlinear active shape models

Daming Shi; Steve R. Gunn; Robert I. Damper

Handwritten Chinese characters can be recognized by first extracting the basic shapes (radicals) of which they are composed. Radicals are described by nonlinear active shape models and optimal parameters found using the chamfer distance transform and a dynamic tunneling algorithm. The radical recognition rate is 96.5 percent correct (writer-independent) on 280,000 characters containing 98 radical classes.
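
A hedged sketch of the chamfer-distance scoring step only (the nonlinear active shape model and the dynamic tunnelling optimiser are outside the scope of a few lines): the distance transform of the edge map is computed once, and any candidate shape is then scored by the mean distance from its points to the nearest edge pixel. The image and shapes below are synthetic.

```python
# Sketch: scoring a candidate shape against an edge map via the distance transform.
import numpy as np
from scipy.ndimage import distance_transform_edt

edges = np.zeros((64, 64), dtype=bool)
edges[20, 10:50] = True                     # a synthetic horizontal edge segment

# Distance from every pixel to the nearest edge pixel.
dist = distance_transform_edt(~edges)

def chamfer_cost(points, dist_map):
    """Mean distance-to-edge over the shape's (row, col) points."""
    rows, cols = np.round(points).astype(int).T
    return dist_map[rows, cols].mean()

aligned = np.stack([np.full(40, 20), np.arange(10, 50)], axis=1)
shifted = aligned + np.array([5, 0])        # same shape, 5 pixels lower
print("aligned shape cost:", chamfer_cost(aligned, dist))   # ~0
print("shifted shape cost:", chamfer_cost(shifted, dist))   # ~5
```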


IEEE Transactions on Neural Networks | 2001

The relevance vector machine technique for channel equalization application

Sheng Chen; Steve R. Gunn; Chris J. Harris

The relevance vector machine (RVM) technique is applied to communication channel equalization. It is demonstrated that the RVM equalizer can closely match the optimal performance of the Bayesian equalizer, with a much sparser kernel representation than is achievable by the state-of-the-art support vector machine (SVM) technique.
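
For context, the equalisation setting in conventional notation (assumed here, not quoted from the paper): the received samples are a noisy FIR-filtered version of the transmitted symbols, and for binary symbols the equaliser maps a sliding window of m received samples to a decision on the symbol sent d steps earlier.

```latex
% Channel model and symbol-decision equaliser (conventional notation).
\[
  r(k) = \sum_{i=0}^{n_h - 1} h_i\, s(k - i) + e(k), \qquad
  \hat{s}(k - d) = \mathrm{sgn}\bigl(f\bigl(r(k), r(k-1), \dots, r(k - m + 1)\bigr)\bigr).
\]
```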


Pancreatology | 2006

Machine Learning Can Improve Prediction of Severity in Acute Pancreatitis using Admission Values of APACHE II Score and C-Reactive Protein

C.B. Pearce; Steve R. Gunn; A. Ahmed; C. D. Johnson

Background: Acute pancreatitis (AP) has a variable course. Accurate early prediction of severity is essential to direct clinical care. Current assessment tools are inaccurate, and unable to adapt to new parameters. None of the current systems uses C-reactive protein (CRP). Modern machine-learning tools can address these issues. Methods: 370 patients admitted with AP in a 5-year period were retrospectively assessed; after exclusions, 265 patients were studied. First recorded values for physical examination and blood tests, aetiology, severity and complications were recorded. A kernel logistic regression model was used to remove redundant features, and identify the relationships between relevant features and outcome. Bootstrapping was used to make the best use of data and obtain confidence estimates on the parameters of the model. Results: A model containing 8 variables (age, CRP, respiratory rate, pO2 on air, arterial pH, serum creatinine, white cell count and GCS) predicted a severe attack with an area under the receiver-operating characteristic curve (AUC) of 0.82 (SD 0.01). The optimum cut-off value for predicting severity gave sensitivity and specificity of 0.87 and 0.71 respectively. The predictions were significantly better (p = 0.0036) than admission APACHE II scores in the same patients (AUC 0.74) and better than historical admission APACHE II data (AUC 0.68–0.75). Conclusions: This system for the first time combines admission values of selected components of APACHE II and CRP for prediction of severe AP. The score is simple to use, and is more accurate than admission APACHE II alone. It is adaptable and would allow incorporation of new predictive factors.
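
An illustrative sketch only, on synthetic data and with assumed component choices (an RBF feature map approximating the kernel, scikit-learn's logistic regression, 200 bootstrap resamples): it mirrors the modelling-plus-bootstrap strategy described above but is not the study's model and reproduces none of its numbers.

```python
# Sketch: kernel logistic regression (RBF feature map + logistic regression)
# with a bootstrap estimate of the AUC's spread, on synthetic data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.kernel_approximation import Nystroem
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.pipeline import make_pipeline

X, y = make_classification(n_samples=265, n_features=8, n_informative=5,
                           weights=[0.75, 0.25], random_state=0)

model = make_pipeline(Nystroem(kernel="rbf", n_components=50, random_state=0),
                      LogisticRegression(max_iter=1000))

rng = np.random.default_rng(0)
aucs = []
for _ in range(200):                                   # bootstrap resamples
    idx = rng.integers(0, len(y), len(y))              # in-bag indices
    oob = np.setdiff1d(np.arange(len(y)), idx)         # out-of-bag cases for scoring
    model.fit(X[idx], y[idx])
    aucs.append(roc_auc_score(y[oob], model.predict_proba(X[oob])[:, 1]))

print(f"bootstrap AUC: {np.mean(aucs):.2f} (SD {np.std(aucs):.2f})")
```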

Collaboration


Dive into Steve R. Gunn's collaborations.

Top Co-Authors

Chris J. Harris (University of Southampton)
P.A.S. Reed (University of Southampton)
Mark S. Nixon (University of Southampton)
Daming Shi (Nanyang Technological University)
Martin Brown (University of Manchester)
Baofeng Guo (University of Southampton)
Chris Lovell (University of Southampton)