Des Watson
University of Sussex
Publications
Featured research published by Des Watson.
IEEE Transactions on Biomedical Engineering | 2008
Omar S. Al-Kadi; Des Watson
This paper presents the potential for fractal analysis of time sequence contrast-enhanced (CE) computed tomography (CT) images to differentiate between aggressive and nonaggressive malignant lung tumors (i.e., high and low metabolic tumors). The aim is to enhance CT tumor staging prediction accuracy through identifying malignant aggressiveness of lung tumors. As branching of blood vessels can be considered a fractal process, the research examines vascularized tumor regions that exhibit strong fractal characteristics. The analysis is performed after injecting 15 patients with a contrast agent, transforming at least 11 time sequence CE CT images from each patient to the fractal dimension, and determining the corresponding lacunarity. The fractal texture features were averaged over the tumor region, and quantitative classification showed up to 83.3% accuracy in distinguishing between advanced (aggressive) and early-stage (nonaggressive) malignant tumors. It also showed strong correlation with the corresponding lung tumor stage and the standardized tumor uptake value of fluorodeoxyglucose as determined by positron emission tomography. These results indicate that fractal analysis of time sequence CE CT images of malignant lung tumors could provide additional information about likely tumor aggression that could potentially impact on clinical management decisions in choosing the appropriate treatment procedure.
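The fractal dimension used as a texture feature above is commonly estimated by box counting. The sketch below is a generic illustration of that estimator on a synthetic binary image, not the authors' actual pipeline; the box sizes are arbitrary choices, and the lacunarity computation is omitted.

```python
import numpy as np

def box_count(img, box_size):
    """Count boxes of the given size that contain at least one
    foreground pixel."""
    h, w = img.shape
    count = 0
    for i in range(0, h, box_size):
        for j in range(0, w, box_size):
            if img[i:i + box_size, j:j + box_size].any():
                count += 1
    return count

def fractal_dimension(img, sizes=(2, 4, 8, 16)):
    """Box-counting estimate: the fractal dimension is the slope of
    log N(s) against log(1/s) over a range of box sizes s."""
    counts = [box_count(img, s) for s in sizes]
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# Synthetic check: a filled square region is (close to) 2-dimensional.
img = np.zeros((64, 64), dtype=bool)
img[16:48, 16:48] = True
fd = fractal_dimension(img)
```

In practice the estimate is sensitive to the chosen range of box sizes, which is why the paper averages texture features over the whole tumor region.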
NMR in Biomedicine | 1998
Anne Rosemary Tate; John R. Griffiths; Irene Martínez-Pérez; Angel Moreno; Ignasi Barba; Miquel E. Cabañas; Des Watson; Juli Alonso; F. Bartumeus; F. Isamat; I. Ferrer; F. Vila; E. Ferrer; Antoni Capdevila; Carles Arús
Recent studies have shown that MRS can substantially improve the non‐invasive categorization of human brain tumours. However, in order for MRS to be used routinely by clinicians, it will be necessary to develop reliable automated classification methods that can be fully validated. This paper is in two parts: the first part reviews the progress that has been made towards this goal, together with the problems that are involved in the design of automated methods to process and classify the spectra. The second part describes the development of a simple prototype system for classifying 1H single voxel spectra, obtained at an echo time (TE) of 135 ms, of the four most common types of brain tumour (meningioma (MM), astrocytic (AST), oligodendroglioma (OD) and metastasis (ME)) and cysts. This system was developed in two stages: firstly, an initial database of spectra was used to develop a prototype classifier, based on a linear discriminant analysis (LDA) of selected data points. Secondly, this classifier was tested on an independent test set of 15 newly acquired spectra, and the system was refined on the basis of these results. The system correctly classified all the non‐astrocytic tumours. However, the results for the astrocytic group were poorer (between 55 and 100%, depending on the binary comparison). Approximately 50% of high grade astrocytoma (glioblastoma) spectra in our database showed very little lipid signal, which may account for the poorer results for this class. Consequently, for the refined system, the astrocytomas were subdivided into two subgroups for comparison against other tumour classes: those with high lipid content and those without.
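A minimal two-class Fisher LDA of the kind the prototype classifier is built on can be sketched as follows. The data here are synthetic stand-ins for "selected data points" from two tumour classes, not the paper's MRS database; the class means, sample counts, and feature count are all invented for illustration.

```python
import numpy as np

def fit_lda(X0, X1):
    """Two-class Fisher LDA: project onto w = Sw^-1 (mu1 - mu0) and
    threshold at the midpoint of the projected class means."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    w = np.linalg.solve(Sw, mu1 - mu0)
    t = (mu0 + mu1) @ w / 2.0
    return w, t

def predict(w, t, X):
    """Label 1 if the projection falls on class 1's side of the threshold."""
    return (X @ w > t).astype(int)

# Synthetic stand-ins for selected spectral data points of two classes.
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, size=(40, 5))
X1 = rng.normal(2.0, 1.0, size=(40, 5))
w, t = fit_lda(X0, X1)
acc = ((predict(w, t, X0) == 0).mean() + (predict(w, t, X1) == 1).mean()) / 2
```

A real system decomposes the multi-class problem into binary comparisons, as the abstract's "depending on the binary comparison" phrasing indicates.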
languages compilers and tools for embedded systems | 2002
Matt Newsome; Des Watson
Interest in Java implementations for resource-constrained environments such as embedded systems has been tempered by concerns regarding its efficiency. Current native compilers for Java offer dramatic increases in efficiency, but have poor support for dynamically-loaded classes, which are typically handled by slow interpreters or by JIT compilers whose code size far exceeds the resource constraints of the system. After a brief survey of Ahead-of-Time compilers for Java, we present MoJo, a new native compiler and the testbed for our proxy compilation scheme, which allows embedded clients to connect to servers and delegate compilation of Java class packages to native code libraries. We also present initial results from experimental testing using MoJo in a resource-constrained, mobile computing environment. We show that MoJo is faster than all surveyed Java implementations for the test platform executing our initial test application. Our proxy compilation scheme results in a 94% speed increase over the fastest tested interpreter system and a 20% speed increase over the fastest tested JIT system. The MoJo-generated binaries for the application are also shown to be 45 times smaller than those required by its nearest iPAQ JRE competitor and 275 times smaller than the Sun JRE v1.3.1 for iPAQ, as a direct result of our incremental, on-demand transfer of API classes to the client.
European Journal of Internal Medicine | 2002
P.S Wong; G.K Davidsson; J Timeyin; A Warren; Des Watson; R Vincent; Chris Davidson
BACKGROUND: Two separate cohorts of consecutive patients admitted to hospital with a primary diagnosis of heart failure were studied, the first in 1986 in Rochdale, and the second in 1995 in Brighton. METHODS: We observed the clinical profile, treatment and mortality during hospital admission and reviewed their status at 6 months. There were 132 patients in the Rochdale cohort and 223 in the Brighton cohort. RESULTS: The Rochdale cohort was characterised by a lower mean age and longer hospital stay. Significant differences were also observed in co-morbidity and the use of ACE inhibitors, but hospital mortality was almost identical (25% in Rochdale and 24% in Brighton). A low systolic blood pressure, hyponatraemia, hyperkalaemia and a raised blood urea at presentation were independent adverse prognostic factors. In contrast, prior treatment with ACE inhibitors in patients with congestive cardiac failure led to a more favourable hospital outcome. Age, gender and co-morbidity did not affect mortality apart from patients with acute myocardial infarction. Follow-up of these cohorts showed that mortality of the two groups remained high at 180 days after admission (40% in Rochdale and 39% in Brighton). There were marked differences in the use of ACE inhibitors in survivors, but target doses of ACE inhibitors (enalapril 20 mg/day or equivalent) were only achieved in 31%, despite direct communication between the hospital and primary care physicians. CONCLUSIONS: Although clinical and treatment profiles differed between the two periods studied, the hospital and 6-month mortality of patients with heart failure remained high. More emphasis needs to be given to optimising ACE inhibitor use in primary care.
ACM Computing Surveys | 2013
James Stanier; Des Watson
Compilers commonly translate an input program into an intermediate representation (IR) before optimizing it and generating code. Over time there have been a number of different approaches to designing and implementing IRs. Different IRs have varying benefits and drawbacks. In this survey, we highlight key developments in the area of IR for imperative compilers, group them by a taxonomy and timeline, and comment on the divide between academic research and real-world compiler technology. We conclude that mainstream compilers, especially in the multicore era, could benefit from further IR innovations.
Software - Practice and Experience | 2012
James Stanier; Des Watson
Compilers use a variety of techniques to optimize and transform loops. However, many of these optimizations do not work when the loop is irreducible. Node splitting techniques transform irreducible loops into reducible loops, but many real‐world compilers choose to leave them unoptimized. This article describes an empirical study of irreducibility in current versions of open‐source software, and then compares them with older versions. We also study machine‐generated C code from a number of software tools. We find that irreducibility is extremely rare, and is becoming less common with time. We conclude that leaving irreducible loops unoptimized is a perfectly feasible future‐proof option due to the rarity of its occurrence in non‐trivial software.
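Irreducibility can be checked with the classic T1/T2 interval-analysis test: repeatedly delete self-loops (T1) and merge any non-entry node with a unique predecessor into that predecessor (T2); the control-flow graph collapses to a single node if and only if it is reducible. The sketch below is a generic textbook version of that check, not the tooling used in the study, and the example graphs are invented.

```python
def is_reducible(edges, entry):
    """T1/T2 reducibility test on a CFG given as {node: set_of_successors}.
    The graph collapses to one node iff it is reducible."""
    succ = {u: set(vs) for u, vs in edges.items()}
    for vs in edges.values():
        for v in vs:
            succ.setdefault(v, set())
    succ.setdefault(entry, set())
    changed = True
    while changed:
        changed = False
        for n in succ:                      # T1: remove self-loops
            if n in succ[n]:
                succ[n].discard(n)
                changed = True
        preds = {n: set() for n in succ}
        for u, vs in succ.items():
            for v in vs:
                preds[v].add(u)
        for n in list(succ):                # T2: merge unique-predecessor node
            if n != entry and len(preds[n]) == 1:
                (p,) = preds[n]
                succ[p] |= succ.pop(n)
                succ[p].discard(n)
                changed = True
                break
    return len(succ) == 1

# A natural loop (c -> b back edge) is reducible; a loop that can be
# entered at two different nodes (b and c, both reached from a) is not.
reducible = is_reducible({'a': {'b'}, 'b': {'c', 'd'}, 'c': {'b'}}, 'a')
irreducible = is_reducible({'a': {'b', 'c'}, 'b': {'c'}, 'c': {'b'}}, 'a')
```

The irreducible example is the canonical two-entry loop that node splitting would duplicate to restore reducibility.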
bioinformatics and bioengineering | 2008
Omar S. Al-Kadi; Des Watson
Five different texture methods are investigated for their susceptibility to subtle noise in lung tumor computed tomography (CT) images caused by acquisition and reconstruction deficiencies. Noise of Gaussian and Rayleigh distributions with varying mean and variance was encountered in the analyzed CT images. Fisher and Bhattacharyya distance measures were used to differentiate an original extracted lung tumor region of interest (ROI) from filtered and noisy reconstructed versions. Through examining the texture characteristics of the lung tumor areas by five different texture measures, it was determined that the autocovariance measure was least affected and the gray level co-occurrence matrix was the most affected by noise. Depending on the selected ROI size, it was concluded that the number of features extracted from each texture measure increases susceptibility to noise.
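The two separability measures named above have simple closed forms when each texture feature is summarised by a mean and variance. The sketch below assumes univariate Gaussian feature statistics and uses one common form of the Fisher criterion; the numbers are invented, not the paper's measurements.

```python
import math

def fisher_distance(mu1, var1, mu2, var2):
    """Fisher criterion for two feature distributions:
    squared mean separation over the total variance."""
    return (mu1 - mu2) ** 2 / (var1 + var2)

def bhattacharyya_distance(mu1, var1, mu2, var2):
    """Bhattacharyya distance between two univariate Gaussians."""
    return (0.25 * math.log(0.25 * (var1 / var2 + var2 / var1 + 2.0))
            + 0.25 * (mu1 - mu2) ** 2 / (var1 + var2))

# Identical statistics give distance 0; noise that shifts a texture
# feature's mean or variance pushes both measures up.
same = bhattacharyya_distance(1.0, 0.5, 1.0, 0.5)
shifted = bhattacharyya_distance(1.0, 0.5, 1.6, 0.8)
fisher = fisher_distance(0.0, 1.0, 2.0, 1.0)
```

A larger distance between the original ROI's feature statistics and a noisy version's indicates that the texture measure is more sensitive to that noise.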
compiler construction | 2004
Tim Owen; Des Watson
Language implementations that use a uniform pointer representation for generic datatypes typically apply boxing operations to convert non-uniform objects into the required form. The cost of this boxing process impacts upon the performance of programs that make heavy use of genericity with non-uniform data such as integers and other primitive value types. We show that the overhead of boxing objects into heap storage can be significantly reduced by taking a lazy approach: allowing pointers to stack-allocated objects that are only copied to the heap when necessary. Delaying the boxing of objects avoids unnecessary heap allocation, and results in speedups of around 25% for a range of test programs.
Archive | 1995
Rosemary Tate; Des Watson; Stephen J. Eglen
Traditional methods for quantifying magnetic resonance spectra, which rely on identifying and quantifying peaks in individual spectra, often prove problematic and unsuccessful when the data is acquired in vivo. A different approach is reported in which pattern recognition techniques were used to classify successfully a set of 75 spectra (according to the subject’s dietary group) without the need for identifying or measuring the peaks. A discrete wavelet transform was performed on each spectrum and combinations of the first 64 wavelet coefficients were used as the features for classification.
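The feature extraction step can be illustrated with the simplest discrete wavelet, the Haar transform. This is a generic sketch under the assumptions of a power-of-two signal length and the Haar basis; the chapter does not state which wavelet family was actually used.

```python
import math

def haar_step(signal):
    """One Haar DWT level: scaled pairwise sums (approximation)
    and pairwise differences (detail). Length must be even."""
    s = 1.0 / math.sqrt(2.0)
    approx = [(a + b) * s for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) * s for a, b in zip(signal[0::2], signal[1::2])]
    return approx, detail

def wavelet_features(signal, n_coeffs=64):
    """Fully decompose the signal by repeatedly transforming the
    approximation, then return the first n_coeffs coefficients
    (coarsest first) as classification features."""
    approx, coeffs = list(signal), []
    while len(approx) > 1:
        approx, detail = haar_step(approx)
        coeffs = detail + coeffs
    return (approx + coeffs)[:n_coeffs]

# A flat "spectrum" concentrates all energy in the leading coefficient,
# which is why low-order coefficients capture broad spectral shape.
feats = wavelet_features([1.0, 1.0, 1.0, 1.0])
```

Truncating to the leading coefficients discards fine detail (including much of the peak structure), which is consistent with classifying spectra without measuring individual peaks.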
ieee international workshop on system on chip for real time applications | 2002
Tim Owen; Julian Rathke; Ian Wakeman; Des Watson
In an environment where devices and appliances with computational power are connected together, controlling the behaviour of programs that run in this network becomes important. Furthermore, the management of multiple programs executing on many devices needs to be kept under control, to ensure the safety and robustness of the wider system. We propose a programming language approach to handling this complexity, controlling the behaviour of programs using policy specifications.