Andrew P. Lenaghan
Kingston University
Publications
Featured research published by Andrew P. Lenaghan.
intelligence and security informatics | 2007
Cyril Onwubiko; Andrew P. Lenaghan
The difficulty of managing security threats and vulnerabilities for small and medium-sized enterprises (SMEs) is investigated. A detailed security conceptual framework, together with asset and threat classifications, is proposed. These models assist SMEs in preventing and effectively mitigating threats and vulnerabilities in their assets. The conceptual framework models security issues in terms of owners, vulnerabilities, threat agents, threats, countermeasures, risks and assets, and their relationships, while the asset classification model is value-based and the threat classification model is based on the attack timeline.
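As a rough illustration of how the entities of such a conceptual framework might be related, the Python sketch below defines a few of them and a toy exposure score. The class and field names (Asset, Threat, Risk, exposure) are hypothetical illustrations, not the paper's actual schema.

```python
# Minimal sketch (not the paper's actual schema): illustrative dataclasses for the
# kinds of entities the conceptual framework relates -- owners, assets, threats,
# threat agents, vulnerabilities, countermeasures and risks. All names are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Asset:
    name: str
    value: float                      # value-based asset classification
    owner: str

@dataclass
class Threat:
    name: str
    phase: str                        # attack-timeline classification, e.g. "attack"
    agent: str

@dataclass
class Risk:
    asset: Asset
    threat: Threat
    vulnerabilities: List[str] = field(default_factory=list)
    countermeasures: List[str] = field(default_factory=list)

    def exposure(self) -> float:
        """Toy exposure score: asset value scaled by unmitigated vulnerabilities."""
        unmitigated = max(len(self.vulnerabilities) - len(self.countermeasures), 0)
        return self.asset.value * unmitigated

server = Asset("payroll-server", value=9.0, owner="Finance")
worm = Threat("network worm", phase="attack", agent="external")
risk = Risk(server, worm, vulnerabilities=["unpatched OS"])
print(risk.exposure())
```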
conference on computer as a tool | 2005
Cyril Onwubiko; Andrew P. Lenaghan; Luke Hebbes
An enhancement to existing epidemiological worm models is proposed and used to simulate the spread of aggressive worms within computer networks. The proposed model captures worm propagation dynamics as five state transitions in a finite state machine model. The results obtained from the simulation are used to compare the dependability of previous worm quarantine models.
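As a rough illustration of such a state-transition worm model, the Python sketch below steps a population of hosts through five assumed states (susceptible, exposed, infectious, quarantined, removed). The state names and rate constants are illustrative assumptions, not the paper's exact model.

```python
# Toy discrete-time worm propagation model with five assumed states:
# S (susceptible), E (exposed), I (infectious), Q (quarantined), R (removed).
def step(pop, beta=0.5, sigma=0.4, q=0.2, gamma=0.1):
    """One time step; pop maps state -> fraction of hosts."""
    new_exposed = beta * pop["S"] * pop["I"]      # S -> E (contact with infectious hosts)
    new_infectious = sigma * pop["E"]             # E -> I (worm activates)
    new_quarantined = q * pop["I"]                # I -> Q (detection and isolation)
    new_removed = gamma * pop["Q"]                # Q -> R (patched / cleaned)
    return {
        "S": pop["S"] - new_exposed,
        "E": pop["E"] + new_exposed - new_infectious,
        "I": pop["I"] + new_infectious - new_quarantined,
        "Q": pop["Q"] + new_quarantined - new_removed,
        "R": pop["R"] + new_removed,
    }

pop = {"S": 0.99, "E": 0.0, "I": 0.01, "Q": 0.0, "R": 0.0}
for _ in range(50):
    pop = step(pop)
print({state: round(fraction, 3) for state, fraction in pop.items()})
```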
international conference on document analysis and recognition | 2003
Andrew P. Lenaghan; Ron Malyan
Architectures for integrated and distributed handwriting recognition systems are discussed. An XML (eXtensible Markup Language) based representation for online handwriting data, referred to as XPEN, is proposed. XPEN is based on the earlier UNIPEN format. The flexibility of the new format is illustrated with an example of the use of XSLT (XSL Transformations) to translate XPEN into the Scalable Vector Graphics (SVG) format for visualisation, and the processing of XPEN using a programming language via the Document Object Model (DOM) Application Programming Interface.
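A minimal Python sketch of the kind of DOM-based processing and SVG translation the abstract describes is given below. The XPEN element and attribute names used here are hypothetical, since the actual schema is defined in the paper.

```python
# Minimal sketch of processing an XML ink format through the DOM, in the spirit of
# the XPEN example. The element and attribute names below are hypothetical.
from xml.dom.minidom import parseString

xpen_doc = """<xpen>
  <stroke>
    <point x="10" y="12"/>
    <point x="14" y="18"/>
  </stroke>
</xpen>"""

dom = parseString(xpen_doc)
points = [
    (int(p.getAttribute("x")), int(p.getAttribute("y")))
    for p in dom.getElementsByTagName("point")
]

# Emit a simple SVG polyline from the stroke points (an XSLT stylesheet could do
# the same transformation declaratively, as the paper describes).
svg = '<svg xmlns="http://www.w3.org/2000/svg"><polyline fill="none" stroke="black" points="{}"/></svg>'.format(
    " ".join(f"{x},{y}" for x, y in points)
)
print(svg)
```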
conference on computer as a tool | 2005
Luke Hebbes; Ron Malyan; Andrew P. Lenaghan
This paper proposes a scheme for introducing genetic algorithms (GA) into the turbo code structure to enable the systematic data to be discarded at the encoder and reconstructed at the decoder. The scheme enables code rates of 1/2 to be achieved without puncturing the parity data. The paper also shows that the speed of convergence of GAs, when implemented in the proposed structure, can be used to reduce the computational overhead involved in the turbo decoder.
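To illustrate the general idea of reconstructing discarded systematic bits with a genetic algorithm, the toy Python sketch below evolves candidate bit vectors against a received parity stream. The "encoder" (a running XOR), the fitness function and the GA parameters are assumptions for illustration only, not the paper's turbo code construction.

```python
# Toy GA that searches for a data vector whose parity matches the received parity.
import random

random.seed(0)
TRUE_DATA = [1, 0, 1, 1, 0, 0, 1, 0]

def parity(bits):
    """Toy 'parity stream': running XOR prefix, standing in for a convolutional encoder."""
    out, acc = [], 0
    for b in bits:
        acc ^= b
        out.append(acc)
    return out

RECEIVED_PARITY = parity(TRUE_DATA)          # only the parity is transmitted

def fitness(candidate):
    return sum(p == r for p, r in zip(parity(candidate), RECEIVED_PARITY))

def evolve(pop_size=30, generations=40, n=len(TRUE_DATA)):
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n)
            child = a[:cut] + b[cut:]            # single-point crossover
            if random.random() < 0.1:            # mutation
                child[random.randrange(n)] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best), "of", len(TRUE_DATA))
```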
intelligence and security informatics | 2006
Cyril Onwubiko; Andrew P. Lenaghan
A security defence framework is proposed that offers capabilities for distributed sensing of security threats, centralised analysis and coordinated response. The centralised analysis component of the framework uses graph and evolutionary computing techniques to analyse distributed threats perceived in the network.
Joint IST Workshop on Mobile Future and the Symposium on Trends in Communications (SympoTIC '06) | 2006
Cyril Onwubiko; Andrew P. Lenaghan; Luke Hebbes
A distributed security assistance framework is proposed that unifies security mechanisms to provide enterprise-wide security defence for computer network systems. The proposed security framework offers capabilities for distributed threat detection, integrated analysis and coordinated response via security spaces. It also offers extension mechanisms to coordinate human countermeasures in protecting networks. The framework is underpinned by a new security paradigm: sensor, analysis and response.
Archive | 2006
Cyril Onwubiko; Andrew P. Lenaghan; Luke Hebbes; Ron Malyan
A graph-based data structure is proposed for security information management (SIM) systems to analyse security event data from varying security sources. The proposed relational information graph representation is used to model attack graphs from a simulation-based network environment, representing security classes with security events and attributes as graph nodes and temporal relationships as graph edges. An efficient pattern matching (isomorphism) technique is then used to analyse security attack graphs by matching them against known security patterns in a database of attack pattern graphs. The graph matching technique decomposes the graph data structure into paths and filters the paths to reduce the search space by discarding graphs that do not match.
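The path-decomposition and filtering step can be illustrated with a small Python sketch: an observed attack graph is decomposed into paths, which are then filtered against a database of known pattern graphs. The event names and the simple two-node path granularity are illustrative assumptions; the paper's representation and isomorphism test are richer.

```python
# Toy path decomposition and pattern filtering over an adjacency-dict attack graph.
def paths(graph, max_len=3):
    """Enumerate simple paths of up to max_len edges."""
    result = []
    def walk(node, path):
        if len(path) > 1:
            result.append(tuple(path))
        if len(path) - 1 == max_len:
            return
        for nxt in graph.get(node, []):
            if nxt not in path:
                walk(nxt, path + [nxt])
    for start in graph:
        walk(start, [start])
    return result

observed = {"port_scan": ["login_fail"], "login_fail": ["priv_escalation"]}
known_patterns = {
    "brute_force": {("port_scan", "login_fail"), ("login_fail", "priv_escalation")},
}

observed_edges = {p for p in paths(observed) if len(p) == 2}
for name, pattern in known_patterns.items():
    # Filtering step: discard patterns whose paths are not all present.
    if pattern <= observed_edges:
        print("matched attack pattern:", name)
```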
international conference on information and communication security | 2004
Luke Hebbes; Andrew P. Lenaghan
Content security requires authenticity, provided by integrity checks, authentication and non-repudiation, which can be achieved using digital signatures. This paper presents a new semi-fragile steganographic technique for embedding digital signatures in images, achieved using a novel modified Bit-Plane Complexity Segmentation (BPCS) based steganography scheme. Semi-fragile means the embedded signature survives limited processing, which is achieved by utilising convolutional coding, a Forward Error Correcting (FEC) channel coding technique, in the embedding.
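The bit-plane complexity idea at the heart of BPCS can be sketched in Python: a block's least significant bit-plane is scored for "noisiness" and payload bits are embedded only in sufficiently complex blocks. The 8x8 block size and the 0.3 complexity threshold are common BPCS choices but are assumptions here, and the sketch omits the convolutional FEC protection the paper applies to the payload.

```python
# Minimal BPCS-style sketch: complexity measure on a bit-plane block, then LSB-plane embedding.
import numpy as np

def complexity(block):
    """Fraction of adjacent bit changes (horizontal + vertical) in a binary block."""
    changes = np.sum(block[:, 1:] != block[:, :-1]) + np.sum(block[1:, :] != block[:-1, :])
    max_changes = 2 * block.shape[0] * (block.shape[1] - 1)
    return changes / max_changes

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)   # one 8x8 cover block
plane0 = (image & 1).astype(np.uint8)                       # least significant bit-plane

c = complexity(plane0)
if c > 0.3:                                                 # assumed BPCS-style threshold
    payload = rng.integers(0, 2, size=plane0.shape, dtype=np.uint8)  # e.g. FEC-coded signature bits
    stego = (image & ~np.uint8(1)) | payload                # replace the LSB plane with payload
    print(f"block complexity {c:.2f}: embedded {payload.size} payload bits")
else:
    print(f"block complexity {c:.2f}: block skipped")
```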
Electronic Imaging: Science and Technology | 1996
Andrew P. Lenaghan; Ron Malyan
Both cognitive processes and artificial recognition systems may be characterized by the forms of representation they build and manipulate. This paper looks at how handwriting is represented in current recognition systems and at the psychological evidence for its representation in the cognitive processes responsible for reading. Empirical psychological work on feature extraction in early visual processing is surveyed to show that a sound psychological basis for feature extraction exists and to describe the features this approach leads to. The first stage of the development of a handwriting recognition architecture strongly influenced by the psychological evidence for the cognitive processes and representations used in early visual processing is reported. This architecture builds a number of parallel low-level feature maps from raw data. These feature maps are thresholded, and a region labeling algorithm is used to generate sets of features. Fuzzy logic is used to quantify the uncertainty in the presence of individual features.
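The pipeline the abstract outlines (feature map, thresholding, region labeling, fuzzy membership) can be sketched in a few lines of Python with NumPy and SciPy. The particular feature map (a crude gradient) and the membership function below are illustrative assumptions, not the paper's feature set.

```python
# Sketch of the described pipeline: feature map -> threshold -> region labeling -> fuzzy membership.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
ink = (rng.random((32, 32)) > 0.7).astype(float)        # stand-in for rasterised pen data

feature_map = np.abs(np.diff(ink, axis=1, prepend=0))   # crude gradient feature map
binary = feature_map > 0.5                               # thresholding
labels, n_regions = ndimage.label(binary)                # region labeling

for region in range(1, n_regions + 1):
    size = int(np.sum(labels == region))
    membership = min(size / 10.0, 1.0)                   # toy fuzzy membership in [0, 1]
    if membership > 0.5:
        print(f"feature {region}: size={size}, membership={membership:.2f}")
```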
International Journal of Electronic Security and Digital Forensics | 2009
Cyril Onwubiko; Andrew P. Lenaghan