Publication


Featured research published by Andrew Blyth.


Computers & Security | 2016

A review of cyber security risk assessment methods for SCADA systems

Yulia Cherdantseva; Peter Burnap; Andrew Blyth; Peter Eden; Kevin Jones; Hugh Soulsby; Kristan Stoddart

This paper reviews the state of the art in cyber security risk assessment of Supervisory Control and Data Acquisition (SCADA) systems. We select and examine in detail twenty-four risk assessment methods developed for, or applied in the context of, a SCADA system. We describe the essence of each method and then analyse the methods in terms of aim; application domain; the stages of risk management addressed; key risk management concepts covered; impact measurement; sources of probabilistic data; evaluation; and tool support. Based on this analysis, we suggest an intuitive scheme for categorising cyber security risk assessment methods for SCADA systems. We also outline five research challenges facing the domain and point out approaches that might be taken.


Operating Systems Review | 2008

Acquiring volatile operating system data: tools and techniques

Iain Sutherland; Jon Evans; Theodore Tryfonas; Andrew Blyth

The current approach to forensic examination during search and seizure has predominantly been to pull the plug on the suspect machine and subsequently perform a post-mortem examination on the storage medium. However, with the advent of larger memory capacities, drive encryption and anti-forensics, this procedure may result in the loss of valuable evidence. Volatile data may be vital in determining criminal activity: it may contain passwords used for encryption, indications of anti-forensic techniques, and memory-resident malware which would otherwise go unnoticed by the investigator. This paper emphasizes the importance of understanding the potential value of volatile data and how best to collate forensic artifacts to the benefit of the investigation, ensuring the preservation and integrity of the evidence. The paper reviews current methods for volatile data collection, assessing the capabilities, limitations and liabilities of the tools and techniques available to the forensic investigator.
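
As a hedged illustration of the kind of live-response step the paper argues for (not one of the specific tools it reviews), the following Python sketch snapshots a few categories of volatile state to a JSON file before the machine is powered down. It assumes the third-party psutil package is available; any such collector itself alters system state, which a real investigation would have to document.

    # Illustrative live-response sketch (assumes the third-party psutil
    # package is installed); NOT one of the tools reviewed in the paper.
    # Note: net_connections() may require elevated privileges, and running
    # any collector changes system state, so a real investigation would
    # hash the output and log exactly what was executed.
    import json
    import time
    import psutil

    snapshot = {
        "collected_at": time.time(),
        "boot_time": psutil.boot_time(),
        "users": [u._asdict() for u in psutil.users()],
        "processes": [p.info for p in psutil.process_iter(["pid", "name", "username", "exe"])],
        "connections": [c._asdict() for c in psutil.net_connections(kind="inet")],
    }

    with open("volatile_snapshot.json", "w") as out:
        json.dump(snapshot, out, default=str, indent=2)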


Computers & Security | 2006

An empirical examination of the reverse engineering process for binary files

Iain Sutherland; George E. Kalb; Andrew Blyth; Gaius Mulley

Reverse engineering of binary code files has become increasingly easy to perform. Binary reverse engineering and the subsequent software exploitation activities represent a significant threat to the intellectual property content of commercially supplied software products. Protection technologies integrated within software products offer a viable means of deterring the software exploitation threat. However, the absence of metrics, measures, and models to characterize the software exploitation process prevents quantitative assessment of the extent of protection technology suitable for a particular software product. This paper examines a framework for collecting reverse engineering measurements, the execution of a reverse engineering experiment, and the analysis of the findings to determine the primary factors that affect the software exploitation process. The results of this research form a foundation for the specification of metrics, the gathering of additional measurements, and the development of predictive models to characterize the software exploitation process.


Journal of Computer Security | 2004

Cost effective management frameworks for intrusion detection systems

Charles Iheagwara; Andrew Blyth; Mukesh Singhal

This paper discusses the financial benefit of intrusion detection system (IDS) deployment techniques and addresses the problem of bridging the gap between technical security solutions and the business need for them. This is an area of interest to both the research and the business community: most IDSes balance host and network monitoring, but the decision about how to adjust the use of each technique tends to be made in a rather ad-hoc way, or based on detection effectiveness alone without regard to the cost of the technique. In practice, selections based on how well a strategy helps a company to perform are preferable, and methodologies supporting a selection process of this type will help an Information Technology officer explain security mechanism selections more effectively to CEOs. In this context, the approach we propose could be applied when choosing one intrusion detection system over another based on which has the higher return on investment for the company. Through a case study, we illustrate the benefits of better IDS management that leads to a positive Return on Investment (ROI) for IDS deployment. We develop strategies and approaches to support effective decision-making about which techniques are appropriate for the cost-effective management of an IDS in a given environment. It is our intent that this research will serve as a foundation for the formal description of cost structures, analysis, and selection of effective implementation approaches to support the management of IDS deployments.
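
The abstract does not reproduce the paper's cost model, but a back-of-the-envelope sketch of the kind of ROI comparison it describes might look like the following, using the textbook annualised-loss-expectancy (ALE) form. The formula and all figures here are illustrative assumptions, not values from the case study.

    # Illustrative ROI comparison of two IDS options; the ALE-based
    # formula and all figures are assumptions, not the paper's model.
    def ale(single_loss_expectancy, annual_rate_of_occurrence):
        """Annualised loss expectancy: expected loss per year."""
        return single_loss_expectancy * annual_rate_of_occurrence

    def roi(ale_without_ids, ale_with_ids, annual_ids_cost):
        """Risk reduction net of cost, per unit of cost spent."""
        return (ale_without_ids - ale_with_ids - annual_ids_cost) / annual_ids_cost

    baseline = ale(50_000, 4.0)  # exposure with no IDS deployed
    host_based = roi(baseline, ale(50_000, 1.5), 40_000)     # 2.12
    network_based = roi(baseline, ale(50_000, 2.0), 25_000)  # 3.00
    print(f"host-based ROI: {host_based:.2f}, network-based ROI: {network_based:.2f}")

On these invented numbers the cheaper network-based option wins despite detecting less, which is exactly the kind of business-driven trade-off the paper argues should be made explicit.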


Computer Networks | 2002

Evaluation of the performance of ID systems in a switched and distributed environment: the RealSecure case study

Charles Iheagwara; Andrew Blyth

With the phenomenal increase in unwarranted Internet traffic entering corporate networks, the development and effective use of currently available intrusion detection (ID) systems has acquired great importance. Compounding this are the constantly evolving techniques used by professional hackers to defeat any and every countermeasure designed to stem, or at least contain, their acts. In this paper, we present the results of tests conducted to assess the effectiveness of an intrusion detection system in a switched and distributed network environment. The results reveal that the performance of ID systems is a function of various factors, including network topology, deployment technique, throughput, bandwidth and network traffic conditions. Within the limits of our studies, the findings can be summarized as follows: 1. The detection capability of the ID system diminishes as bandwidth utilization increases, with the obvious implication that better performance could be achieved with multiple sensors. 2. Deployment at network or domain entry points (i.e. outside a decoy) improves performance results by up to 11%. 3. Deployment with packet-loss-limiting devices produces better results than deployment with the port mirroring technique, by up to 27%.
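
The percentage figures above come from a detection-rate metric of roughly this shape; a minimal sketch, with invented numbers chosen only to mirror the reported ~11% gap, of how two deployment techniques might be compared:

    # Detection capability as alerts matched / attacks injected; the
    # counts are invented for illustration, not the study's data.
    def detection_rate(attacks_detected, attacks_injected):
        return attacks_detected / attacks_injected

    entry_point = detection_rate(89, 100)  # sensor at the network entry point
    behind_decoy = detection_rate(78, 100) # sensor deployed behind the decoy
    gap = (entry_point - behind_decoy) * 100
    print(f"entry-point deployment detects {gap:.0f} percentage points more attacks")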


Information & Software Technology | 2004

Cost effective management frameworks: the impact of IDS deployment technique on threat mitigation

Charles Iheagwara; Andrew Blyth; Timm Kevin; David Kinn

In this paper we measure the financial benefit of an intrusion detection system (IDS) deployment. To this end, we use a standard risk analysis framework and extend it by introducing the Cascading Threat Multiplier (CTM). The idea behind the CTM is that a security compromise incurs two types of cost: (a) the direct cost of lost integrity, confidentiality or availability, and (b) the indirect cost of the compromised component serving as a potential stepping stone for future attacks. The CTM captures the second type of cost, which is typically ignored in the classic risk analysis framework. We propose new risk analysis formulas that tie the CTM concept into an accurate calculation of Return on Investment (ROI), otherwise known as Return on Security Investment. Finally, through a case study we demonstrate the effect of IDS deployment techniques on threat mitigation and on the ROI. The results of the case study can be used to support effective decision-making about which techniques are appropriate for the cost-effective management of an IDS in a given environment.
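
A minimal sketch of the cascading idea, assuming a simple multiplicative form (the paper's actual formulas may differ): the indirect, stepping-stone exposure is folded into the loss expectancy before any ROI is computed.

    # Sketch of a Cascading Threat Multiplier folded into ALE; the
    # multiplicative form is an assumption, not the paper's exact formula.
    def ale_with_ctm(single_loss, annual_rate, ctm):
        """Direct loss plus stepping-stone exposure, as (1 + ctm) * direct ALE."""
        direct = single_loss * annual_rate
        return direct * (1.0 + ctm)

    # A web server whose compromise also exposes an internal database
    # (ctm=0.8) is costlier than its direct loss alone would suggest.
    exposure = ale_with_ctm(single_loss=30_000, annual_rate=2.0, ctm=0.8)
    print(f"ALE with cascading exposure: {exposure:,.0f}")  # 108,000 vs 60,000 direct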


Journal in Computer Virology | 2011

Malware and steganography in hard disk firmware

Iain Sutherland; Gareth E. Davies; Andrew Blyth

The hard disk drive remains the most commonly used form of storage media in both commercial and domestic computer systems. These drives can contain a vast range of data of both personal value and commercial significance. This paper focuses on two key areas: the potential for drive operation to be impacted by malicious software, and the possibility of the drive firmware being manipulated to enable a form of steganography. Hard drive firmware is required for the correct operation of the disk drive, in particular for dealing with errors arising from natural wear as the drive ages. Where an area of the drive becomes unreliable through wear and tear, the firmware, which monitors the reliability of data access, copies the data from the failing area to a specially designated reserved area. The firmware then remaps accesses so that the old area, and the original copy of the data, are no longer reachable by the computer operating system. A small number of commercially available devices, intended for data recovery, can be used to modify the hard drive firmware components. This functionality can be used to conceal code on the disk drive, either as a form of steganography or to hide malicious code intended to infect or damage software or possibly system hardware. This paper discusses the potential problems created by firmware manipulated for malicious purposes.


Information Security Technical Report | 2003

An XML-based architecture to perform data integration and data unification in vulnerability assessments

Andrew Blyth

One of the problems facing penetration testers is that a test can generate vast quantities of information that must be stored, analysed and cross-referenced for later use. This paper presents an architecture based on encoding that information within an XML document. We also demonstrate how, through application of the architecture, large quantities of security-related information can be captured within a single database schema. The database can then be used to verify that systems conform to an organisation's network security policy.
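
To make the idea concrete, here is a hedged sketch (the element names are invented for illustration, not the paper's schema) of how one penetration-test finding encoded as XML might be parsed and flattened into a relational table using only Python's standard library:

    # Hypothetical XML record for one finding; element names are invented
    # and do not reproduce the paper's actual schema.
    import sqlite3
    import xml.etree.ElementTree as ET

    record = """
    <finding host="192.0.2.10">
      <service port="22" protocol="tcp" name="ssh"/>
      <vulnerability id="CVE-2002-0640" severity="high"/>
    </finding>
    """

    root = ET.fromstring(record)
    row = (
        root.get("host"),
        root.find("service").get("port"),
        root.find("vulnerability").get("id"),
        root.find("vulnerability").get("severity"),
    )

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE findings (host TEXT, port TEXT, cve TEXT, severity TEXT)")
    db.execute("INSERT INTO findings VALUES (?, ?, ?, ?)", row)
    print(db.execute("SELECT * FROM findings").fetchall())

Once findings from every tool land in one schema like this, the cross-referencing and policy-conformance queries the abstract mentions become ordinary SQL.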


Information Security Technical Report | 2003

Measuring vulnerabilities and their exploitation cycle

Evangelos Morakis; Stylianos Vidalis; Andrew Blyth

In a world ruled by chaotic causality, Heisenberg's uncertainty principle is only a natural limitation. Analysts have only their personal logic, experience and intuition to depend on when making judgments about the safety of a system. However, today's analysts are bombarded with large amounts of data from all kinds of security-related products, such as vulnerability scanners, anti-virus tools and firewalls, causing information overload and data congestion. Thus the question remains: how can analysts make a correct judgment about the vulnerabilities from which a system suffers, especially when the ammunition they possess cannot deal with such a complex, ever-changing environment? To this end, we believe that structuring knowledge about a specific domain in an object-oriented hierarchy tree, and providing a formal model with which to reason about and construct possible attack scenarios, will give an analyst the necessary ammunition.
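
A minimal sketch of the object-oriented structuring the authors advocate, with invented class names rather than the paper's formal model: vulnerabilities specialise a common base class, and an attack scenario is composed from instances of them.

    # Sketch of an object-oriented vulnerability hierarchy and a composed
    # attack scenario; names are illustrative, not the paper's model.
    from dataclasses import dataclass, field

    @dataclass
    class Vulnerability:
        name: str
        severity: int  # e.g. 1 (low) to 5 (critical)

    @dataclass
    class BufferOverflow(Vulnerability):
        service: str = "unknown"

    @dataclass
    class WeakPassword(Vulnerability):
        account: str = "unknown"

    @dataclass
    class AttackScenario:
        """An ordered chain of vulnerabilities an attacker could exploit."""
        steps: list = field(default_factory=list)

        def risk(self):
            # Crude aggregate: the scenario's risk is driven by its worst step.
            return max(v.severity for v in self.steps)

    scenario = AttackScenario(steps=[
        WeakPassword("guessable admin password", 3, account="admin"),
        BufferOverflow("unpatched ftpd overflow", 5, service="ftp"),
    ])
    print(scenario.risk())  # 5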


ACM Siggroup Bulletin | 1997

Business process re-engineering: What is it?

Andrew Blyth

Welcome to what I hope will be the first of many columns on Business Process Re-Engineering. The role of this column is to stimulate discussion, provoke debate, inform people, and provide a medium through which people can communicate with one another. What I plan to do in the coming issues is offer people the opportunity to comment on what I and others have written. So, if you have a comment or opinion that you would like to share with others, or a review or critique of a book or of someone else's work, then please send it to me.

Collaboration


Dive into Andrew Blyth's collaborations.

Top Co-Authors

Iain Sutherland
University of South Wales

Konstantinos Xynos
University of New South Wales

Peter Eden
University of New South Wales

Huw Read
University of New South Wales