Publications


Featured research published by Stephen Murrell.


Decision Support Systems | 1997

A survey of tools for the validation and verification of knowledge-based systems: 1985–1995

Stephen Murrell; Robert Plant

This paper presents the findings of a survey of software tools built to assist in the verification and validation of knowledge-based systems. The tools were identified from literature sources from the period 1985–1995. The tool builders were contacted and asked to complete and return a survey that identified which testing and analysis techniques were utilised and covered by their tool. From these survey results it is possible to identify trends in tool development, technique coverage and areas for future research.


International Journal of Human-Computer Studies / International Journal of Man-Machine Studies | 1996

On the validation and verification of production systems: a graph reduction approach

Stephen Murrell; Robert Plant

This paper takes a parallel-processing approach to the implementation of rule-based systems using a graph-reduction architecture, and investigates the consequences of this architecture for the validation and verification of knowledge-based systems. The traditional sequential approaches to knowledge-based system development, and the limited validation and verification techniques applicable to them, are contrasted with a graph-reduction implementation of knowledge-based systems based on an ALICE-like machine. The advantages of this style of programming for systems development and program correctness are discussed, and the paper shows that significant benefits could potentially be achieved through the use of graph-reduction techniques in the development of these systems.
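The kind of analysis the paper has in view is easiest to picture on a purely declarative rule representation. The sketch below is an illustration only, not the authors' system: the Haskell encoding, the rule and fact names, and the two checks (unreachable rules and subsumed, redundant rules) are all assumptions of mine, and merely suggest how a side-effect-free, reduction-friendly encoding lets such verification conditions be stated directly over the rule base.

    -- Illustrative sketch (not the paper's system): propositional rules as data,
    -- plus two simple static checks of the kind V&V tools automate.
    import Data.List (nub, (\\))

    data Rule = Rule { name :: String, conds :: [String], concls :: [String] }

    -- Facts that can ever hold: declared inputs plus any rule conclusion.
    derivable :: [String] -> [Rule] -> [String]
    derivable inputs rules = nub (inputs ++ concatMap concls rules)

    -- A rule is unreachable if one of its conditions can never be established.
    unreachable :: [String] -> [Rule] -> [String]
    unreachable inputs rules = [ name r | r <- rules, not (null (conds r \\ known)) ]
      where known = derivable inputs rules

    -- Pairs (a, b) where a's conditions are a subset of b's and the conclusions
    -- agree: rule b adds nothing beyond rule a.
    redundant :: [Rule] -> [(String, String)]
    redundant rules = [ (name a, name b) | a <- rules, b <- rules, name a /= name b
                      , null (conds a \\ conds b), concls a == concls b ]

    main :: IO ()
    main = do
      let rules  = [ Rule "r1" ["misfire", "rough-idle"] ["check-plugs"]
                   , Rule "r2" ["misfire"]               ["check-plugs"]
                   , Rule "r3" ["warp-drive"]            ["impossible"] ]
          inputs = ["misfire", "rough-idle"]
      print (unreachable inputs rules)   -- ["r3"]
      print (redundant rules)            -- [("r2","r1")]  (r2 makes r1 redundant)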


Journal of Systems and Software | 1995

Formal semantics for rule-based systems

Stephen Murrell; Robert Plant

The existence of a formal description of any language is a prerequisite to any rigorous method of proof, validation, or verification. This article develops a formal semantics in the denotational style for rule-based production systems. First, a formal description of a generalized, hypothetical language for rule-based expert systems is developed as a base. Building on this base, it is possible to specify in detail, and in a usable way, the variations that distinguish individual real-world systems.
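The denotational idea is easy to sketch in a functional language. The Haskell fragment below is an illustration under simplifying assumptions of my own (propositional facts, a single conclusion per rule), not the semantics defined in the article: a rule denotes a function on working memory, and a program denotes the least fixed point of one forward pass composed from its rules.

    -- Toy denotational reading (illustration only): working memory is a set of
    -- facts, [[rule]] is a function on working memory, and [[program]] is the
    -- least fixed point of one composed forward pass.
    import qualified Data.Set as S

    type Fact = String
    type WM   = S.Set Fact          -- working memory
    type Den  = WM -> WM            -- denotation of a rule or a program

    data Rule = Rule { conds :: [Fact], concl :: Fact }

    -- [[rule]]: add the conclusion when every condition is present.
    ruleD :: Rule -> Den
    ruleD (Rule cs c) wm
      | all (`S.member` wm) cs = S.insert c wm
      | otherwise              = wm

    -- [[program]]: compose the rule denotations and iterate to a fixed point.
    progD :: [Rule] -> Den
    progD rules = fixpoint (foldr (.) id (map ruleD rules))
      where fixpoint f wm = let wm' = f wm
                            in if wm' == wm then wm else fixpoint f wm'

    main :: IO ()
    main = print (progD [Rule ["a"] "b", Rule ["b"] "c"] (S.fromList ["a"]))
    -- fromList ["a","b","c"]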


International Conference on Service Systems and Service Management | 2008

Electronic identification, personal privacy and security in the services sector

Stephen Murrell; Norman G. Einspruch

The continued proliferation of services in the economy depends on making co-production secure with respect to the privacy and security of customer data and information. The increasing prevalence of electronic means of identification, from magnetic stripe cards through touch cards, GSM SIM cards, passive and active radio frequency identification (RFID) tags, to contact and contactless smart cards, has created a great increase in the efficiency of some security and consumer commercial operations, and an at least equal increase in convenience for both the public and the administrators of these systems. However, these technologies are being used in ever more sensitive applications, and give rise to serious concerns over personal safety and privacy, identity theft, commercial security, and even, to some extent, national security.


International Conference on Service Systems and Service Management | 2008

Enabling the enabler: Information technology and the services sector

Stephen Murrell; Daniel Berg; Norman G. Einspruch

The services sector, providing eighty percent of domestic employment and GDP, has sustained dramatic growth in size and significance in the last three decades. This growth has in major part been enabled by application of the Information Technology (IT) discipline. Data Surface Mining (DSM) was applied to a recent compilation of 100 major IT companies. It was found that approximately half of these companies, the enablers, are themselves in the goods sector. As such, the goods sector is a major enabler of the services sector.


Archive | 2007

C, C++, C# to Cracking

Robert Plant; Stephen Murrell

Foundation concept: Programming language.

Definition: Standard, general-purpose programming languages.

Overview: The evolution of programming languages is a short but very complex story. By the mid-1970s the design of mainstream programming languages had branched in two directions: the general-purpose programming languages and the systems programming languages. General-purpose languages were strongly oriented towards supporting reliable software development; the structure of the languages corresponded closely to the structure of a good top-down design, providing programmers with a built-in framework for safe and verified construction; they were able to provide useful warnings about some of the most common things that could go wrong in a program's design before the program is ever run, and they made many of the most troublesome programmer errors impossible. Unfortunately, what was good for general software development was not always good for low-level, or systems, programming. Implementors of operating systems, compilers, and hardware-monitoring software sometimes have to perform exactly the kind of low-level, machine-oriented operations that are so strongly discouraged by the standards of good software development, and programming languages that enforce those standards become almost unusable for such purposes. As a result, the systems programming languages evolved as a parallel branch. The essential feature of a systems programming language is that a programmer should be able to give any commands that the computer is capable of obeying, and not have to fight against built-in safety features when those commands do not satisfy some safety standard.


Archive | 2007

An Executive's Guide to Information Technology: Global positioning system to Hypertext, HTML

Robert Plant; Stephen Murrell

Definition: Also known as Navstar GPS; an electronic device that can rapidly and accurately determine its own geographical location by receiving signals from a network of orbiting satellites.

Overview: There are a few dozen special-purpose satellites, launched and maintained by the US military, in constant Earth orbit. They contain extremely accurate clocks, programmed knowledge of their own orbits (constantly corrected by signals from ground stations), and high-frequency radio transmitters. These satellites transmit a constant stream of data describing the exact time and their position, which can easily be picked up by terrestrial receivers. A GPS receiver listens to the stream of data from a number of these satellites and, using relatively simple geometry, can calculate its own exact position (longitude, latitude, and elevation) to an accuracy of just a few feet. These receivers can be made very small (around one square inch) and moderately cheap (tens of dollars). Their primary use is as an aid to navigation (in aeroplanes, in smart guided missiles, and in hand-held units for use by people), but they may also be embedded into larger systems. By comparing a sequence of positions, a GPS unit can also determine its own speed and direction of movement, and the relevant portion of a digitized map can be automatically displayed.
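As a gloss on the "relatively simple geometry" (a standard textbook formulation, not spelled out in the chapter itself): each satellite i broadcasts its position (x_i, y_i, z_i) and the time of transmission, the receiver measures the pseudorange rho_i = c (t_receive - t_i), and it solves

    \rho_i = \sqrt{(x - x_i)^2 + (y - y_i)^2 + (z - z_i)^2} + c\,b, \qquad i = 1, \dots, 4

for its own coordinates (x, y, z) and its clock bias b. Four satellites give four equations in these four unknowns, which is why an inexpensive receiver needs no atomic clock of its own.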


Knowledge-Based Systems | 1995

Graph reduction implementation of a production system

Stephen Murrell; Robert Plant

The paper explores the implementation of rule-based, pattern-directed inference systems on parallel computers, and discusses one such approach in detail: the use of a graph-reduction machine such as ALICE. The technique is illustrated through two example domains: automobile fault diagnosis and organic psychiatric mental disorders. The paper discusses extensions to the graph-reduction technique as applied to knowledge-based systems, including partitioning, time considerations and input data types. The paper shows that the graph-reduction technique has significant advantages for knowledge-based system implementation over conventional approaches, and it demonstrates that this programming style is well suited to knowledge engineering domains.
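A single match-fire cycle of such a system can be expressed as a set of independent, pure reductions; because no rule match has side effects, a reduction machine such as ALICE is free to evaluate the matches concurrently. The Haskell sketch below is only an illustration of that idea, not the implementation described in the paper: the fault-diagnosis facts and rules are invented, and the potential parallelism is left implicit in an ordinary traversal.

    -- Sketch only: a forward-chaining cycle written as independent pure
    -- reductions.  Each call of `fire` depends only on the current fact set,
    -- so a graph-reduction machine could evaluate the matches in parallel.
    import Data.List (nub)

    type Fact = String
    data Rule = Rule { conds :: [Fact], concls :: [Fact] }

    -- Match one rule against the facts; yield its conclusions if it fires.
    fire :: [Fact] -> Rule -> [Fact]
    fire facts (Rule cs as)
      | all (`elem` facts) cs = as
      | otherwise             = []

    -- One cycle: union the conclusions of every rule that matched.
    cycleOnce :: [Rule] -> [Fact] -> [Fact]
    cycleOnce rules facts = nub (facts ++ concatMap (fire facts) rules)

    -- Run cycles until no new facts appear.
    run :: [Rule] -> [Fact] -> [Fact]
    run rules facts
      | facts' == facts = facts
      | otherwise       = run rules facts'
      where facts' = cycleOnce rules facts

    main :: IO ()
    main = print (run carRules ["engine-cranks", "no-start"])
      where carRules = [ Rule ["engine-cranks", "no-start"] ["suspect-fuel", "suspect-spark"]
                       , Rule ["suspect-spark"]             ["check-plugs"] ]
    -- ["engine-cranks","no-start","suspect-fuel","suspect-spark","check-plugs"]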


ACM Southeast Regional Conference | 1990

Experiments in natural language processing

Stephen Murrell; Robert Plant

Abstract of work in progress: Recent developments in functional programming techniques have made the field of natural language processing more amenable to experimentation. Conventional programming languages, and the programming methods normally associated with them, which due to their familiarity seem to be satisfactory for the production of natural language applications, do in fact restrict creativity. Their cumbersome syntax and the inflexibility of their control structures and order of execution turn the act of programming, which should be nothing more than the transcription of organised thoughts, into a titanic task. Frequently, coding is harder work than design. Functional programming languages [1] are normally very compact, and have little or no redundancy in their syntax. While this could be dangerous for critical systems, it can speed up prototyping by an order of magnitude. Functional languages may also have a different execution strategy, known as lazy evaluation [2], under which programs take on a non-algorithmic nature, which also simplifies the design process. Experiments have shown that a powerful form of parallel processing may be incorporated into a functional system [3]. The ability to perform computations concurrently, without being concerned with inter-process communication or scheduling, greatly simplifies many tasks that are normally complex (e.g., resolution in logic programming, and searching data structures). Natural language processing is a very complex domain, which has never been conquered. One of the reasons for this (apart, of course, from man's incomplete knowledge of his languages) has been our inability to perform creative experiments at a reasonable rate. Functional programming systems alleviate this inability. It is the aim of our research to explore in detail the applicability of new and existing functional programming techniques to the areas of parsing and understanding complex grammars, and processing semantic networks, in the context of a natural language front end to an intelligent data retrieval system. We have already found that these techniques make a significant contribution to development in some of the major areas. As an example, we can implement the pure reasoning features of Prolog in a 20-line program, which took about one hour to produce and debug. This implementation is very easy to read, and thus makes experimentation with the reasoning methods fast and convenient. Our current investigations are centered on the production of a simple unification of the syntactic and semantic processing involved in the understanding of natural language input. Preliminary results from experiments with a non-backtracking parser for ambiguous grammars indicate that positive results may be expected, and that the techniques of functional programming may successfully be applied to practical, non-trivial …
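The closing remark about a non-backtracking parser for ambiguous grammars is easy to make concrete in a lazy functional language such as Haskell. The sketch below is mine, not the authors' code: it uses the classic list-of-successes style, in which a parser returns every admissible analysis of the input, so an ambiguous grammar simply yields several complete parses and no explicit backtracking machinery is needed; lazy evaluation produces the alternatives only as they are demanded.

    -- List-of-successes parsing sketch (illustration only): a parser maps an
    -- input string to every (result, remaining input) pair it can produce.
    type Parser a = String -> [(a, String)]

    symbol :: Char -> Parser Char
    symbol c (x:xs) | x == c = [(c, xs)]
    symbol _ _               = []

    orElse :: Parser a -> Parser a -> Parser a        -- both alternatives survive
    orElse p q inp = p inp ++ q inp

    andThen :: Parser a -> Parser b -> Parser (a, b)  -- sequence two parsers
    andThen p q inp = [ ((a, b), rest') | (a, rest) <- p inp, (b, rest') <- q rest ]

    pmap :: (a -> b) -> Parser a -> Parser b          -- reshape a parse result
    pmap f p inp = [ (f a, rest) | (a, rest) <- p inp ]

    -- Deliberately ambiguous grammar:  S -> A A,   A -> "a" | "aa"
    aP :: Parser String
    aP = pmap (: []) (symbol 'a')
         `orElse` pmap (\(x, y) -> [x, y]) (symbol 'a' `andThen` symbol 'a')

    sP :: Parser (String, String)
    sP = aP `andThen` aP

    main :: IO ()
    main = print [ r | (r, "") <- sP "aaa" ]   -- both readings of "aaa"
    -- [("a","aa"),("aa","a")]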


Archive | 2007

An Executive's Guide to Information Technology: Principles, Business Models, and Terminology

Robert Plant; Stephen Murrell

Collaboration


Dive into Stephen Murrell's collaborations.

Top Co-Authors

Daniel Berg

Rensselaer Polytechnic Institute
