Richard A. Frost
University of Windsor
Publications
Featured research published by Richard A. Frost.
IEEE Communications Surveys and Tutorials | 2012
Shushan Zhao; Akshai Aggarwal; Richard A. Frost; Xiaole Bai
Security in mobile ad-hoc networks (MANETs) continues to attract attention after years of research. Recent advances in identity-based cryptography (IBC) shed light on this problem, and IBC has become popular as a basis for solutions. We present a comprehensive picture and capture the state of the art of IBC security applications in MANETs, based on a survey of publications on this topic since the emergence of IBC in 2001. In this paper, we also share insights into open research problems and point out interesting future directions in this area.
The Computer Journal | 1982
Richard A. Frost
Any database, no matter how complex, can be represented as a set of binary relationships. Consequently, a structure which can store such binary relationships is logically sufficient as the storage mechanism for a general-purpose database management system. For certain applications, the advantages of using such a structure would appear to outweigh the disadvantages. Surprisingly, however, very few systems have been built which use a binary-relational storage structure. The main reason would appear to be the difficulty of implementing such structures efficiently.
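The idea can be sketched by modelling a binary-relational store as a set of (subject, relation, object) triples; the names below are illustrative assumptions, not taken from the paper:

```haskell
import qualified Data.Set as Set

-- Every fact is a binary relationship: (subject, relation, object).
type Triple = (String, String, String)
type Store  = Set.Set Triple

-- Add one fact to the store.
insert :: Triple -> Store -> Store
insert = Set.insert

-- Query: all objects related to a given subject by a given relation.
objectsOf :: String -> String -> Store -> [String]
objectsOf subj rel db =
  [ o | (s, r, o) <- Set.toList db, s == subj, r == rel ]
```

A conventional record such as employee(name, dept) flattens into one triple per attribute, which is why such a structure is logically sufficient for arbitrary databases.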
Practical Aspects of Declarative Languages | 2008
Richard A. Frost; Rahmatullah Hafiz; Paul Callaghan
Parser combinators are higher-order functions used to build parsers as executable specifications of grammars. Some existing implementations can handle only limited ambiguity, some have exponential time and/or space complexity for ambiguous input, and most cannot accommodate left-recursive grammars. This paper describes combinators, implemented in Haskell, which overcome all of these limitations.
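The basic approach can be illustrated with the classical "list of successes" style of combinator parsing. This is a minimal sketch, not the paper's actual combinators (which additionally support memoization and left recursion):

```haskell
-- A parser maps input to every (result, remaining-input) pair,
-- so ambiguity is represented by a multi-element result list.
type Parser a = String -> [(a, String)]

-- Recognize a single given character.
term :: Char -> Parser Char
term c (x:xs) | x == c = [(c, xs)]
term _ _               = []

-- Alternation: try both parsers, collect all successes.
orelse :: Parser a -> Parser a -> Parser a
(p `orelse` q) inp = p inp ++ q inp

-- Sequencing: run q on each remainder left by p.
thenP :: Parser a -> Parser b -> Parser (a, b)
(p `thenP` q) inp = [ ((a, b), rest') | (a, rest)  <- p inp
                                      , (b, rest') <- q rest ]
```

Because a parser returns every success, an ambiguous phrase simply yields more than one element; for instance, a grammar with two derivations of the same string produces two entries in the result list.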
The Computer Journal | 1992
Richard A. Frost
Attribute grammars provide a formal yet intuitive notation for specifying the static semantics of programming languages and consequently have been used in various compiler-generation systems. Their use, however, need not be limited to this. With a little change in perspective, many programs may be regarded as interpreters and constructed as executable attribute grammars. The major advantage is that the resulting modular declarative structure facilitates various aspects of the software development process.
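The idea of a program as an executable attribute grammar can be sketched by decorating a production with a synthesized attribute, here the value it computes, so that the "parser" is itself an interpreter. This is an illustrative sketch under assumed names, not the paper's notation:

```haskell
import Data.Char (isDigit, digitToInt)

-- An attributed recognizer: each success carries the synthesized
-- attribute value together with the unconsumed input.
type Attr a = String -> [(a, String)]

-- digit.val = the numeric value of the digit recognized.
digit :: Attr Int
digit (c:cs) | isDigit c = [(digitToInt c, cs)]
digit _                  = []

-- A literal character, carrying no attribute of interest.
lit :: Char -> Attr ()
lit c (x:xs) | x == c = [((), xs)]
lit _ _               = []

-- expr ::= digit '+' digit   { expr.val = digit1.val + digit2.val }
expr :: Attr Int
expr inp = [ (v1 + v2, r3) | (v1, r1) <- digit inp
                           , (_,  r2) <- lit '+' r1
                           , (v2, r3) <- digit r2 ]
```

Evaluating the attribute alongside recognition is what turns the grammar into an interpreter while keeping the modular, BNF-like structure.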
International Workshop/Conference on Parsing Technologies | 2007
Richard A. Frost; Rahmatullah Hafiz; Paul Callaghan
In functional and logic programming, parsers can be built as modular executable specifications of grammars, using parser combinators and definite clause grammars respectively. These techniques are based on top-down backtracking search. Commonly used implementations are inefficient for ambiguous languages, cannot accommodate left-recursive grammars, and require exponential space to represent parse trees for highly ambiguous input. Memoization is known to improve efficiency, and work by other researchers has had some success in accommodating left recursion. This paper combines aspects of previous approaches and presents a method by which parsers can be built as modular and efficient executable specifications of ambiguous grammars containing unconstrained left recursion.
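One way to make top-down search terminate on left recursion, loosely following the curtailment idea in this line of work, is to count how often a rule is applied at the same input position and fail once that count exceeds what the remaining input could justify. The sketch below is illustrative only (the names and exact bound are assumptions, and it omits the memoization the paper combines with curtailment):

```haskell
import qualified Data.Map.Strict as Map

type Pos    = Int
type Counts = Map.Map (String, Pos) Int  -- rule applications per position
type Rec    = String -> Counts -> Pos -> [Pos]

-- Curtail a rule: fail once it has been applied at the same position
-- more often than the remaining input could justify.
rule :: String -> Rec -> Rec
rule name r inp counts j
  | c > length inp - j + 1 = []
  | otherwise = r inp (Map.insert (name, j) (c + 1) counts) j
  where c = Map.findWithDefault 0 (name, j) counts

term :: Char -> Rec
term ch inp _ j
  | j < length inp && inp !! j == ch = [j + 1]
  | otherwise                        = []

(<+>) :: Rec -> Rec -> Rec                 -- alternation
(p <+> q) inp counts j = p inp counts j ++ q inp counts j

(>>>) :: Rec -> Rec -> Rec                 -- sequencing
(p >>> q) inp counts j = [ k | i <- p inp counts j, k <- q inp counts i ]

-- s ::= s 'a' | 'a'   (directly left recursive)
s :: Rec
s = rule "s" ((s >>> term 'a') <+> term 'a')
```

On the input "aaa" the recognizer returns the end positions 3, 2 and 1, i.e. all three prefixes, despite the direct left recursion; without memoization, though, such a sketch still redoes work that the paper's method shares.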
ACM Computing Surveys | 2006
Richard A. Frost
The construction of natural language interfaces to computers continues to be a major challenge. The need for such interfaces is growing now that speech recognition technology is becoming more readily available, and people cannot speak those computer-oriented formal languages that are frequently used to interact with computer applications. Much of the research related to the design and implementation of natural language interfaces has involved the use of high-level declarative programming languages. This is to be expected as the task is extremely difficult, involving syntactic and semantic analysis of potentially ambiguous input. The use of LISP and Prolog in this area is well documented. However, research involving the relatively new lazy functional programming paradigm is less well known. This paper provides a comprehensive survey of that research.
SIGPLAN Notices | 2006
Richard A. Frost; Rahmatullah Hafiz
Top-down backtracking language processors are highly modular, can handle ambiguity, and are easy to implement with clear and maintainable code. However, a widely-held, and incorrect, view is that top-down processors are inherently exponential for ambiguous grammars and cannot accommodate left-recursive productions. It has been known for many years that exponential complexity can be avoided by memoization, and that left-recursive productions can be accommodated through a variety of techniques. However, until now, memoization and techniques for handling left recursion have either been presented independently, or else attempts at their integration have compromised modularity and clarity of the code.
Science of Computer Programming | 1996
Richard A. Frost; Barbara Szydlowski
Language processors may be implemented directly as functions. In a programming language that supports higher-order functions, large processors can be built by combining smaller components using higher-order functions corresponding to alternation and sequencing in the BNF notation of the grammar of the language to be processed. If the higher-order functions are defined to implement a top-down backtracking parsing strategy, the processors are modular and, owing to the fact that they resemble BNF notation, are easy to understand and modify. A major disadvantage of this approach is that the resulting processors have exponential time and space complexity in the worst case, owing to the reapplication of processors during backtracking. This paper shows how a technique called memoization can be used to improve the efficiency of such processors whilst preserving their modularity. We show that memoized functional recognizers constructed for arbitrary non-left-recursive grammars have O(n³) complexity, where n is the length of the input to be processed. The paper also shows how the initial processors could have been memoized using a monadic approach, and discusses the advantages of reengineering the definitions in this way.
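A minimal sketch of a memoized functional recognizer, with assumed names rather than the paper's code: a recognizer maps a start position to all end positions, and a memo table keyed by (recognizer name, position) is threaded through every call so each body runs at most once per position.

```haskell
import qualified Data.Map.Strict as Map

type Pos   = Int
type Table = Map.Map (String, Pos) [Pos]
type Rec   = String -> Table -> Pos -> (Table, [Pos])

-- Reuse a stored result if present; otherwise compute and store it.
memo :: String -> Rec -> Rec
memo name r inp tbl j =
  case Map.lookup (name, j) tbl of
    Just ends -> (tbl, ends)
    Nothing   -> let (tbl', ends) = r inp tbl j
                 in (Map.insert (name, j) ends tbl', ends)

term :: Char -> Rec
term ch inp tbl j
  | j < length inp && inp !! j == ch = (tbl, [j + 1])
  | otherwise                        = (tbl, [])

(<+>) :: Rec -> Rec -> Rec                 -- alternation
(p <+> q) inp tbl j =
  let (t1, r1) = p inp tbl j
      (t2, r2) = q inp t1 j
  in (t2, r1 ++ r2)

(>>>) :: Rec -> Rec -> Rec                 -- sequencing
(p >>> q) inp tbl j =
  let (t1, ends)      = p inp tbl j
      step (t, acc) e = let (t', r) = q inp t e in (t', acc ++ r)
  in foldl step (t1, []) ends

-- s ::= 'a' s | 'a'   (non-left-recursive)
s :: Rec
s = memo "s" ((term 'a' >>> s) <+> term 'a')
```

Note that `memo` only stores a result after the body finishes, so a left-recursive call at the same position would still loop; this is why the O(n³) result above is stated for non-left-recursive grammars.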
Canadian Conference on Artificial Intelligence | 2003
Richard A. Frost
Memoization is a well-known method which makes use of a table of previously computed results in order to ensure that parts of a search (or computation) space are not revisited. A new technique is presented which enables the systematic and selective memoization of a wide range of algorithms. The technique overcomes disadvantages of previous approaches. In particular, the proposed technique can help programmers avoid mistakes that can result in sub-optimal use of memoization. In addition, the resulting memoized programs are amenable to analysis using equational reasoning. It is anticipated that further work will lead to proof of correctness of the proposed memoization technique.
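The basic mechanism can be sketched with purely functional memoization by explicit table threading; the table of previously computed results is passed in and returned, so each argument's result is computed at most once. This is an illustrative sketch, not the paper's systematic technique:

```haskell
import qualified Data.Map.Strict as Map

-- Memoized Fibonacci: the table maps arguments to stored results
-- and is threaded through the computation.
fibMemo :: Int -> Map.Map Int Integer -> (Integer, Map.Map Int Integer)
fibMemo n tbl
  | n < 2     = (toInteger n, tbl)
  | otherwise =
      case Map.lookup n tbl of
        Just v  -> (v, tbl)                       -- result already computed
        Nothing -> let (a, t1) = fibMemo (n - 1) tbl
                       (b, t2) = fibMemo (n - 2) t1
                       v       = a + b
                   in (v, Map.insert n v t2)
```

Because the whole computation remains pure, such definitions stay amenable to the kind of equational reasoning the abstract mentions.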
Canadian Conference on Artificial Intelligence | 2001
Rabih Neouchi; Ahmed Y. Tawfik; Richard A. Frost
This article presents a method for analyzing the evolution of concepts represented by concept lattices in a time-stamped database, showing how the concepts that evolve with time induce a change in the concept lattice. The purpose of this work is to extend formal concept analysis to handle temporal properties and represent temporally evolving attributes.