David R. Musser
Rensselaer Polytechnic Institute
Publications
Featured research published by David R. Musser.
Communications of The ACM | 1978
John V. Guttag; Ellis Horowitz; David R. Musser
A data abstraction can be naturally specified using algebraic axioms. The virtue of these axioms is that they permit a representation-independent formal specification of a data type. An example is given which shows how to employ algebraic axioms at successive levels of implementation. The major thrust of the paper is twofold. First, it is shown how the use of algebraic axiomatizations can simplify the process of proving the correctness of an implementation of an abstract data type. Second, semi-automatic tools are described which can be used both to automate such proofs of correctness and to derive an immediate implementation from the axioms. This implementation allows for limited testing of programs at design time, before a conventional implementation is accomplished.
IEEE Transactions on Software Engineering | 1980
David R. Musser
This paper describes the data type definition facilities of the AFFIRM system for program specification and verification. Following an overview of the system, we review the rewrite rule concepts that form the theoretical basis for its data type facilities. The main emphasis is on methods of ensuring convergence (finite and unique termination) of sets of rewrite rules and on the relation of this property to the equational and inductive proof theories of data types.
Artificial Intelligence | 1987
Deepak Kapur; David R. Musser
Advances of the past decade in methods and computer programs for showing consistency of proof systems based on first-order equations have made it feasible, in some settings, to use proof by consistency as an alternative to conventional rules of inference. Musser described the method applied to proof of properties of inductively defined objects. Refinements of this inductionless induction method were discussed by Kapur, Goguen, Huet and Hullot, Huet and Oppen, Lankford, Dershowitz, Paul, and more recently by Jouannaud and Kounalis as well as by Kapur, Narendran and Zhang. This paper gives a very general account of proof by consistency and inductionless induction, and shows how previous results can be derived simply from the general theory. New results include a theorem giving characterizations of an unambiguity property that is key to the applicability of proof by consistency, and a theorem similar to Birkhoff's completeness theorem for equational proof systems, but concerning inductive proof.
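As a schematic illustration of proof by consistency (a textbook example, not one taken from the paper), consider the natural numbers with constructors 0 and s and the usual definition of addition:

```latex
\begin{align*}
E:\quad & 0 + y = y, \qquad s(x) + y = s(x + y) \\
C:\quad & x + 0 = x \quad \text{(conjecture)}
\end{align*}
```

Running completion on $E \cup \{C\}$ produces only critical pairs, such as $s(x+0) = s(x)$, that reduce to instances of equations already present; no equation between distinct constructor terms (e.g., $0 = s(t)$) is ever derived. Since the enlarged system is consistent, $C$ holds in the initial model of $E$, i.e., inductively, with no explicit induction rule invoked.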
Software - Practice and Experience | 1994
David R. Musser; Alexander A. Stepanov
We outline an approach to construction of software libraries in which generic algorithms (algorithmic abstractions) play a more central role than in conventional software library technology or in the object-oriented programming paradigm. Our approach is to consider algorithms first, decide what types and access operations they need for efficient execution, and regard the types and operations as formal parameters that can be instantiated in many different ways, as long as the actual parameters satisfy the assumptions on which the correctness and efficiency of the algorithms are based. The means by which instantiation is carried out is language dependent; in the C++ examples in this paper, we instantiate generic algorithms by constructing classes that define the needed types and access operations. By use of such compile-time techniques and careful attention to algorithmic issues, it is possible to construct software components of broad utility with no sacrifice of efficiency.
Journal of Symbolic Computation | 1988
Deepak Kapur; David R. Musser; Paliath Narendran
The Knuth-Bendix test for local confluence of a term rewriting system involves generating superpositions of the left-hand sides, and for each superposition deriving a critical pair of terms and checking whether these terms reduce to the same term. We prove that certain superpositions, which are called composite because they can be split into other superpositions, do not have to be subjected to the critical-pair-joinability test; it suffices to consider only prime superpositions. As a corollary, this result settles a conjecture of Lankford that unblocked superpositions can be omitted. To prove the result, we introduce new concepts and proof techniques which appear useful for other proofs relating to the Church-Rosser property. This test has been implemented in the completion procedures for ordinary term rewriting systems as well as term rewriting systems with associative-commutative operators. Performance of the completion procedures with and without the test is compared on a number of examples in the Rewrite Rule Laboratory (RRL) being developed at the General Electric Research and Development Center.
SIAM Journal on Computing | 1983
John V. Guttag; Deepak Kapur; David R. Musser
In mechanical theorem proving, particularly in proving properties of algebraically specified data types, we frequently need a decision procedure for the theory of a given finite set of equations (axioms). A general approach to this problem is to try to derive from the axioms a set of rewrite rules that are "canonical," i.e., they rewrite to a canonical form all terms that are equal (according to the axioms and the equivalence and substitution properties of equality). Rewrite rules are canonical if and only if they determine a relation that is both confluent and uniformly terminating. The difficulty of proving uniform termination has been the major drawback of the rewrite rule approach to deciding equations. A new method of proving uniform termination is proposed. Assuming that the rewriting relation is globally finite (for any term there are only finitely many terms to which it can be rewritten), nontermination can occur only if there are cycles. Uniform termination is proved by showing that no cycles can occur.
Proceedings of the 1987 annual ACM SIGAda international conference on Ada | 1987
David R. Musser; Alexander A. Stepanov
It is well-known that data abstractions are crucial to good software engineering practice. We argue that algorithmic abstractions, or generic algorithms, are perhaps even more important for software reusability. Generic algorithms are parameterized procedural schemata that are completely independent of the underlying data representation and are derived from concrete, efficient algorithms. We discuss this notion with illustrations from the structure of an Ada library of reusable software components we are presently developing.
formal methods | 1981
Deepak Kapur; David R. Musser; Alexander A. Stepanov
The abstract construct defines new classes of objects and is roughly the inverse of the refine construct. The instantiate, refine, and inform constructs serve to include more knowledge about a class of objects, while the represent construct is used to represent a class of objects in terms of another class. The instantiate construct records the information that a class of objects can be refined to another class; or, stated another way, a class of objects can be abstracted to another class. The latter could be obtained from the former by reversing the arguments to instantiate. The use of these constructs is illustrated on structures (see below), a class of objects definable in Tecton, in [5]. (Except that abstract was not discussed and refine was called enrich in that paper.) In the next section, we will give examples of some of these constructs; their use is also illustrated in the discussion of the communication network example. We discuss four different types of objects in Tecton which we have found useful in describing different kinds of activities of a complex software system: structures, entities, events, and environments. Some of these types of objects have appeared previously in
international conference on functional programming | 1981
Deepak Kapur; David R. Musser; Alexander A. Stepanov
Operators in functional languages such as APL and FFP are a useful programming concept. However, this concept cannot be fully exploited in these languages because of certain constraints. It is proposed that an operator should be associated with a structure having the algebraic properties on which the operator's behavior depends. This is illustrated by introducing a language that provides mechanisms for defining structures and operators on them. Using this language, it is possible to describe algorithms abstractly, thus emphasizing the algebraic properties on which the algorithms depend. The role that formal representation of mathematical knowledge can play in the development of programs is illustrated through an example. An approach for associating complexity measures with a structure and its operators is also suggested. This approach is useful in analyzing the complexity of algorithms in an abstract setting.
compiler construction | 2001
Sibylle Schupp; Douglas P. Gregor; David R. Musser; Shin-Ming Liu
For abstract data types (ADTs) there are many potential optimizations of code that current compilers are unable to perform. These optimizations either depend on the functional specification of the computational task performed through an ADT or on the semantics of the objects defined. In either case the abstract properties on which optimizations would have to be based cannot be automatically inferred by the compiler. In this paper our aim is to address this level-of-abstraction barrier by showing how a compiler can be organized so that it can make use of semantic information about an ADT at its natural abstract level, before type lowering, inlining, or other traditional compiler steps obliterate the chance. We present an extended case study of one component of a C++ compiler, the simplifier; discuss the design decisions of a new simplifier (simplifier generator) and its implementation in C++; and give performance measurements. The new simplifier is connected to the GNU C++ compiler and currently performs optimizations at a very high level in the front end. When tested with the Matrix Template Library, a library already highly fine-tuned by hand, we achieved run-time improvements of up to six percent.