Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where David R. Barstow is active.

Publication


Featured research published by David R. Barstow.


Artificial Intelligence | 1979

An experiment in knowledge-based automatic programming

David R. Barstow

Human programmers seem to know a lot about programming. This suggests a way to try to build automatic programming systems: encode this knowledge in some machine-usable form. In order to test the viability of this approach, knowledge about elementary symbolic programming has been codified into a set of about four hundred detailed rules, and a system, called PECOS, has been built for applying these rules to the task of implementing abstract algorithms. The implementation techniques covered by the rules include the representation of mappings as tables, sets of pairs, property list markings, and inverted mappings, as well as several techniques for enumerating the elements of a collection. The generality of the rules is suggested by the variety of domains in which PECOS has successfully implemented abstract algorithms, including simple symbolic programming, sorting, graph theory, and even simple number theory. In each case, PECOS's knowledge of different techniques enabled the construction of several alternative implementations. In addition, the rules can be used to explain such programming tricks as the use of property list markings to perform an intersection of two linked lists in linear time. Extrapolating from PECOS's knowledge-based approach and from three other approaches to automatic programming (deductive, transformational, high-level language), the future of automatic programming seems to involve a changing role for deduction and a range of positions on the generality-power spectrum.
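The "property list marking" trick the abstract mentions can be sketched as follows. This is an illustrative reconstruction, not PECOS's output: a Python `set` of values stands in for Lisp property-list marks, and the `Node` class is an invented minimal linked list.

```python
class Node:
    """Minimal singly linked list node."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def from_list(values):
    """Build a linked list from a Python list (helper for the example)."""
    head = None
    for v in reversed(values):
        head = Node(v, head)
    return head

def intersect(a, b):
    """Values present in both lists, in O(len(a) + len(b)) time."""
    marked = set()             # plays the role of property-list markings
    node = a
    while node is not None:    # pass 1: mark every element of list a
        marked.add(node.value)
        node = node.next
    result = []
    node = b
    while node is not None:    # pass 2: collect marked elements of list b
        if node.value in marked:
            result.append(node.value)
        node = node.next
    return result

print(intersect(from_list([1, 2, 3, 4]), from_list([3, 4, 5])))  # [3, 4]
```

The naive nested-loop intersection is quadratic; marking one list first makes each membership test constant time, which is exactly the kind of optimization the rules can explain.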


international conference on software engineering | 1987

Artificial intelligence and software engineering

David R. Barstow

Software Engineering is a knowledge-intensive activity, requiring extensive knowledge of the application domain and of the target software itself. Many Software Engineering costs can be attributed to the ineffectiveness of current techniques for managing this knowledge, and Artificial Intelligence techniques can help alleviate this situation. More than two decades of research have led to many significant theoretical results, but few demonstrations of practical utility. This is due in part to the amount and diversity of knowledge required by Software Engineering activities, and in part to the fact that much of the research has been narrowly focused, missing many issues that are of great practical importance. Important issues that remain to be addressed include the representation and use of domain knowledge and the representation of the design and implementation history of a software system. If solutions to these issues are found, and experiments in practical situations are successful, the implications for the practice of Software Engineering will be profound, and radically different software development paradigms will become possible.


IEEE Transactions on Software Engineering | 1981

The Refinement Paradigm: The Interaction of Coding and Efficiency Knowledge in Program Synthesis

Elaine Kant; David R. Barstow

A refinement paradigm for implementing a high-level specification in a low-level target language is discussed. In this paradigm, coding and analysis knowledge work together to produce an efficient program in the target language. Since there are many possible implementations for a given specification of a program, searching knowledge is applied to increase the efficiency of the process of finding a good implementation. For example, analysis knowledge is applied to determine upper and lower cost bounds on alternate implementations, and these bounds are used to measure the potential impact of different design decisions and to decide which alternatives should be pursued. In this paper we also describe a particular implementation of this program synthesis paradigm, called PSI/SYN, that has automatically implemented a number of programs in the domain of symbolic processing.
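The use of cost bounds to prune alternatives can be illustrated with a small sketch. This is not PSI/SYN itself; the alternative names and costs are invented, and only the pruning rule (discard any alternative whose lower bound exceeds some other alternative's upper bound) follows the abstract.

```python
def choose_implementations(alternatives):
    """Each alternative is (name, lower_bound, upper_bound) on cost.
    An alternative whose lower bound exceeds another alternative's
    upper bound can never be the cheapest, so it is discarded."""
    best_upper = min(ub for _, _, ub in alternatives)
    return [(name, lb, ub) for name, lb, ub in alternatives
            if lb <= best_upper]

alts = [
    ("hash-table mapping",   10, 20),
    ("sorted-pairs mapping", 15, 40),
    ("linear-scan mapping",  25, 90),  # lower bound 25 > best upper 20
]
print(choose_implementations(alts))  # keeps only the first two
```

In the full paradigm this test would be applied repeatedly during refinement, so whole subtrees of the implementation search space are never explored.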


Acta Informatica | 1980

Remarks on "A Synthesis of Several Sorting Algorithms" by John Darlington

David R. Barstow

Summary: In his paper "A Synthesis of Several Sorting Algorithms," John Darlington presents syntheses for six different sorting algorithms, together with a family tree of sorting algorithms, and mentions a symmetry between Quick Sort, Selection Sort, Merge Sort, and Insertion Sort. In our own attempts to codify programming knowledge, we have developed a slightly different family tree which shows similar symmetries, and which also shows that Bubble Sort and Sinking Sort can be viewed as in-place versions of Selection Sort and Insertion Sort, thus adding another symmetry to those noted by Darlington.
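The symmetry claimed for Bubble Sort and Selection Sort can be sketched concretely. In this minimal illustration (not from the paper), both algorithms select the minimum of the unsorted suffix on each pass; selection sort moves it with one exchange, while bubble sort moves it by adjacent swaps, the "in-place" variant of the same selection.

```python
def selection_sort(xs):
    xs = list(xs)
    for i in range(len(xs)):
        m = min(range(i, len(xs)), key=xs.__getitem__)  # select the minimum
        xs[i], xs[m] = xs[m], xs[i]                     # one exchange
    return xs

def bubble_sort(xs):
    xs = list(xs)
    for i in range(len(xs)):
        for j in range(len(xs) - 1, i, -1):             # bubble minimum leftward
            if xs[j] < xs[j - 1]:
                xs[j - 1], xs[j] = xs[j], xs[j - 1]     # by adjacent swaps
    return xs

data = [5, 2, 4, 1, 3]
print(selection_sort(data))  # [1, 2, 3, 4, 5]
print(bubble_sort(data))     # [1, 2, 3, 4, 5]
```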


Sigplan Notices | 1983

Who needs languages, and why do they need them? or no matter how high the level, it's still programming

Stephen W. Smoliar; David R. Barstow

Increased research interest in the software development process is threatening to crowd out the concerns of the end user. Computer science provides an abundance of tools, including specification languages, design languages, special-purpose programming languages, and even wide spectrum languages capable of accommodating the goals of all the preceding languages in a single, unified package. Unfortunately, as computer scientists become more involved with the software development process, the role of the end user tends to diminish. Throwing languages at a problem domain, either in greater numbers or in greater flexibility, does not necessarily address the needs of the party who wanted the software in the first place. The problem is that, however noble the intentions of language designers may be, the end user will ultimately confront situations in which the major obstacle is mastery of the language, rather than difficulties in the problem domain. As an alternative, we propose that more attention be paid to the environment in which software development takes place than to the languages in which the stages of development are expressed. This talk will discuss environmental facilities which enhance a user's syntactic and semantic understanding of his software tools.


international conference on software engineering | 1988

Automatic programming for streams II: transformational implementation

David R. Barstow

For Part I, see IJCAI, pp. 232-237, Los Angeles, USA (August 1985). ΦNIX is an automatic programming system for writing programs which interact with external devices through temporally-ordered streams of values. Abstract specifications are stated in terms of constraints on the values of input and output streams. The target language is the Stream Machine, a language which includes concurrently executing processes communicating and synchronizing through streams. ΦNIX produces programs by repeatedly transforming abstract specifications through successively more concrete forms until concrete Stream Machine programs are produced. An example which ΦNIX has successfully implemented involves three major steps: transforming the specification into an applicative expression, transforming the applicative expression into three imperative processes, and merging the processes into a single process. Each major step involves several other transformation steps that reformulate and simplify intermediate expressions.
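The stream idea can be sketched with a hedged analogy, assuming Python generators as a stand-in for Stream Machine processes (this is not ΦNIX or the actual Stream Machine; the running-average spec is invented for illustration). A specification constrains output values in terms of input values; a single merged process realizes it.

```python
def smooth(inputs):
    """Single merged process: consume an input stream, emit a stream
    where each output is the mean of all inputs seen so far -- the
    kind of constraint between input and output streams a
    specification might state."""
    total, count = 0.0, 0
    for x in inputs:          # one loop standing in for merged processes
        total += x
        count += 1
        yield total / count   # produce the next output stream value

print(list(smooth([2, 4, 6])))  # [2.0, 3.0, 4.0]
```

A generator pipeline also mirrors the merging step: separate accumulate/count/divide stages could be written as three generators and then fused, by hand here, by transformation in ΦNIX, into the single loop above.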


Intelligence/SIGART Bulletin | 1977

A knowledge base organization for rules about programming

David R. Barstow

PECOS is a knowledge-based system for automatic program synthesis. Programs are specified as abstract algorithms in a high-level language for symbolic computation. Through the successive application of programming rules, the specification is gradually refined into a concrete implementation in the target language. The existence of several rules for the same task permits the construction of a variety of distinct programs from a single initial specification. Internally, program descriptions are represented as collections of nodes, each labeled with a programming concept and with other properties related to that concept. The refinement process is guided by the selection and application of rules about programming. These rules are stated as condition-action pairs, but the identification of certain rule types permits the use of various techniques for efficient rule retrieval and testing, including the determination of retrieval patterns and the automatic separation of the condition into an applicability pattern and a binding pattern.
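A condition-action rule with the condition split into an applicability pattern and a binding pattern can be sketched as follows. This is an invented illustration, not PECOS's internal format: the rule name, node fields, and refinement are all hypothetical.

```python
# One rule: refine an abstract "collection" node into a "linked-list" node.
RULES = [
    {
        "name": "collection-as-linked-list",
        # cheap applicability pattern, usable for fast rule retrieval
        "applicability": lambda node: node["concept"] == "collection",
        # binding pattern: extract only the parts the action needs
        "binding": lambda node: {"elems": node["elements"]},
        # action: rewrite the node into a more concrete form
        "action": lambda b: {"concept": "linked-list",
                             "elements": b["elems"]},
    },
]

def refine(node):
    """Apply the first applicable rule to a program-description node."""
    for rule in RULES:
        if rule["applicability"](node):       # test applicability pattern
            bindings = rule["binding"](node)  # then bind
            return rule["action"](bindings)   # then act
    return node                               # no rule applied

node = {"concept": "collection", "elements": [1, 2, 3]}
print(refine(node))  # {'concept': 'linked-list', 'elements': [1, 2, 3]}
```

The point of the split is that the applicability test can be indexed and checked cheaply across many rules before any binding work is done.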


international conference on software engineering | 1999

Baseball seasons and dog years

David R. Barstow

From 1995 through 1997, Instant Sports used the Internet to provide interactive real-time coverage of Major League Baseball. The changes in Instant Sports' core architecture during that time provide some lessons about architectural evolution in the context of rapidly changing technology, including the need to identify fundamental issues rather than trendy ones, the importance of a good domain model, and the role of domain characteristics in balancing computational and communication resources.


AI Magazine | 1984

Artificial Intelligence at Schlumberger

David R. Barstow

Schlumberger is a large, multinational corporation concerned primarily with the measurement, collection, and interpretation of data. For the past fifty years, most of these activities have been related to hydrocarbon exploration. The efficient location and production of hydrocarbons from an underground formation requires a great deal of knowledge about the formation, ranging in scale from the size and shape of the rock's pore spaces to the size and shape of the entire reservoir. Schlumberger provides its clients with two types of information: measurements, called logs, of the petrophysical properties of the rock around the borehole, such as its electrical, acoustical, and radioactive characteristics; and interpretations of these logs in terms of geophysical properties such as porosity and mineral composition. Since log interpretation is an expert skill, the emergence of expert systems technology prompted Schlumberger's initial interest in Artificial Intelligence. Our first full-scale attempt at a commercial-quality expert system was the Dipmeter Advisor. Following these initial efforts, Schlumberger has expanded its Artificial Intelligence activities, and is now engaged in both basic and applied research in a wide variety of areas.


AI Magazine | 1984

A Perspective on Automatic Programming

David R. Barstow

Collaboration


Dive into David R. Barstow's collaborations.

Top Co-Authors

Elaine Kant

Carnegie Mellon University

Howard E. Shrobe

Massachusetts Institute of Technology

Koji Torii

Nara Institute of Science and Technology