Publication


Featured research published by Allen Goldberg.


Computer and Communications Security | 1998

A specification of Java loading and bytecode verification

Allen Goldberg

This paper gives a mathematical specification of the Java Virtual Machine (JVM) bytecode verifier. The specification is an axiomatic description of the verifier that makes precise subtle aspects of the JVM semantics and the verifier. We focus on the use of data flow analysis to verify type-correctness and the use of typing contexts to ensure global type consistency in the context of an arbitrary strategy for dynamic class loading. The specification types interfaces with sufficient accuracy to eliminate run-time type checks. Our approach is to specify a generic dataflow architecture and formalize the JVM verifier as an instance of this architecture. The emphasis in this paper is on readability of the specification and mathematical clarity. The specification given is consistent with the description in Lindholm and Yellin's The Java™ Virtual Machine Specification, but it is less committed to certain implementation choices than Sun's version 1.1 implementation. In particular, the specification does not commit an implementation to any loading strategy, and detects all type errors as early as possible.
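The dataflow approach the abstract describes can be illustrated with a small sketch. The toy instruction set, two-type lattice, and merge rule below are invented simplifications, not the JVM's actual typing rules: each opcode has a transfer function on an abstract stack of types, and stacks from converging branches are merged element-wise.

```python
# Toy dataflow-style verifier over a made-up, straight-line bytecode.
# Assumptions: a tiny type set {int, float} plus TOP for conflicts;
# real JVM verification also handles locals, jumps, and class loading.

TOP = "top"  # least informative type: the merge of conflicting types

def merge(a, b):
    """Join two abstract stack states element-wise at a control-flow join."""
    if len(a) != len(b):
        raise TypeError("stack height mismatch at join point")
    return [x if x == y else TOP for x, y in zip(a, b)]

# Transfer functions: the abstract effect of each opcode on the type stack.
def iconst(stack): return stack + ["int"]
def fconst(stack): return stack + ["float"]
def iadd(stack):
    if stack[-2:] != ["int", "int"]:
        raise TypeError("iadd expects two ints, got %s" % stack[-2:])
    return stack[:-2] + ["int"]

OPS = {"iconst": iconst, "fconst": fconst, "iadd": iadd}

def verify(code):
    """Run the transfer functions; a TypeError means the code is rejected."""
    stack = []
    for op in code:
        stack = OPS[op](stack)
    return stack

print(verify(["iconst", "iconst", "iadd"]))   # ['int'] -- accepted
try:
    verify(["iconst", "fconst", "iadd"])      # rejected: float on the stack
except TypeError as e:
    print("verification failed:", e)

# Merging the abstract stacks of two branch arms before their join point:
print(merge(["int"], ["int"]))    # ['int']
print(merge(["int"], ["float"]))  # ['top'] -- type information lost
```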


International Symposium on Software Testing and Analysis | 1994

Applications of feasible path analysis to program testing

Allen Goldberg; Tie-Cheng Wang; David Zimmerman

For certain structural testing criteria, a significant proportion of test instances are infeasible, in the sense that the semantics of the program implies that no test data can be constructed that meet the test requirement. This paper describes the design and prototype implementation of a structural testing system that uses a theorem prover to determine the feasibility of testing requirements and to optimize the number of test cases required to achieve test coverage. Using this approach, we were able to accurately and efficiently determine path feasibility for moderately sized program units of production code written in a subset of Ada. On these problems, the computed solutions were obtained much faster and with greater accuracy than manual analysis. The paper describes how we formalize test criteria as control flow graph path expressions, how the criteria are mapped to logic formulas, and how we control the complexity of the inference task. It also describes the limitations of the system, proposals for its improvement, and other applications of the analysis.
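To illustrate the core idea: each path through the control flow graph contributes a conjunction of branch conditions, and the path is infeasible when that conjunction is unsatisfiable. The brute-force check below is a toy stand-in for the theorem prover described in the paper; the program fragment and the variable domain are invented for the example.

```python
# Path conditions for the arms of two successive branches:
#   if x > 10: ...   (branch A)
#   if x < 5:  ...   (branch B)
# The path taking both true-arms requires x > 10 AND x < 5.
paths = {
    "A-true, B-true":  [lambda x: x > 10, lambda x: x < 5],
    "A-true, B-false": [lambda x: x > 10, lambda x: not (x < 5)],
    "A-false, B-true": [lambda x: not (x > 10), lambda x: x < 5],
}

def feasible(conds, domain=range(-100, 101)):
    """Brute-force satisfiability over a small integer domain --
    a theorem prover would discharge this symbolically instead."""
    return any(all(c(x) for c in conds) for x in domain)

for name, conds in paths.items():
    print(name, "->", "feasible" if feasible(conds) else "infeasible")
# "A-true, B-true" is infeasible: no test datum can exercise that path.
```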


IEEE Transactions on Software Engineering | 1986

Knowledge-based programming: A survey of program design and construction techniques

Allen Goldberg

An application of artificial intelligence (AI) to software development is presented: the construction of efficient implementations of programs from formal high-level specifications. Central to this discussion is the notion of program development by means of program transformation. Using this methodology, a formal specification is compiled (either manually or automatically) into an efficient implementation by the repeated application of correctness-preserving, source-to-source transformations. The author considers techniques for data structure selection, the procedural representation of logic assertions, store-versus-compute tradeoffs, finite differencing, loop fusion, and algorithm design methods, presented from the point of view of algorithm design and high-level program optimization.
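Finite differencing, one of the surveyed transformations, replaces a repeated expensive computation inside a loop with an incremental update of a maintained value. A minimal sketch (the running-mean example is illustrative, not taken from the paper):

```python
# Before transformation: the specification recomputes sum(prefix) each step.
def running_means_spec(xs):
    return [sum(xs[:i + 1]) / (i + 1) for i in range(len(xs))]

# After finite differencing: maintain the sum incrementally, so each
# loop iteration does O(1) work instead of O(i).
def running_means_diff(xs):
    out, total = [], 0
    for i, x in enumerate(xs):
        total += x                 # differential update of sum(xs[:i+1])
        out.append(total / (i + 1))
    return out

data = [3, 1, 4, 1, 5]
assert running_means_spec(data) == running_means_diff(data)
print(running_means_diff(data))
```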


ACM Sigsoft Software Engineering Notes | 1990

Reusing software developments

Allen Goldberg

Software development environments of the future will be characterized by extensive reuse of previous work. This paper addresses the issue of reusability in a context in which design is achieved by the transformational development of formal specifications into efficient implementations. It explores how an implementation of a modified specification can be realized by replaying the transformational derivation of the original and modifying it as required by changes made to the specification. Our approach is to structure derivations using the notion of tactics, and to record derivation histories as an execution trace of the application of tactics. One key idea is that tactics are compositional: higher-level tactics are constructed from more rudimentary ones using defined control primitives. This is similar to the approach used in LCF [12] and NuPRL [1, 8]. Given such a derivation history and a modified specification, the correspondence problem [21, 20] addresses how, during replay, a correspondence between program parts of the original and modified program is established. Our approach uses a combination of name association, structural properties, and association of components to one another by intensional descriptions of objects defined in the transformations themselves. An implementation of a rudimentary replay mechanism for our interactive development system is described. For example, with the system we can first derive a program from a specification that computes basic statistics such as mean, variance, and frequency data. The derivation is about 15 steps; it involves deriving an efficient means of computing frequency data, combining loops, and selecting data structures. We can then modify the specification by adding the ability to compute the maximum or mode and replay the steps of the previous derivation.
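The compositional-tactics idea can be sketched in a few lines: a tactic maps a program to a transformed program, higher-level tactics are built from rudimentary ones with control primitives, and a derivation history is the recorded sequence of tactic applications, replayable against a modified specification. The combinators and string-valued "programs" below are hypothetical illustrations, not the system's actual representation.

```python
# A tactic maps a program (here just a string) to a transformed program.
def inline(p):      return p.replace("call f", "body-of-f")
def fuse_loops(p):  return p.replace("loop;loop", "fused-loop")

def then(*tactics):
    """Control primitive: compose rudimentary tactics into a higher-level one."""
    def composed(p):
        for t in tactics:
            p = t(p)
        return p
    return composed

def replay(history, spec):
    """Re-apply a recorded derivation history to a (possibly modified) spec."""
    trace = [spec]
    for tactic in history:
        trace.append(tactic(trace[-1]))
    return trace

optimize = then(inline, fuse_loops)    # higher-level tactic
history = [inline, fuse_loops]         # recorded derivation history

print(replay(history, "call f;loop;loop"))          # original specification
print(replay(history, "call f;loop;loop;maximum"))  # modified spec, replayed
```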


IEEE Transactions on Software Engineering | 2000

A design methodology for data-parallel applications

Lars S. Nyland; Jan F. Prins; Allen Goldberg; Peter H. Mills

A methodology for the design and development of data-parallel applications and components is presented. Data-parallelism is a well-understood form of parallel computation, yet developing simple applications can involve substantial effort to express the problem in low-level notations. We describe a process of software development for data-parallel applications that starts from high-level specifications and generates repeated refinements of designs to match different architectural models and performance constraints, enabling a development activity with cost-benefit analysis. Primary issues are algorithm choice, correctness, and efficiency, followed by data decomposition, load balancing, and message-passing coordination. Development of a data-parallel multitarget tracking application is used as a case study, showing the progression from high-level to low-level refinements. We conclude by describing tool support for the process.
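As a toy instance of such a refinement, the sketch below starts from a high-level data-parallel map and refines it to a version with explicit block decomposition, standing in for the assignment of work to processors. The function names and the processor-count parameter are assumptions made for the example.

```python
# High-level specification: apply f to every element (a data-parallel map).
def spec(f, xs):
    return [f(x) for x in xs]

# Refinement: block decomposition, as if each block went to one processor.
def refined(f, xs, nprocs=4):
    size = (len(xs) + nprocs - 1) // nprocs
    blocks = [xs[i:i + size] for i in range(0, len(xs), size)]
    partial = [[f(x) for x in blk] for blk in blocks]   # per-processor work
    return [y for blk in partial for y in blk]          # gather the results

xs = list(range(10))
assert spec(lambda x: x * x, xs) == refined(lambda x: x * x, xs)
print(refined(lambda x: x * x, xs))
```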


International Workshop on Software Specification and Design | 1989

Toward reliable reactive systems

L.-M. Gilham; Allen Goldberg; Tie-Cheng Wang

The goal of the work reported here is to construct a system that supports the acquisition and correct implementation of software specifications for reactive systems. The system utilizes a finite state machine formalism derived from the work of Harel [5], set-theoretic data structures as exemplified by [1, 8], and relies on both classic verification techniques and consistency-preserving transformational implementation of specifications. Formal reasoning about and manipulation of programs is greatly simplified by referential transparency, which ensures that the meaning of a program fragment is not dependent on context or state. The attractiveness of functional and logic programming derives from their maintenance of referential transparency. Although suppression of the notion of state makes manipulation of programs easier, it seriously detracts from their expressiveness. In particular, specification of a reactive system is extremely awkward without the notion of state. The challenge addressed in this work is to provide a notion of state and state change in a way that supports …
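One way to reconcile state with referential transparency, along the lines the abstract suggests, is to make the state an explicit value: each transition is a pure function from a state and an event to a new state, so the meaning of any fragment depends only on its arguments. The traffic-light machine below is an invented illustration, not the paper's formalism.

```python
# A finite state machine as a pure transition function: no hidden state,
# so every call is referentially transparent.
TRANSITIONS = {
    ("red", "timer"):    "green",
    ("green", "timer"):  "yellow",
    ("yellow", "timer"): "red",
}

def step(state, event):
    """Pure: the result depends only on the arguments."""
    return TRANSITIONS.get((state, event), state)

def run(state, events):
    for e in events:
        state = step(state, e)
    return state

print(run("red", ["timer", "timer"]))  # -> 'yellow'
```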


International Symposium on Software Reliability Engineering | 1991

A mechanical verifier for supporting the design of reliable reactive systems

Tie-Cheng Wang; Allen Goldberg

An automated verification system, Reacto-Verifier (RVF), developed to support the design of reliable reactive systems is described. In order to make the formal verification of large and/or complex systems tractable, RVF is enhanced by a knowledge-base manager, a proof manager, and a dependency maintenance procedure. The knowledge-base manager supports flexible use of a large set of axioms and rules derived from the domain theory of the specification language. The proof manager helps handle verification failure and supports off-line development of proofs. The dependency maintenance procedure permits the user to trace the history of a derivation and supports efficient addition and/or retraction of assumptions. RVF can be used both for batch-style automated verification and for incremental development of verified programs.
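The dependency maintenance idea, recording which assumptions each derived fact rests on so that retracting an assumption also retracts its consequences, can be sketched briefly. The data model below is an invented simplification, not RVF's actual procedure.

```python
# Each derived fact records the set of assumptions it depends on.
facts = {}   # fact -> frozenset of supporting assumptions

def derive(fact, assumptions):
    facts[fact] = frozenset(assumptions)

def retract(assumption):
    """Retracting an assumption removes every fact that depends on it."""
    for f in [f for f, deps in facts.items() if assumption in deps]:
        del facts[f]

derive("lemma1", {"A1"})
derive("lemma2", {"A1", "A2"})
derive("theorem", {"A2"})

retract("A1")
print(sorted(facts))   # ['theorem'] -- lemma1 and lemma2 are retracted
```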


European Conference on Parallel Processing | 1996

A Refinement Methodology for Developing Data-Parallel Applications

Lars S. Nyland; Jan F. Prins; Allen Goldberg; Peter H. Mills; John H. Reif; Robert A. Wagner

Data-parallelism is a relatively well-understood form of parallel computation, yet developing simple applications can involve substantial effort to express the problem in low-level data-parallel notations. We describe a process of software development for data-parallel applications that starts from high-level specifications and generates repeated refinements of designs to match different architectural models and performance constraints, supporting a development activity with cost-benefit analysis. Primary issues are algorithm choice, correctness, and efficiency, followed by data decomposition, load balancing, and message-passing coordination. Development of a data-parallel multitarget tracking application is used as a case study, showing the progression from high-level to low-level refinements. We conclude by describing tool support for the process.
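One concrete facet of the data decomposition and load balancing step is the choice of layout: block decomposition keeps contiguous elements together for locality, while cyclic decomposition interleaves them for better balance when costs vary along the array. A small illustrative sketch (the helper names are mine):

```python
# Two standard data decompositions; 'nprocs' stands in for the machine size.
def block(xs, nprocs):
    size = (len(xs) + nprocs - 1) // nprocs
    return [xs[i:i + size] for i in range(0, len(xs), size)]

def cyclic(xs, nprocs):
    return [xs[p::nprocs] for p in range(nprocs)]

xs = list(range(8))
print(block(xs, 3))   # [[0,1,2], [3,4,5], [6,7]] -- contiguous, good locality
print(cyclic(xs, 3))  # [[0,3,6], [1,4,7], [2,5]] -- interleaved, better balance
```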


Conference on Automated Deduction | 1994

KITP-93: An Automated Inference System for Program Analysis

Tie-Cheng Wang; Allen Goldberg

Our goal is to produce a powerful inference system capable of dealing with a large number of knowledge-base (KB) rules and with conjectures of diverse features. To achieve this goal, we have built KITP-93 with a logical framework that allows convenient user interaction and easy incorporation of existing inference techniques. We have developed a management mechanism supporting controlled use of KB rules, high-level user interaction, and incremental development of proofs. We have designed an inference engine that incorporates a variety of efficient inference techniques, emphasizing the role of term rewriting, goal-oriented deduction, and decision procedures, as well as interactive proof utilities. KITP-93 has been incorporated as an inference server by a number of formal environments. Significantly, it has been used successfully by a large industrial user in the control-flow analysis of Ada procedures. A review of the use of KITP in solving real-world problems is included in [2]. Besides proving theorems, other inference services that KITP-93 provides include disproving a non-theorem, simplifying program fragments, and deducing antecedents.
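Term rewriting, one of the techniques the engine emphasizes, repeatedly replaces subterms that match rewrite rules until a normal form is reached. The miniature rewriter below, with invented algebraic rules over tuple-encoded terms, only gestures at what a system like KITP-93 does with full unification and rule management.

```python
# Minimal term rewriting to normal form. Terms are nested tuples such as
# ("+", "0", "x"); atoms are strings. The rules are invented simplifications.
def rewrite_once(t):
    """Apply the first matching rule at the root, else recurse into subterms."""
    # Rule: 0 + x -> x
    if isinstance(t, tuple) and t[0] == "+" and t[1] == "0":
        return t[2]
    # Rule: x * 1 -> x
    if isinstance(t, tuple) and t[0] == "*" and t[2] == "1":
        return t[1]
    if isinstance(t, tuple):  # rewrite the arguments, keep the operator
        return tuple(rewrite_once(a) if i else a for i, a in enumerate(t))
    return t

def normalize(t):
    """Rewrite until a fixpoint: no rule changes the term."""
    while True:
        t2 = rewrite_once(t)
        if t2 == t:
            return t
        t = t2

print(normalize(("+", "0", ("*", "y", "1"))))   # -> 'y'
```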


Information Processing Letters | 1982

Average time analysis of simplified Davis-Putnam procedures

Allen Goldberg; Paul Walton Purdom; Cynthia A. Brown
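For context, Davis-Putnam-style procedures decide satisfiability of CNF formulas by unit propagation and case splitting. The sketch below is a generic textbook DPLL-style checker, not the specific simplified variants whose average-case running time the paper analyzes.

```python
# DPLL-style satisfiability check. A formula is a list of clauses; a clause
# is a list of nonzero ints, negative meaning negated (DIMACS convention).
def dpll(clauses):
    # Unit propagation: a one-literal clause forces that literal's value.
    units = [c[0] for c in clauses if len(c) == 1]
    while units:
        lit = units[0]
        clauses = [[l for l in c if l != -lit] for c in clauses if lit not in c]
        if [] in clauses:
            return False            # empty clause: contradiction
        units = [c[0] for c in clauses if len(c) == 1]
    if not clauses:
        return True                 # every clause satisfied
    lit = clauses[0][0]             # case split on some remaining literal
    return dpll(clauses + [[lit]]) or dpll(clauses + [[-lit]])

print(dpll([[1, 2], [-1, 2], [-2]]))   # False: the clauses are contradictory
print(dpll([[1, 2], [-1], [2, 3]]))    # True: satisfied by x2 true, x1 false
```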

Collaboration


Dive into Allen Goldberg's collaboration.

Top Co-Authors

Jan F. Prins
University of North Carolina at Chapel Hill

Lars S. Nyland
University of North Carolina at Chapel Hill

Paul Walton Purdom
Indiana University Bloomington

Daniel W. Palmer
University of North Carolina at Chapel Hill

David Zimmerman
Lockheed Missiles and Space Company

Rickard E. Faith
University of North Carolina at Chapel Hill