
Publication


Featured research published by Timothy J. Hickey.


Journal of the ACM | 2001

Interval arithmetic: From principles to implementation

Timothy J. Hickey; Q. Ju; M. H. van Emden

We start with a mathematical definition of a real interval as a closed, connected set of reals. Interval arithmetic operations (addition, subtraction, multiplication, and division) are likewise defined mathematically and we provide algorithms for computing these operations assuming exact real arithmetic. Next, we define interval arithmetic operations on intervals with IEEE 754 floating point endpoints to be sound and optimal approximations of the real interval operations and we show that the IEEE standard's specification of operations involving the signed infinities, signed zeros, and the exact/inexact flag is such as to make a correct and optimal implementation more efficient. From the resulting theorems, we derive data that are sufficiently detailed to convert directly to a program for efficiently implementing the interval operations. Finally, we extend these results to the case of general intervals, which are defined as connected sets of reals that are not necessarily closed.
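
As a rough illustration of the soundness requirement described above, the sketch below (not the paper's implementation) builds intervals with floating point endpoints and rounds every computed endpoint outward by one ulp using Python's math.nextafter. This is conservative rather than optimal; the paper derives tighter, provably optimal endpoint formulas directly from the IEEE 754 rounding behavior.

```python
# A minimal sketch (not the paper's implementation, Python 3.9+): intervals with
# float endpoints made sound by rounding every computed endpoint outward one ulp.
import math

def round_down(x: float) -> float:
    """One ulp toward -infinity (crude directed rounding)."""
    return math.nextafter(x, -math.inf)

def round_up(x: float) -> float:
    """One ulp toward +infinity."""
    return math.nextafter(x, math.inf)

class Interval:
    def __init__(self, lo: float, hi: float):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(round_down(self.lo + other.lo), round_up(self.hi + other.hi))

    def __mul__(self, other):
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(round_down(min(ps)), round_up(max(ps)))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# [1, 2] * [0.1, 0.3] prints an enclosure of the exact real-interval product [0.1, 0.6].
print(Interval(1.0, 2.0) * Interval(0.1, 0.3))
```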


Journal of the ACM | 1979

Two Algorithms for Determining Volumes of Convex Polyhedra

Jacques Cohen; Timothy J. Hickey

Determining volumes of convex n-dimensional polyhedra defined by a linear system of inequalities is useful in program analysis. Two methods for computing these volumes are proposed: (1) summing the volumes of simplices which form the polyhedron, and (2) summing the volumes of (increasingly smaller) parallelepipeds which can be fit into the polyhedron. Assuming that roundoff errors are small, the first method is analytically exact, whereas the second one converges to the exact solution at the expense of additional computer time. Examples of polyhedra whose volumes were computed by programs representing the algorithms are also provided.
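
The basic step of the first method can be sketched as follows, under the assumption that a triangulation of the polyhedron into simplices is already available: the volume of an n-simplex with vertices v0..vn is |det(v1 - v0, ..., vn - v0)| / n!, and summing these over the triangulation gives the polyhedron's volume. Exact rational arithmetic is used here to sidestep the roundoff concern mentioned in the abstract.

```python
# A minimal sketch (not the paper's program): exact volume of one simplex.
from fractions import Fraction
from math import factorial

def det(m):
    """Determinant via fraction-preserving Gaussian elimination."""
    m = [row[:] for row in m]
    n, sign, result = len(m), 1, Fraction(1)
    for col in range(n):
        pivot = next((r for r in range(col, n) if m[r][col] != 0), None)
        if pivot is None:
            return Fraction(0)
        if pivot != col:
            m[col], m[pivot] = m[pivot], m[col]
            sign = -sign
        result *= m[col][col]
        for r in range(col + 1, n):
            factor = m[r][col] / m[col][col]
            for c in range(col, n):
                m[r][c] -= factor * m[col][c]
    return sign * result

def simplex_volume(vertices):
    """Volume of the simplex spanned by n+1 points in n-dimensional space."""
    v0, rest = vertices[0], vertices[1:]
    edges = [[Fraction(p[i]) - Fraction(v0[i]) for i in range(len(v0))] for p in rest]
    return abs(det(edges)) / factorial(len(v0))

# One of the six congruent simplices a unit cube splits into along a space diagonal.
unit_tetra = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (1, 1, 1)]
print(simplex_volume(unit_tetra))   # 1/6
```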


ACM Transactions on Programming Languages and Systems | 1987

Parsing and compiling using Prolog

Jacques Cohen; Timothy J. Hickey

This paper presents the material needed for exposing the reader to the advantages of using Prolog as a language for describing succinctly most of the algorithms needed in prototyping and implementing compilers or producing tools that facilitate this task. The available published material on the subject describes one particular approach in implementing compilers using Prolog. It consists of coupling actions to recursive descent parsers to produce syntax-trees which are subsequently utilized in guiding the generation of assembly language code. Although this remains a worthwhile approach, there is a host of possibilities for Prolog usage in compiler construction. The primary aim of this paper is to demonstrate the use of Prolog in parsing and compiling. A second, but equally important, goal of this paper is to show that Prolog is a labor-saving tool in prototyping and implementing many non-numerical algorithms which arise in compiling, and whose description using Prolog is not available in the literature. The paper discusses the use of unification and nondeterminism in compiler writing as well as means to bypass these (costly) features when they are deemed unnecessary. Topics covered include bottom-up and top-down parsers, syntax-directed translation, grammar properties, parser generation, code generation, and optimizations. Newly proposed features that are useful in compiler construction are also discussed. A knowledge of Prolog is assumed.
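
The approach described above, coupling a recursive descent parser to syntax-tree construction and then code generation, can be sketched in a few lines. Python stands in for the Prolog DCGs the paper actually uses, and the grammar and stack-machine instruction names below are illustrative assumptions.

```python
# A minimal sketch of parse-then-generate, with an assumed grammar:
#   expr -> term ('+' term)* ; term -> INT
import re

def parse_expr(tokens, pos=0):
    node, pos = parse_term(tokens, pos)
    while pos < len(tokens) and tokens[pos] == "+":
        right, pos = parse_term(tokens, pos + 1)
        node = ("add", node, right)            # build the syntax tree
    return node, pos

def parse_term(tokens, pos):
    return ("num", int(tokens[pos])), pos + 1

def gen(node, code):
    """Syntax-directed translation: emit stack-machine instructions."""
    if node[0] == "num":
        code.append(f"PUSH {node[1]}")
    else:
        gen(node[1], code)
        gen(node[2], code)
        code.append("ADD")
    return code

tokens = re.findall(r"\d+|\+", "1 + 2 + 40")
tree, _ = parse_expr(tokens)
print("\n".join(gen(tree, [])))   # PUSH 1, PUSH 2, ADD, PUSH 40, ADD
```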


SIAM Journal on Computing | 1983

Uniform Random Generation of Strings in a Context-Free Language

Timothy J. Hickey; Jacques Cohen

Let S be the set of all strings of length n generated by a given context-free grammar. A uniform random generator is one which produces strings from S with equal probability. In generating these strings, care must be taken in choosing the disjuncts that form the right-hand side of a grammar rule so that the produced string will have the specified length. Uniform random generators have applications in studying the complexity of parsers, in estimating the average efficiency of theorem provers for the propositional calculus, in establishing a measure of ambiguity of a grammar, etc. Two methods are presented for generating uniform random strings in an unambiguous context-free language. The first method will generate a random string of length n in linear time, but must use a precomputed table of size O(n^{r+1}), where r is the number of nonterminals in the grammar used to specify the language. The second method precomputes part of the table and calculates the other entries as they are called for. It requi...
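
A minimal sketch of the counting idea behind the first method, with an assumed grammar encoding rather than the paper's table layout: count the number of length-n strings derivable from each symbol sequence, then choose each production and length split with probability proportional to those counts, so every length-n string of an unambiguous grammar is equally likely. The toy grammar below has no left recursion or epsilon-productions, which this naive memoized recursion relies on.

```python
# A minimal sketch (assumed encoding): uniform sampling from a toy unambiguous CFG.
import random
from functools import lru_cache

GRAMMAR = {               # S -> a S b | a b   (strings a^k b^k, k >= 1)
    "S": [("a", "S", "b"), ("a", "b")],
}

def is_nonterminal(sym):
    return sym in GRAMMAR

@lru_cache(maxsize=None)
def count(symbols, n):
    """Number of length-n terminal strings derivable from the symbol sequence."""
    if not symbols:
        return 1 if n == 0 else 0
    head, rest = symbols[0], symbols[1:]
    if not is_nonterminal(head):
        return count(rest, n - 1) if n >= 1 else 0
    return sum(count(rhs, k) * count(rest, n - k)
               for rhs in GRAMMAR[head] for k in range(n + 1))

def sample(symbols, n):
    """Draw a uniform random length-n string derivable from `symbols`."""
    if not symbols:
        return ""
    head, rest = symbols[0], symbols[1:]
    if not is_nonterminal(head):
        return head + sample(rest, n - 1)
    choices = [(rhs, k, count(rhs, k) * count(rest, n - k))
               for rhs in GRAMMAR[head] for k in range(n + 1)]
    r = random.randrange(sum(w for _, _, w in choices))
    for rhs, k, w in choices:
        if r < w:
            return sample(rhs, k) + sample(rest, n - k)
        r -= w
    raise ValueError("no derivation of the requested length")

print(sample(("S",), 6))   # always 'aaabbb' for this toy grammar
```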


Journal of the ACM | 1988

Automating program analysis

Timothy J. Hickey; Jacques Cohen

The first part of the paper shows that previous theoretical work on the semantics of probabilistic programs (Kozen) and on the correctness of performance annotated programs (Ramshaw) can be used to automate the average-case analysis of simple programs containing assignments, conditionals, and loops. A performance compiler has been developed using this theoretical foundation. The compiler is described, and it is shown that special cases of symbolic simplifications of formulas play a major role in rendering the system usable. The performance compiler generates a system of recurrence equations derived from a given program whose efficiency one wishes to analyze. This generation is always possible, but the problem of solving the resulting equations may be complex. The second part of the paper presents an original method that generalizes the previous approach and is applicable to functional programs that make use of recursion and complex data structures. Several examples are presented, including an analysis of binary tree sort. A key feature of the analysis of such programs is that distributions on complex data structures are represented using attributed probabilistic grammars.


Journal of Logic Programming | 1989

Global compilation of Prolog

Timothy J. Hickey; Shyam Mudambi

Current WAM-type compilers employ incremental compilation, in which each procedure is compiled in isolation from the program as a whole. This approach is ideal for the initial stages of program development, since procedures can be compiled and recompiled very quickly. We have developed global compilation techniques to be used in the final stages of program development. These techniques use data-flow and control-flow information to optimize the intermediate code. Specifically, the optimizations involve using inferred mode information to generate indexing code which intermixes unification instructions, primitive test instructions, and switching instructions. One of the primary goals of this research is to develop global compilation techniques which eliminate the need for the user to insert cuts in the program to improve performance. Empirical results show that these optimizations can result in significant time and space savings.


International Workshop on Hybrid Systems: Computation and Control | 2004

Rigorous Modeling of Hybrid Systems using Interval Arithmetic Constraints

Timothy J. Hickey; David K. Wittenberg

We provide a rigorous approach to modeling, simulating, and analyzing hybrid systems using CLP(F) (Constraint Logic Programming (Functions)) [14], a system which combines CLP (Constraint Logic Programming) [21] with interval arithmetic [30]. We have implemented this system, and provide timing information. Because hybrid systems are often used to prove safety properties, it is critical to have a rigorous analysis. By using intervals throughout the system, we make it easier to include measurement errors in our models and to prove safety properties.
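
A toy illustration of the point about measurement error, not CLP(F) itself: if the model is evaluated over intervals, the computed result encloses every behavior consistent with the measured tolerances, so a safety property checked against the enclosure holds for all of them. The model x(t) = x0 + v0*t - g*t^2/2, the tolerances, and the 75 m threshold are assumptions for illustration; outward rounding is omitted for brevity.

```python
# A toy sketch: interval evaluation of a falling-body model with uncertain inputs.
def iadd(a, b):
    return (a[0] + b[0], a[1] + b[1])

def isub(a, b):
    return (a[0] - b[1], a[1] - b[0])

def imul(a, b):
    ps = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(ps), max(ps))

x0 = (99.9, 100.1)    # measured initial height, +/- 0.1 m (assumed tolerance)
v0 = (-0.05, 0.05)    # measured initial velocity, +/- 0.05 m/s
g = (9.80, 9.82)      # local gravity, known only to within an interval
t = (2.0, 2.0)        # time of interest

# x(t) = x0 + v0*t - g*t^2/2, evaluated over intervals.
x_t = isub(iadd(x0, imul(v0, t)), imul((0.5, 0.5), imul(g, imul(t, t))))
print("enclosure of height at t = 2 s:", x_t)
print("safe (stays above 75 m):", x_t[0] > 75.0)  # holds for all modeled errors
```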


Journal of the ACM | 1982

Upper Bounds for Speedup in Parallel Parsing

Jacques Cohen; Timothy J. Hickey; Joel Katcoff

A method is presented for computing upper bounds for the speedup gained by synchronous multiprocessor, bottom-up, no-backtrack parsing of strings generated by a context-free grammar. First, the maximum speedup s is defined using an optimal strategy for a certain pebble game. Then analytic upper and lower bounds for s are developed. These bounds are computed for parsing trees with certain properties. The computation consists of (1) determining a subclass of strings of a language which are favorably parsed in parallel, (2) establishing sets of difference equations and solving them to estimate the minimum time needed to parse these strings, and (3) computing the speedup function. The method is applied to the parsing of arithmetic expressions and strings of a simple programming language. The results presented show how maximum speedup gains vary with the number of processors available.


Symposium on Principles of Programming Languages | 2000

Analytic constraint solving and interval arithmetic

Timothy J. Hickey



Practical Aspects of Declarative Languages | 2000

CLIP: A CLP(Intervals) Dialect for Metalevel Constraint Solving

Timothy J. Hickey

