Publication


Featured research published by John Cocke.


IEEE Transactions on Information Theory | 1974

Optimal decoding of linear codes for minimizing symbol error rate (Corresp.)

Lalit R. Bahl; John Cocke; Frederick Jelinek; Josef Raviv

The general problem of estimating the a posteriori probabilities of the states and transitions of a Markov source observed through a discrete memoryless channel is considered. The decoding of linear block and convolutional codes to minimize symbol error probability is shown to be a special case of this problem. An optimal decoding algorithm is derived.
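
The algorithm described here is what is now known as the forward-backward (BCJR) computation of a posteriori probabilities. The sketch below computes only the state probabilities for a toy Markov source observed through a discrete memoryless channel; the matrices A and B, the vector pi, and all names are illustrative assumptions for the sketch, not taken from the paper, and the transition probabilities the paper also derives are omitted.

    import numpy as np

    def posterior_state_probs(A, B, pi, obs):
        # A[i, j]: P(next state j | state i)        (Markov source)
        # B[i, k]: P(channel output k | state i)    (discrete memoryless channel)
        # pi[i]  : P(initial state i)
        # obs    : observed channel outputs, as column indices into B
        T, S = len(obs), len(pi)
        alpha = np.zeros((T, S))                    # forward probabilities
        beta = np.zeros((T, S))                     # backward probabilities

        alpha[0] = pi * B[:, obs[0]]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]

        beta[T - 1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

        gamma = alpha * beta                        # unnormalized a posteriori probabilities
        return gamma / gamma.sum(axis=1, keepdims=True)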


Computer Languages | 1989

A methodology for the real world

Gregory J. Chaitin; Marc A. Auslander; Ashok K. Chandra; John Cocke; Martin Edward Hopkins; Peter Willy Markstein

Register allocation may be viewed as a graph coloring problem. Each node in the graph stands for a computed quantity that resides in a machine register, and two nodes are connected by an edge if the quantities interfere with each other, that is, if they are simultaneously live at some point in the object program. This approach, though mentioned in the literature, was never implemented before. Preliminary results of an experimental implementation in a PL/I optimizing compiler suggest that global register allocation approaching that of hand-coded assembly language may be attainable.
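
A minimal sketch of the graph-coloring view of register allocation described above, assuming k available registers: build the interference graph, repeatedly remove nodes with fewer than k neighbors, and color in reverse removal order, marking the remainder as spill candidates. The function and its heuristics are illustrative, not the paper's implementation.

    def color_interference_graph(nodes, edges, k):
        # Build the interference graph: an edge joins two quantities that are
        # simultaneously live, so they may not share a register.
        adj = {n: set() for n in nodes}
        for a, b in edges:
            adj[a].add(b)
            adj[b].add(a)

        # Simplify: a node with fewer than k neighbors can always be colored
        # later, so remove it and push it on a stack; otherwise pick a spill.
        work = {n: set(adj[n]) for n in nodes}
        stack, spilled = [], []
        while work:
            cand = next((n for n in work if len(work[n]) < k), None)
            if cand is None:
                cand = max(work, key=lambda n: len(work[n]))   # crude spill choice
                spilled.append(cand)
            else:
                stack.append(cand)
            for m in work[cand]:
                work[m].discard(cand)
            del work[cand]

        # Select: color in reverse removal order; a register is always available
        # because each pushed node had fewer than k colored neighbors.
        coloring = {}
        for n in reversed(stack):
            used = {coloring[m] for m in adj[n] if m in coloring}
            coloring[n] = next(c for c in range(k) if c not in used)
        return coloring, spilled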


Communications of the ACM | 1976

A program data flow analysis procedure

Frances E. Allen; John Cocke

The global data relationships in a program can be exposed and codified by the static analysis methods described in this paper. A procedure is given which determines all the definitions which can possibly “reach” each node of the control flow graph of the program and all the definitions that are “live” on each edge of the graph. The procedure uses an “interval” ordered edge listing data structure and handles reducible and irreducible graphs indistinguishably.
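
As a rough illustration of the "reaching definitions" problem the abstract describes, here is a simple iterative fixed-point version in Python. The paper itself uses an interval-ordered edge listing rather than this worklist-style iteration, and the data-structure names are assumptions for the sketch.

    def reaching_definitions(blocks, succ, gen, kill):
        # blocks : list of node names in the control flow graph
        # succ   : node -> list of successor nodes
        # gen    : node -> set of definitions generated in the node
        # kill   : node -> set of definitions killed by the node
        pred = {b: [] for b in blocks}
        for b in blocks:
            for s in succ.get(b, []):
                pred[s].append(b)

        IN = {b: set() for b in blocks}
        OUT = {b: set(gen[b]) for b in blocks}
        changed = True
        while changed:
            changed = False
            for b in blocks:
                new_in = set().union(*(OUT[p] for p in pred[b])) if pred[b] else set()
                new_out = gen[b] | (new_in - kill[b])
                if new_in != IN[b] or new_out != OUT[b]:
                    IN[b], OUT[b] = new_in, new_out
                    changed = True
        return IN, OUT   # definitions reaching the entry and exit of each node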


International Conference on Computational Linguistics | 1988

A statistical approach to language translation

Peter F. Brown; John Cocke; S. Della Pietra; V. Della Pietra; Frederick Jelinek; Robert Leroy Mercer; Paul S. Roossin

An approach to automatic translation is outlined that utilizes techniques of statistical information extraction from large data bases. The method is based on the availability of pairs of large corresponding texts that are translations of each other. In our case, the texts are in English and French. Fundamental to the technique is a complex glossary of correspondence of fixed locutions. The steps of the proposed translation process are: (1) Partition the source text into a set of fixed locutions. (2) Use the glossary plus contextual information to select the corresponding set of fixed locutions in the target language. (3) Arrange the words of the target fixed locutions into a sequence forming the target sentence. We have developed statistical techniques facilitating both the automatic creation of the glossary and the performance of the three translation steps, all on the basis of an alignment of corresponding sentences in the two texts. While we are not yet able to provide examples of French/English translation, we present some encouraging intermediate results concerning glossary creation and the arrangement of target word sequences.
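
A toy sketch of the three translation steps, with a hand-written glossary of fixed locutions standing in for the statistically learned one. Everything here (the glossary entries, the greedy partitioning, the source-order arrangement) is an illustrative assumption, not the paper's method.

    # Toy glossary of fixed locutions (source phrase -> target phrase).
    # In the paper both the glossary and the word orderings are learned
    # statistically from aligned sentence pairs; here they are hard-coded.
    glossary = {
        ("the", "house"): ("la", "maison"),
        ("is",): ("est",),
        ("red",): ("rouge",),
    }

    def translate(words):
        # Step 1: partition the source text into fixed locutions (greedy longest match).
        locutions, i = [], 0
        while i < len(words):
            for span in range(len(words) - i, 0, -1):
                piece = tuple(words[i:i + span])
                if piece in glossary:
                    locutions.append(piece)
                    i += span
                    break
            else:
                locutions.append(tuple(words[i:i + 1]))   # unknown word passes through
                i += 1
        # Step 2: select the corresponding target locutions from the glossary.
        target_locutions = [glossary.get(loc, loc) for loc in locutions]
        # Step 3: arrange the target words into a sequence forming the target
        # sentence (a real system would reorder statistically; this keeps source order).
        return [w for loc in target_locutions for w in loc]

    print(" ".join(translate(["the", "house", "is", "red"])))   # la maison est rouge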


Sigplan Notices | 1970

Global common subexpression elimination

John Cocke

When considering compiler optimization, there are two questions that immediately come to mind; one, why and to what extent is optimization necessary and two, to what extent is it possible. When considering the second question, one might immediately become discouraged since it is well known that the program equivalency problem is recursively unsolvable. It is, of course, clear from this that there will never be techniques for generating a completely optimum program. These unsolvability results, however, do not preclude the possibility of ad hoc techniques for program improvement or even a partial theory which produces a class of equivalent programs optimized in varying degrees. The reasons why optimization is required seem to me to fall in two major categories. The first I will call “local” and the second “global”.
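
The kind of "ad hoc technique for program improvement" the paper develops is global common subexpression elimination. The sketch below shows only the local, single-basic-block form of the idea via value numbering, assuming SSA-like names that are each assigned at most once; it illustrates the principle, not the paper's global algorithm.

    def eliminate_common_subexpressions(instructions):
        # instructions: list of (dest, op, arg1, arg2) tuples, each dest assigned once.
        value_of = {}        # variable -> value number
        expr_value = {}      # (op, vn1, vn2) -> value number of the result
        available = {}       # value number -> variable already holding it
        out, next_vn = [], 0

        def vn(x):
            nonlocal next_vn
            if x not in value_of:
                value_of[x] = next_vn
                next_vn += 1
            return value_of[x]

        for dest, op, a, b in instructions:
            key = (op, vn(a), vn(b))
            if key in expr_value:
                # The same operator applied to the same operands was already
                # computed: reuse that result instead of recomputing it.
                value_of[dest] = expr_value[key]
                out.append((dest, "copy", available[expr_value[key]], None))
            else:
                value_of[dest] = next_vn
                expr_value[key] = next_vn
                available[next_vn] = dest
                next_vn += 1
                out.append((dest, op, a, b))
        return out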


Journal of Parallel and Distributed Computing | 1988

Estimating interlock and improving balance for pipelined architectures

David Callahan; John Cocke; Ken Kennedy

Pipelining is now a standard technique for increasing the speed of computers, particularly for floating-point arithmetic. Single-chip, pipelined floating-point functional units are available as “off the shelf” components. Addressing arithmetic can be done concurrently with floating-point operations to construct a fast processor that can exploit fine-grain parallelism. This paper describes a metric to estimate the optimal execution time of DO loops on particular processors. This metric is parameterized by the memory bandwidth and peak floating-point rate of the processor, as well as the length of the pipelines used in the functional units. Data dependence analysis provides information about the execution order constraints of the operations in the DO loop and is used to estimate the amount of pipeline interlock required by a loop. Several transformations are investigated to determine their impact on loops under this metric.
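
A back-of-the-envelope reading of the metric described above: an iteration of a DO loop can run no faster than its floating-point work, its memory traffic, or the pipeline depth allows, and comparing the loop's memory-to-flop ratio with the machine's shows which resource binds. The functions and parameter names below are assumptions for illustration, not the paper's formulation.

    def estimated_cycles(flops, mem_refs, peak_flops_per_cycle, words_per_cycle, pipeline_depth):
        # A loop iteration needs at least this many cycles for its arithmetic,
        # at least this many for its memory traffic, and at least one pipeline
        # length if a dependence cycle forces the pipeline to drain (interlock).
        compute_bound = flops / peak_flops_per_cycle
        memory_bound = mem_refs / words_per_cycle
        return max(compute_bound, memory_bound, pipeline_depth)

    def loop_balance(mem_refs, flops):
        # Memory references issued per floating-point operation in the loop body.
        return mem_refs / flops

    def machine_balance(words_per_cycle, peak_flops_per_cycle):
        # Words the memory system can deliver per floating-point operation the
        # processor can retire; a loop whose balance exceeds this is memory bound.
        return words_per_cycle / peak_flops_per_cycle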


Compiler Construction | 1982

Optimization of range checking

Victoria Markstein; John Cocke; Peter Willy Markstein

An analysis is given for optimizing run-time range checks in regions of high execution frequency. These optimizations are accomplished using strength reduction, code motion and common subexpression elimination. Test programs, using the above optimizations, are used to illustrate run-time improvements.
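
A small before/after illustration of the effect such optimizations aim at: when the subscript is a linear function of the loop variable, the per-iteration range checks can be subsumed by a single check on the loop bounds and moved out of the loop. The Python functions are illustrative stand-ins for the compiler transformation, not the paper's algorithm.

    # Naive version: the subscript check executes on every iteration.
    def sum_slice_checked(a, lo, hi):
        total = 0
        for i in range(lo, hi):
            if i < 0 or i >= len(a):          # range check inside the loop
                raise IndexError(i)
            total += a[i]
        return total

    # After the optimization: because i increases linearly, the checks for all
    # iterations are implied by one check on the loop bounds, so the check can
    # be moved out of the loop.  (The raised value differs; the sketch only
    # preserves whether an out-of-range access would occur.)
    def sum_slice_hoisted(a, lo, hi):
        if lo < hi and (lo < 0 or hi > len(a)):   # single check covers all iterations
            raise IndexError((lo, hi))
        total = 0
        for i in range(lo, hi):
            total += a[i]                          # no per-iteration check needed
        return total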


Communications of the ACM | 1977

An algorithm for reduction of operator strength

John Cocke; Ken Kennedy

A simple algorithm which uses an indexed temporary table to perform reduction of operator strength in strongly connected regions is presented. Several extensions, including linear function test replacement, are discussed. These algorithms should fit well into an integrated package of local optimization algorithms.
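
A hand-applied illustration of the transformation, assuming a simple loop whose address computation multiplies the induction variable by a constant stride: strength reduction maintains the product in a temporary updated by addition, and linear function test replacement then runs the loop test on that temporary. The code sketches the effect of the transformation, not the table-driven algorithm of the paper.

    # Before: the loop recomputes i * stride, a multiply on every iteration.
    def addresses_multiply(n, base, stride):
        out = []
        for i in range(n):
            out.append(base + i * stride)
        return out

    # After reduction of operator strength: a temporary tracks base + i * stride
    # and is updated by addition.  Linear function test replacement runs the
    # loop test on the temporary itself, so i disappears.  (Assumes stride != 0.)
    def addresses_reduced(n, base, stride):
        out = []
        t = base                              # invariant: t == base + i * stride
        limit = base + n * stride             # test replacement: compare the reduced value
        while t != limit:
            out.append(t)
            t += stride
        return out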


Archive | 1991

A Probabilistic Parsing Method for Sentence Disambiguation

T. Fujisaki; Frederick Jelinek; John Cocke; Ezra Black; T. Nishino

Constructing a grammar which can parse sentences selected from a natural language corpus is a difficult task. One of the most serious problems is the unmanageably large number of ambiguities. Pure syntactic analysis based only on syntactic knowledge will sometimes result in hundreds of ambiguous parses.
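
The disambiguation idea can be illustrated with a probabilistic CKY pass over a toy grammar: every ambiguous analysis receives a probability from its rules, and the parser keeps the most likely one for each span. The grammar, the made-up rule probabilities, and the example sentence below are all assumptions for illustration; the paper estimates such probabilities from a corpus.

    from collections import defaultdict

    # Toy PCFG in Chomsky normal form: rule -> probability (made-up numbers).
    binary = {("S", ("NP", "VP")): 1.0,
              ("VP", ("V", "NP")): 0.6,
              ("VP", ("VP", "PP")): 0.4,
              ("NP", ("NP", "PP")): 0.3,
              ("PP", ("P", "NP")): 1.0}
    lexical = {("NP", "I"): 0.3, ("NP", "men"): 0.2, ("NP", "telescopes"): 0.2,
               ("V", "saw"): 1.0, ("P", "with"): 1.0}

    def best_parse_prob(words):
        # Probabilistic CKY: best[(i, j, sym)] is the probability of the most
        # likely derivation of words[i:j] from nonterminal sym.
        n = len(words)
        best = defaultdict(float)
        for i, w in enumerate(words):
            for (sym, word), p in lexical.items():
                if word == w:
                    best[(i, i + 1, sym)] = max(best[(i, i + 1, sym)], p)
        for span in range(2, n + 1):
            for i in range(n - span + 1):
                j = i + span
                for k in range(i + 1, j):
                    for (lhs, (b, c)), p in binary.items():
                        score = p * best[(i, k, b)] * best[(k, j, c)]
                        if score > best[(i, j, lhs)]:
                            best[(i, j, lhs)] = score
        return best[(0, n, "S")]

    # The PP "with telescopes" can attach to the verb or to "men"; CKY keeps
    # whichever reading the rule probabilities score higher.
    print(best_parse_prob("I saw men with telescopes".split()))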


IEEE Computer | 1991

Computer architecture in the 1990s

Harold S. Stone; John Cocke

Some of the technology that will drive the advances of the 1990s is explored. A brief tutorial is given to explain the fundamental speed limits of metal interconnections. The advantages and disadvantages of optical interconnections and where they may be used are discussed in some detail. Trends in speeding up performance by increasing data-path width and by increasing the number of operations performed are reviewed, and questions of efficiency are examined. The advent of super reliable machines produced at very low cost by replicating entire processors is examined.
