
Publication


Featured research published by Gregory J. Chaitin.


compiler construction | 1982

Register allocation and spilling via graph coloring

Gregory J. Chaitin

In a previous paper we reported the successful use of graph coloring techniques for doing global register allocation in an experimental PL/I optimizing compiler. When the compiler cannot color the register conflict graph with a number of colors equal to the number of available machine registers, it must add code to spill and reload registers to and from storage. Previously the compiler produced spill code whose quality sometimes left much to be desired, and the ad hoc techniques used took considerable amounts of compile time. We have now discovered how to extend the graph coloring approach so that it naturally solves the spilling problem. Spill decisions are now made on the basis of the register conflict graph and cost estimates of the value of keeping the result of a computation in a register rather than in storage. This new approach produces better object code and takes much less compile time.
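The simplify-then-select scheme the abstract describes can be sketched as a toy model. This is an illustration only, not the compiler's actual code: the node names, the cost-per-conflict spill heuristic, and the tie-breaking are assumptions.

```python
def allocate(graph, cost, k):
    """Chaitin-style allocation sketch. graph: {node: neighbors},
    cost: estimated spill cost per node, k: machine registers.
    Returns (coloring, spilled_nodes)."""
    neighbors = {n: set(graph[n]) for n in graph}   # immutable view
    work = {n: set(graph[n]) for n in graph}        # mutated below
    stack, spilled = [], set()
    while work:
        # Simplify: a node with degree < k is always colorable.
        trivial = [n for n in work if len(work[n]) < k]
        if trivial:
            n = trivial[0]
        else:
            # Spill: pick the cheapest node per remaining conflict.
            n = min(work, key=lambda v: cost[v] / max(len(work[v]), 1))
            spilled.add(n)
        stack.append(n)
        for m in work[n]:
            work[m].discard(n)
        del work[n]
    # Select: unwind the stack, assigning the lowest free color.
    coloring = {}
    for n in reversed(stack):
        if n in spilled:
            continue
        used = {coloring[m] for m in neighbors[n] if m in coloring}
        coloring[n] = next(c for c in range(k) if c not in used)
    return coloring, spilled
```

For example, a triangle of three mutually conflicting quantities cannot be colored with two registers, so the node with the lowest spill cost per conflict is spilled and the remaining two are colored.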


Journal of the ACM | 1966

On the Length of Programs for Computing Finite Binary Sequences

Gregory J. Chaitin

The use of Turing machines for calculating finite binary sequences is studied from the point of view of information theory and the theory of recursive functions. Various results are obtained concerning the number of instructions in programs. A modified form of Turing machine is studied from the same point of view. An application to the problem of defining a patternless sequence is proposed in terms of the concepts here developed.


Journal of the ACM | 1975

A Theory of Program Size Formally Identical to Information Theory

Gregory J. Chaitin

A new definition of program-size complexity is made. H(A,B/C,D) is defined to be the size in bits of the shortest self-delimiting program for calculating strings A and B if one is given a minimal-size self-delimiting program for calculating strings C and D. This differs from previous definitions: (1) programs are required to be self-delimiting, i.e. no program is a prefix of another, and (2) instead of being given C and D directly, one is given a program for calculating them that is minimal in size. Unlike previous definitions, this one has precisely the formal properties of the entropy concept of information theory. For example, H(A,B) = H(A) + H(B/A) + O(1). Also, if a program of length k is assigned measure 2^-k, then H(A) = -log2(the probability that the standard universal computer will calculate A) + O(1).
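The 2^-k measure is well defined because self-delimiting programs form a prefix-free set, which satisfies the Kraft inequality. A small illustrative check (the code and example codewords are not from the paper):

```python
# Because no self-delimiting program is a prefix of another, the set
# of programs is prefix-free, and the Kraft inequality guarantees that
# assigning each program p the measure 2**(-len(p)) yields total
# probability mass at most 1.

def is_prefix_free(codes):
    """True if no codeword is a proper prefix of another."""
    return not any(a != b and b.startswith(a) for a in codes for b in codes)

def kraft_sum(codes):
    """Sum of 2^(-length) over all codewords."""
    return sum(2.0 ** -len(c) for c in codes)

# A complete prefix-free code over {0, 1}: the mass sums to exactly 1.
codes = ["0", "10", "110", "111"]
print(is_prefix_free(codes), kraft_sum(codes))
```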


Computer Languages | 1989

A methodology for the real world

Gregory J. Chaitin; Marc A. Auslander; Ashok K. Chandra; John Cocke; Martin Edward Hopkins; Peter Willy Markstein

Register allocation may be viewed as a graph coloring problem. Each node in the graph stands for a computed quantity that resides in a machine register, and two nodes are connected by an edge if the quantities interfere with each other, that is, if they are simultaneously live at some point in the object program. This approach, though mentioned in the literature, was never implemented before. Preliminary results of an experimental implementation in a PL/I optimizing compiler suggest that global register allocation approaching that of hand-coded assembly language may be attainable.
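The interference test the abstract describes can be sketched under a simplified assumption that each computed quantity is live over a single interval of program points (real compilers compute liveness per basic block; the interval model here is an illustration):

```python
def interference_graph(live_ranges):
    """live_ranges: {name: (start, end)} half-open intervals of
    program points. Two quantities interfere (get an edge) iff they
    are simultaneously live, i.e. their intervals overlap."""
    graph = {v: set() for v in live_ranges}
    items = list(live_ranges.items())
    for i, (a, (s1, e1)) in enumerate(items):
        for b, (s2, e2) in items[i + 1:]:
            if s1 < e2 and s2 < e1:     # half-open interval overlap
                graph[a].add(b)
                graph[b].add(a)
    return graph
```

Coloring this graph with k colors then corresponds to assigning the quantities to k machine registers.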


Journal of the ACM | 1969

On the Length of Programs for Computing Finite Binary Sequences: statistical considerations

Gregory J. Chaitin

An attempt is made to carry out a program (outlined in a previous paper) for defining the concept of a random or patternless, finite binary sequence, and for subsequently defining a random or patternless, infinite binary sequence to be a sequence whose initial segments are all random or patternless finite binary sequences. A definition based on the bounded-transfer Turing machine is given detailed study, but insufficient understanding of this computing machine precludes a complete treatment. A computing machine is introduced which avoids these difficulties.


Journal of the ACM | 1974

Information-Theoretic Limitations of Formal Systems

Gregory J. Chaitin

An attempt is made to apply information-theoretic computational complexity to meta-mathematics. The paper studies the number of bits of instructions that must be given to a computer for it to perform finite and infinite tasks, and also the time it takes the computer to perform these tasks. This is applied to measuring the difficulty of proving a given set of theorems, in terms of the number of bits of axioms that are assumed, and the size of the proofs needed to deduce the theorems from the axioms.


International Journal of Theoretical Physics | 1982

Gödel's Theorem and Information

Gregory J. Chaitin

Gödel's theorem may be demonstrated using arguments having an information-theoretic flavor. In such an approach it is possible to argue that if a theorem contains more information than a given set of axioms, then it is impossible for the theorem to be derived from the axioms. In contrast with the traditional proof based on the paradox of the liar, this new viewpoint suggests that the incompleteness phenomenon discovered by Gödel is natural and widespread rather than pathological and unusual.


Advances in Applied Mathematics | 1987

Incompleteness theorems for random reals

Gregory J. Chaitin

We obtain some dramatic results using statistical-mechanics and thermodynamics styles of argument concerning randomness, chaos, unpredictability, and uncertainty in mathematics. We construct an equation involving only whole numbers and addition, multiplication, and exponentiation, with the property that if one varies a parameter and asks whether the number of solutions is finite or infinite, the answer to this question is indistinguishable from the result of independent tosses of a fair coin. This yields a number of powerful Gödel incompleteness-type results concerning the limitations of the axiomatic method, in which entropy-information measures are used.


IEEE Transactions on Information Theory | 1974

Information-theoretic computation complexity

Gregory J. Chaitin

This paper attempts to describe, in nontechnical language, some of the concepts and methods of one school of thought regarding computational complexity. It applies the viewpoint of information theory to computers. This will first lead us to a definition of the degree of randomness of individual binary strings, and then to an information-theoretic version of Gödel's theorem on the limitations of the axiomatic method. Finally, we will examine in the light of these ideas the scientific method and von Neumann's views on the basic conceptual problems of biology.
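The degree of randomness of an individual string is uncomputable in general, but a general-purpose compressor provides a crude computable upper bound that already separates patterned strings from random-looking ones. A small illustration (not from the paper; zlib stands in for the ideal shortest-program size):

```python
import random
import zlib

def description_size(data: bytes) -> int:
    """Length of the zlib compression of `data`: a crude upper
    bound on the size of a description that reproduces `data`."""
    return len(zlib.compress(data, 9))

patterned = b"01" * 500                    # 1000 bytes, obvious pattern
noisy = random.Random(0).randbytes(1000)   # deterministic pseudo-random bytes

# The patterned string admits a far shorter description.
assert description_size(patterned) < description_size(noisy)
```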


Theoretical Computer Science | 1976

Information-theoretic characterizations of recursive infinite strings

Gregory J. Chaitin

Loveland and Meyer have studied necessary and sufficient conditions for an infinite binary string x to be recursive in terms of the program-size complexity, relative to n, of its n-bit prefixes x_n. Meyer has shown that x is recursive iff ∃c ∀n K(x_n/n) ≤ c, and Loveland has shown that this is false if one merely stipulates that K(x_n/n) ≤ c for infinitely many n. We strengthen Meyer's theorem. From the fact that there are few minimal-size programs for calculating a given result, we obtain a necessary and sufficient condition for x to be recursive in terms of the absolute program-size complexity of its prefixes: x is recursive iff ∃c ∀n K(x_n) ≤ K(n) + c. Again Loveland's method shows that this is no longer a sufficient condition for x to be recursive if one merely stipulates that K(x_n) ≤ K(n) + c for infinitely many n.
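The relative and absolute characterizations in the abstract can be placed side by side, with x_n denoting the n-bit prefix of x:

```latex
\[
x \text{ is recursive}
\iff \exists c\,\forall n\; K(x_n / n) \le c
\iff \exists c\,\forall n\; K(x_n) \le K(n) + c .
\]
```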
