Publication


Featured research published by Claude E. Shannon.


Physics Today | 1950

The Mathematical Theory of Communication

Claude E. Shannon; Warren Weaver; Norbert Wiener

THE recent development of various methods of modulation such as PCM and PPM, which exchange bandwidth for signal-to-noise ratio, has intensified the interest in a general theory of communication. A basis for such a theory is contained in the important papers of Nyquist [1] and Hartley [2] on this subject. In the present paper we will extend the theory to include a number of new factors, in particular the effect of noise in the channel, and the savings possible due to the statistical structure of the original message and due to the nature of the final destination of the information.

The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is, they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages. The system must be designed to operate for each possible selection, not just the one which will actually be chosen, since this is unknown at the time of design.

If the number of messages in the set is finite, then this number or any monotonic function of this number can be regarded as a measure of the information produced when one message is chosen from the set, all choices being equally likely. As was pointed out by Hartley, the most natural choice is the logarithmic function. Although this definition must be generalized considerably when we consider the influence of the statistics of the message and when we have a continuous range of messages, we will in all cases use an essentially logarithmic measure. The logarithmic measure is more convenient for various reasons.
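
The logarithmic measure described above can be made concrete with a short Python sketch (the function name is my own, not from the paper):

```python
import math

def information_bits(num_messages: int) -> float:
    """Information (in bits) gained by selecting one message from a
    set of equally likely messages: the log (base 2) of the set size."""
    return math.log2(num_messages)

# Doubling the message set adds exactly one bit -- the additivity
# that makes the logarithmic function the natural measure.
assert information_bits(2) == 1.0
assert information_bits(16) == information_bits(8) + 1.0
```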


Transactions of the American Institute of Electrical Engineers | 1938

A symbolic analysis of relay and switching circuits

Claude E. Shannon

In the control and protective circuits of complex electrical systems it is frequently necessary to make intricate interconnections of relay contacts and switches. Examples of these circuits occur in automatic telephone exchanges, industrial motor-control equipment, and in almost any circuits designed to perform complex operations automatically. In this article a mathematical analysis of certain of the properties of such networks will be made. Particular attention will be given to the problem of network synthesis. Given certain characteristics, it is required to find a circuit incorporating these characteristics. The solution of this type of problem is not unique and methods of finding those particular circuits requiring the least number of relay contacts and switch blades will be studied. Methods will also be described for finding any number of circuits equivalent to a given circuit in all operating characteristics. It will be shown that several of the well-known theorems on impedance networks have roughly analogous theorems in relay circuits. Notable among these are the delta-wye (δ-Y) and star-mesh transformations, and the duality theorem.
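
The series/parallel correspondence the paper formalizes can be sketched in Python (a minimal illustration of the idea, not Shannon's own notation): a series connection of contacts behaves as a logical AND, a parallel connection as an OR, and Boolean identities then let a circuit be simplified before it is built.

```python
from itertools import product

def series(a: bool, b: bool) -> bool:
    """Series contacts: the circuit conducts only if both are closed."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Parallel contacts: the circuit conducts if either is closed."""
    return a or b

# Distributive law, checked exhaustively over all contact states:
# X in series with (Y parallel Z) == (X series Y) parallel (X series Z).
for x, y, z in product([False, True], repeat=3):
    assert series(x, parallel(y, z)) == parallel(series(x, y), series(x, z))
```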


Journal of the Franklin Institute | 1956

Reliable circuits using less reliable relays

Edward F. Moore; Claude E. Shannon

An investigation is made of relays whose reliability can be described in simple terms by means of probabilities. It is shown that by using a sufficiently large number of these relays in the proper manner, circuits can be built which are arbitrarily reliable, regardless of how unreliable the original relays are. Various properties of these circuits are elucidated.
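
One way to see the redundancy argument is a series-parallel "quad" of four relays, simplified here to relays that can only fail open (the paper treats a more general failure model). Iterating the construction, building quads out of quads, drives the reliability toward 1 whenever the component reliability exceeds a fixed-point threshold:

```python
def quad_reliability(p: float) -> float:
    """Probability that a 2x2 series-parallel 'quad' conducts, given
    each relay conducts independently with probability p.
    Simplified sketch: relays here fail only by staying open."""
    path = p * p                  # one two-relay series path conducts
    return 1 - (1 - path) ** 2    # at least one of two parallel paths

# Iterating the construction amplifies reliability toward 1.
p = 0.9
for _ in range(4):
    p = quad_reliability(p)
assert p > 0.999
```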


IBM Journal of Research and Development | 1958

Channels with side information at the transmitter

Claude E. Shannon

In certain communication systems where information is to be transmitted from one point to another, additional side information is available at the transmitting point. This side information relates to the state of the transmission channel and can be used to aid in the coding and transmission of information. In this paper a type of channel with side information is studied and its capacity determined.


Information & Computation | 1967

Lower bounds to error probability for coding on discrete memoryless channels. II.

Claude E. Shannon; Robert G. Gallager; Elwyn R. Berlekamp

New lower bounds are presented for the minimum error probability that can be achieved through the use of block coding on noisy discrete memoryless channels. Like previous upper bounds, these lower bounds decrease exponentially with the block length N. The coefficient of N in the exponent is a convex function of the rate. From a certain rate of transmission up to channel capacity, the exponents of the upper and lower bounds coincide. Below this particular rate, the exponents of the upper and lower bounds differ, although they approach the same limit as the rate approaches zero. Examples are given and various incidental results and techniques relating to coding theory are developed. The paper is presented in two parts: the first, appearing here, summarizes the major results and treats the case of high transmission rates in detail; the second, to appear in the subsequent issue, treats the case of low transmission rates.
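
The exponential decay in block length N can be illustrated, at the price of a vanishing rate, with a repetition code over a binary symmetric channel (a much cruder scheme than the codes these bounds address, but the decay is visible):

```python
from math import comb

def repetition_error(eps: float, n: int) -> float:
    """Exact block-error probability of an n-bit repetition code with
    majority decoding over a binary symmetric channel with crossover eps:
    the probability that more than half the bits are flipped."""
    return sum(comb(n, k) * eps**k * (1 - eps)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# Error probability falls steadily (roughly exponentially) in n.
errs = [repetition_error(0.1, n) for n in (1, 11, 21, 31)]
assert all(a > b for a, b in zip(errs, errs[1:]))
```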


IEEE Transactions on Information Theory | 1956

A note on the maximum flow through a network

Peter Elias; Amiel Feinstein; Claude E. Shannon

This note discusses the problem of maximizing the rate of flow from one terminal to another, through a network which consists of a number of branches, each of which has a limited capacity. The main result is a theorem: The maximum possible flow from left to right through a network is equal to the minimum value among all simple cut-sets. This theorem is applied to solve a more general problem, in which a number of input nodes and a number of output nodes are used.
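
The theorem can be checked numerically with a standard augmenting-path algorithm (Edmonds-Karp, which postdates the paper and is not its subject). On the toy network below, the computed maximum flow equals the value of the cut separating the source from the remaining nodes:

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp: repeatedly push flow along shortest augmenting
    paths in the residual network until none remains. By the
    max-flow min-cut theorem the result equals the minimum cut value."""
    n = len(cap)
    flow = [[0] * n for _ in range(n)]
    total = 0
    while True:
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:          # BFS for a shortest path
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:                   # no augmenting path left
            return total
        bottleneck = float("inf")             # min residual along path
        v = t
        while v != s:
            u = parent[v]
            bottleneck = min(bottleneck, cap[u][v] - flow[u][v])
            v = u
        v = t
        while v != s:                         # augment, updating reverse flow
            u = parent[v]
            flow[u][v] += bottleneck
            flow[v][u] -= bottleneck
            v = u
        total += bottleneck

# Toy network: node 0 is the source, node 3 the sink.
cap = [[0, 3, 2, 0],
       [0, 0, 1, 2],
       [0, 0, 0, 3],
       [0, 0, 0, 0]]
# The cut {0} vs {1,2,3} has value 3 + 2 = 5, and so does the max flow.
assert max_flow(cap, 0, 3) == 5
```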


Proceedings of the IRE | 1948

The Philosophy of PCM

B.M. Oliver; John R. Pierce; Claude E. Shannon

Recent papers describe experiments in transmitting speech by PCM (pulse code modulation). This paper shows in a general way some of the advantages of PCM, and distinguishes between what can be achieved with PCM and with other broadband systems, such as large-index FM. The intent is to explain the various points simply, rather than to elaborate them in detail. The paper is for those who want to find out about PCM rather than for those who want to design a system. Many important factors will arise in the design of a system which are not considered in this paper.
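
The core PCM operation, uniform quantization of a sample to a fixed number of bits, can be sketched as follows (a mid-rise quantizer of my own choosing, not a design from the paper). Each added bit halves the step size, which is the source of PCM's roughly 6 dB of signal-to-quantization-noise ratio per bit:

```python
def pcm_quantize(x: float, bits: int) -> float:
    """Uniformly quantize a sample in [-1, 1) to one of 2**bits levels
    and return the reconstructed (mid-rise) value."""
    levels = 2 ** bits
    step = 2.0 / levels
    index = min(int((x + 1.0) / step), levels - 1)
    return -1.0 + (index + 0.5) * step

# Quantization error never exceeds half a step at 8 bits.
step8 = 2.0 / 2 ** 8
for x in (-0.99, -0.3, 0.0, 0.5, 0.97):
    assert abs(pcm_quantize(x, 8) - x) <= step8 / 2
```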


Information & Computation | 1957

Certain results in coding theory for noisy channels

Claude E. Shannon

In this paper we will develop certain extensions and refinements of coding theory for noisy communication channels. First, a refinement of the argument based on “random” coding will be used to obtain an upper bound on the probability of error for an optimal code in the memoryless finite discrete channel. Next, an equation is obtained for the capacity of a finite state channel when the state can be calculated at both transmitting and receiving terminals. An analysis is also made of the more complex case where the state is calculable at the transmitting point but not necessarily at the receiving point.
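
For the special case the abstract mentions, a state known at both terminals, the capacity with an i.i.d. state is the probability-weighted average of the per-state capacities. A sketch with two binary symmetric channels (the state probabilities and crossover values are illustrative, not from the paper):

```python
from math import log2

def h2(p: float) -> float:
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(eps: float) -> float:
    """Capacity of a binary symmetric channel with crossover eps."""
    return 1.0 - h2(eps)

# State 0: clean channel (eps=0.01); state 1: noisy channel (eps=0.2);
# each state occurs half the time and is known at both terminals.
states = [(0.5, 0.01), (0.5, 0.2)]
capacity = sum(prob * bsc_capacity(eps) for prob, eps in states)
assert bsc_capacity(0.5) == 0.0   # a pure-noise channel carries nothing
assert 0.0 < capacity < 1.0
```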


Proceedings of the IRE | 1953

Computers and Automata

Claude E. Shannon

This paper reviews briefly some of the recent developments in the field of automata and nonnumerical computation. A number of typical machines are described, including logic machines, game-playing machines and learning machines. Some theoretical questions and developments are discussed, such as a comparison of computers and the brain, Turing's formulation of computing machines and von Neumann's models of self-reproducing machines.


Scientific American | 1950

A Chess-Playing Machine

Claude E. Shannon

For centuries philosophers and scientists have speculated about whether or not the human brain is essentially a machine. Could a machine be designed that would be capable of “thinking”? During the past decade several large-scale electronic computing machines have been constructed which are capable of something very close to the reasoning process. These new computers were designed primarily to carry out purely numerical calculations. They perform automatically a long sequence of additions, multiplications, and other arithmetic operations at a rate of thousands per second. The basic design of these machines is so general and flexible, however, that they can be adapted to work symbolically with elements representing words, propositions, or other conceptual entities.

Collaboration


Dive into Claude E. Shannon's collaborations.

Top Co-Authors

Warren Weaver (University of Wisconsin-Madison)
Vincent W. S. Chan (Massachusetts Institute of Technology)
Irwin Jacobs (Massachusetts Institute of Technology)
Joan (Massachusetts Institute of Technology)
Norbert Wiener (Massachusetts Institute of Technology)
Peter Elias (Massachusetts Institute of Technology)
Robert G. Gallager (Massachusetts Institute of Technology)