Raymond W. Yeung
The Chinese University of Hong Kong
Publications
Featured research published by Raymond W. Yeung.
IEEE Transactions on Information Theory | 2003
Shuo-Yen Robert Li; Raymond W. Yeung; Ning Cai
Consider a communication network in which certain source nodes multicast information to other nodes on the network in a multihop fashion, where every node can pass on any of its received data to other nodes. We are interested in how fast each node can receive the complete information, or equivalently, what the information rate arriving at each node is. Allowing a node to encode its received data before passing it on, the question involves optimization of the multicast mechanisms at the nodes. Among the simplest coding schemes is linear coding, which regards a block of data as a vector over a certain base field and allows a node to apply a linear transformation to a vector before passing it on. We formulate this multicast problem and prove that linear coding suffices to achieve the optimum, which is the max-flow from the source to each receiving node.
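The classic butterfly network illustrates this result: with plain routing, the single bottleneck link can carry only one of two source bits per use, but XOR coding over GF(2), the simplest linear code, lets both receivers recover both bits and so achieves the max-flow bound of 2. A minimal sketch (the network layout and function names are illustrative, not from the paper):

```python
# Butterfly network toy model: source s sends bits b1, b2 to receivers
# t1, t2. Side paths deliver b1 to t1 and b2 to t2 directly; the single
# bottleneck link forwards the coded symbol b1 XOR b2 to both receivers.

def butterfly(b1: int, b2: int):
    coded = b1 ^ b2         # linear combination over GF(2) at the coding node
    t1 = (b1, b1 ^ coded)   # t1 recovers b2 as b1 XOR (b1 XOR b2)
    t2 = (b2 ^ coded, b2)   # t2 recovers b1 as b2 XOR (b1 XOR b2)
    return t1, t2

# Both receivers obtain both source bits in every case.
for b1 in (0, 1):
    for b2 in (0, 1):
        assert butterfly(b1, b2) == ((b1, b2), (b1, b2))
```

With routing alone, the bottleneck would have to choose between b1 and b2, so one receiver would fall short of rate 2; the linear combination serves both at once.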
international symposium on information theory | 2002
Ning Cai; Raymond W. Yeung
Recent work on network coding renders a new view on multicasting in a network. In the paradigm of network coding, the nodes in a network are allowed to encode the information received from the input links. The usual function of switching at a node is a special case of network coding. The advantage of network coding is that the full capacity of the network can be utilized. In this paper, we propose a new model which incorporates network coding and information security. Specifically, a collection of subsets of links is given, and a wiretapper is allowed to access any one (but not more than one) of these subsets without being able to obtain any information about the message transmitted. Our model includes secret sharing as a special case. We present a construction of secure linear network codes provided a certain graph-theoretic sufficient condition is satisfied.
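The secret-sharing special case is easy to illustrate: splitting a message into two shares, a uniformly random pad and the padded message, gives a secure scheme when the wiretapper may access either single share but not both — each share alone is uniformly distributed and independent of the message. A small sketch (function names are illustrative, not from the paper):

```python
import secrets

def share(message: bytes):
    # 2-out-of-2 secret sharing: r is a uniform random pad, s = message XOR r.
    # Either share by itself reveals nothing about the message.
    r = secrets.token_bytes(len(message))
    s = bytes(m ^ k for m, k in zip(message, r))
    return r, s

def reconstruct(r: bytes, s: bytes) -> bytes:
    # Combining both shares recovers the message exactly.
    return bytes(a ^ b for a, b in zip(r, s))

msg = b"network coding"
r, s = share(msg)
assert reconstruct(r, s) == msg
```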
IEEE Transactions on Information Theory | 1999
Raymond W. Yeung; Zhen Zhang
Inspired by mobile satellite communications systems, we consider a source coding system which consists of multiple sources, multiple encoders, and multiple decoders. Each encoder has access to a certain subset of the sources, each decoder has access to a certain subset of the encoders, and each decoder reconstructs a certain subset of the sources almost perfectly. The connectivity between the sources and the encoders, the connectivity between the encoders and the decoders, and the reconstruction requirements for the decoders are all arbitrary. Our goal is to characterize the admissible coding rate region. Despite the generality of the problem, we have developed an approach which enables us to study all cases on the same footing. We obtain inner and outer bounds of the admissible coding rate region in terms of Γ_N* and Γ̄_N* (the closure of Γ_N*), respectively, which are fundamental regions in the entropy space defined by Yeung (1991). So far, there has not been a full characterization of Γ_N*, so these bounds cannot be evaluated explicitly except in some special cases. Nevertheless, we obtain an alternative outer bound which can be evaluated explicitly. We show that this bound is tight for all the special cases for which the admissible coding rate region is known. The model we study in this paper is more general than all previously reported models on multilevel diversity coding, and the tools we use are new in multiuser information theory.
Foundations and Trends in Communications and Information Theory | 2005
Raymond W. Yeung; Shuo-Yen Robert Li; Ning Cai; Zhen Zhang
Store-and-forward had been the predominant technique for transmitting information through a network until its optimality was refuted by network coding theory. Network coding offers a new paradigm for network communications and has generated abundant research interest in information and coding theory, networking, switching, wireless communications, cryptography, computer science, operations research, and matrix theory.
information theory workshop | 2002
Ning Cai; Raymond W. Yeung
We introduce network error-correcting codes for error correction when a source message is transmitted to a set of receiving nodes on a network. The usual approach in existing networks, namely link-by-link error correction, is a special case of network error correction. The network generalizations of the Hamming bound and the Gilbert-Varshamov bound are derived.
IEEE Transactions on Information Theory | 2011
Ning Cai; Raymond W. Yeung
In the paradigm of network coding, the nodes in a network are allowed to encode the information received from the input links. With network coding, the full capacity of the network can be utilized. In this paper, we propose a model, called the wiretap network, that incorporates information security with network coding. In this model, a collection of subsets of the channels in the network is given, and a wiretapper is allowed to access any one (but not more than one) of these subsets without being able to obtain any information about the message transmitted. Our model includes secret sharing in classical cryptography as a special case. We present a construction of secure linear network codes that can be used provided a certain graph-theoretic condition is satisfied. We also prove the necessity of this condition for the special case that the wiretapper may choose to access any subset of channels of a fixed size. The optimality of our code construction is established for this special case. Finally, we extend our results to the scenario when the wiretapper is allowed to obtain a controlled amount of information about the message.
IEEE Transactions on Information Theory | 1997
Raymond W. Yeung
We present a framework for information inequalities, namely, inequalities involving only Shannon's information measures, for discrete random variables. A region in ℝ^(2^n − 1), denoted by Γ*, is identified to be the origin of all information inequalities involving n random variables, in the sense that all such inequalities are partial characterizations of Γ*. A product of this framework is a simple calculus for verifying all unconstrained and constrained linear information identities and inequalities which can be proved by conventional techniques. These include all information identities and inequalities of such types in the literature. As a consequence of this work, most identities and inequalities involving a definite number of random variables can now be verified by a software package called ITIP, which is available on the World Wide Web. Our work suggests the possibility of the existence of information inequalities which cannot be proved by conventional techniques. We also point out the relation between Γ* and some important problems in probability theory and information theory.
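The kind of check that ITIP automates symbolically can be illustrated numerically: every Shannon-type inequality, such as I(X;Y|Z) ≥ 0, is a linear constraint on the coordinates of the entropy vector in ℝ^(2^n − 1). A small sketch that verifies this constraint for a random joint distribution (a numerical illustration only, not the symbolic linear-programming procedure ITIP actually uses):

```python
import itertools
import math
import random

def entropy(p):
    # Shannon entropy in bits of a distribution given as {outcome: prob}.
    return -sum(x * math.log2(x) for x in p.values() if x > 0)

def marginal(joint, keep):
    # Marginalize a joint distribution onto the coordinates in `keep`.
    m = {}
    for outcome, prob in joint.items():
        key = tuple(outcome[i] for i in keep)
        m[key] = m.get(key, 0.0) + prob
    return m

# Random joint distribution on three binary variables X, Y, Z (indices 0, 1, 2).
random.seed(0)
w = [random.random() for _ in range(8)]
total = sum(w)
joint = {o: wi / total for o, wi in zip(itertools.product((0, 1), repeat=3), w)}

H = lambda *idx: entropy(marginal(joint, idx))

# I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z), a linear functional of the
# entropy vector; Shannon's basic inequalities say it is nonnegative.
i_xy_given_z = H(0, 2) + H(1, 2) - H(0, 1, 2) - H(2)
assert i_xy_given_z >= -1e-12
```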
IEEE Transactions on Information Theory | 1991
Raymond W. Yeung
The author presents a new approach to understanding the underlying mathematical structure of Shannon's information measures, which provides answers to the following two questions for any finite number of random variables. (1) For any information-theoretic identity, is there a corresponding set-theoretic identity via the formal substitution of symbols? (2) For any set-theoretic identity, is there a corresponding information-theoretic identity and, if so, in what sense? The author establishes the analogy between information theory and set theory. Therefore, each information-theoretic operation can formally be viewed as a set-theoretic operation and vice versa. This point of view, which the author believes is of fundamental importance, has apparently been overlooked in the past by information theorists. As a consequence, the I-diagram, which is a geometrical representation of the relationship among the information measures, is introduced. The I-diagram is analogous to the Venn diagram in set theory. The use of the I-diagram is discussed.
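The formal substitution in question (1) can be checked on a concrete distribution: the set-theoretic identity μ(A ∩ B) = μ(A) + μ(B) − μ(A ∪ B) corresponds, symbol for symbol, to the information-theoretic identity I(X;Y) = H(X) + H(Y) − H(X,Y). A small numerical check (the distribution is illustrative, not from the paper):

```python
import math

def H(probs):
    # Shannon entropy in bits of a probability vector.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Joint distribution of (X, Y): X is a uniform bit, Y agrees with X
# with probability 0.9.
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

HX = H([0.5, 0.5])          # mu(A)  <->  H(X)
HY = H([0.5, 0.5])          # mu(B)  <->  H(Y)
HXY = H(joint.values())     # mu(A u B)  <->  H(X,Y)

# mu(A n B) = mu(A) + mu(B) - mu(A u B)  <->  I(X;Y) = H(X) + H(Y) - H(X,Y)
I_XY = HX + HY - HXY

# Consistency with the other standard form, I(X;Y) = H(X) - H(X|Y),
# which mirrors mu(A n B) = mu(A) - mu(A \ B).
H_X_given_Y = HXY - HY
assert abs(I_XY - (HX - H_X_given_Y)) < 1e-12
assert I_XY > 0  # dependent variables share positive mutual information
```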
IEEE Transactions on Information Theory | 1997
Zhen Zhang; Raymond W. Yeung
Given n discrete random variables Ω = {X_1, ..., X_n}, associated with any subset α of {1, 2, ..., n} there is a joint entropy H(X_α), where X_α = {X_i : i ∈ α}. This can be viewed as a function defined on 2^{1,2,...,n} taking values in [0, +∞). We call this function the entropy function of Ω. The nonnegativity of the joint entropies implies that this function is nonnegative; the nonnegativity of the conditional joint entropies implies that this function is nondecreasing; and the nonnegativity of the conditional mutual informations implies that this function is submodular. These properties are the so-called basic information inequalities of Shannon's information measures. An entropy function can be viewed as a (2^n − 1)-dimensional vector whose coordinates are indexed by the nonempty subsets of the ground set {1, 2, ..., n}. As introduced by Yeung (see ibid., vol. 43, no. 6, p. 1923-34, 1997), Γ_n stands for the cone in ℝ^(2^n − 1) consisting of all vectors which have all these properties. Let Γ_n* be the set of all (2^n − 1)-dimensional vectors which correspond to the entropy functions of some sets of n discrete random variables. A fundamental information-theoretic problem is whether or not Γ̄_n* = Γ_n, where Γ̄_n* stands for the closure of the set Γ_n*. We show that Γ̄_n* is a convex cone, Γ_2* = Γ_2, Γ_3* ≠ Γ_3, but Γ̄_3* = Γ_3. For four random variables, we have discovered a conditional inequality which is not implied by the basic information inequalities of the same set of random variables. This lends evidence to the plausible conjecture that Γ̄_n* ≠ Γ_n for n > 3.
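For n = 2, the three basic properties (nonnegative, nondecreasing, submodular) can be checked directly on the entropy vector built from a joint distribution, which is a point of Γ_2* and hence of Γ_2. A toy sketch (names and the distribution are illustrative, not from the paper):

```python
import itertools
import math

def entropy(p):
    # Shannon entropy in bits of a distribution given as {outcome: prob}.
    return -sum(x * math.log2(x) for x in p.values() if x > 0)

def entropy_vector(joint, n):
    # The entropy function as a vector with coordinates indexed by the
    # nonempty subsets alpha of {0, ..., n-1}: vec[alpha] = H(X_alpha).
    vec = {}
    for r in range(1, n + 1):
        for alpha in itertools.combinations(range(n), r):
            m = {}
            for outcome, prob in joint.items():
                key = tuple(outcome[i] for i in alpha)
                m[key] = m.get(key, 0.0) + prob
            vec[alpha] = entropy(m)
    return vec

# A correlated pair of bits.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
h = entropy_vector(joint, 2)

eps = 1e-12
assert h[(0,)] >= -eps and h[(1,)] >= -eps            # nonnegative
assert h[(0, 1)] >= h[(0,)] - eps                     # nondecreasing
assert h[(0, 1)] >= h[(1,)] - eps
assert h[(0,)] + h[(1,)] >= h[(0, 1)] - eps           # submodular (n = 2)
```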
international symposium on information theory | 2007
Xijin Yan; Raymond W. Yeung; Zhen Zhang
The capacity problem for general acyclic multi-source multi-sink networks with arbitrary transmission requirements was studied by L. Song et al. (2003). Specifically, inner and outer bounds on the capacity region were derived in terms of Γ_n* and Γ̄_n*, the fundamental regions of the entropy function. In this paper, we show that by carefully bounding the constrained regions in the entropy space, we obtain an exact characterization of the capacity region, thus closing the existing gap between the above inner and outer bounds.