
Publication


Featured research published by Te Sun Han.


IEEE Transactions on Information Theory | 1981

A new achievable rate region for the interference channel

Te Sun Han; Kingo Kobayashi

A new achievable rate region for the general interference channel which extends previous results is presented and evaluated. The technique used is a generalization of superposition coding to the multivariable case. A detailed computation for the Gaussian channel case clarifies to what extent the new region improves previous ones. The capacity of a class of Gaussian interference channels is also established.
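The abstract's Gaussian computation compares the new region against simpler baselines. As a rough illustration only (this is not the Han-Kobayashi region itself), the following sketch computes the point-to-point Gaussian capacity and the rate obtained when cross-link interference is simply treated as extra noise; the powers, cross gain, and unit noise variance are illustrative assumptions.

```python
from math import log2

def gaussian_capacity(snr: float) -> float:
    """Point-to-point Gaussian channel capacity, 0.5*log2(1+SNR) bits/use."""
    return 0.5 * log2(1.0 + snr)

def rate_interference_as_noise(p_own: float, p_cross: float, gain: float) -> float:
    """Achievable rate when interference of power gain*p_cross is treated as
    additional Gaussian noise (unit noise variance assumed)."""
    return gaussian_capacity(p_own / (1.0 + gain * p_cross))

# Illustrative numbers: with power 3 and no interference, capacity is 1 bit/use;
# treating interference as noise can only reduce the rate.
print(gaussian_capacity(3.0))                      # 1.0
print(rate_interference_as_noise(3.0, 3.0, 0.5))   # below 1.0
```

Superposition schemes such as Han-Kobayashi improve on this baseline by letting each receiver decode part of the interfering message.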


IEEE Transactions on Information Theory | 1994

A general formula for channel capacity

Sergio Verdú; Te Sun Han

A formula for the capacity of arbitrary single-user channels without feedback (not necessarily information stable, stationary, etc.) is proved. Capacity is shown to equal the supremum, over all input processes, of the input-output inf-information rate, defined as the liminf in probability of the normalized information density. The key to this result is a new converse approach based on a simple new lower bound on the error probability of m-ary hypothesis tests among equiprobable hypotheses. A necessary and sufficient condition for the validity of the strong converse is given, as well as general expressions for ε-capacity.
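In symbols, the capacity statement of the abstract can be sketched in standard information-spectrum notation (the liminf is taken in probability; this is a restatement of the claim, not of the proof):

```latex
\[
C \;=\; \sup_{\mathbf{X}} \underline{I}(\mathbf{X};\mathbf{Y}),
\qquad
\underline{I}(\mathbf{X};\mathbf{Y})
\;=\; \operatorname*{p\text{-}liminf}_{n\to\infty}\;
\frac{1}{n}\log\frac{P_{Y^n\mid X^n}(Y^n\mid X^n)}{P_{Y^n}(Y^n)},
\]
```

where the supremum is over all input processes $\mathbf{X}=\{X^n\}_{n=1}^{\infty}$.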


International Symposium on Information Theory | 1993

Approximation theory of output statistics

Te Sun Han; Sergio Verdú

Given a channel and an input process we study the minimum randomness of those input processes whose output statistics approximate the original output statistics with arbitrary accuracy. We introduce the notion of resolvability of a channel, defined as the number of random bits required per channel use in order to generate an input that achieves arbitrarily accurate approximation of the output statistics for any given input process. We obtain a general formula for resolvability which holds regardless of the channel memory structure. We show that, for most channels, resolvability is equal to Shannon capacity. By-products of our analysis are a general formula for the minimum achievable (fixed-length) source coding rate of any finite-alphabet source, and a strong converse of the identification coding theorem, which holds for any channel that satisfies the strong converse of the channel coding theorem.
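One by-product mentioned above, the minimum achievable fixed-length source coding rate for a general finite-alphabet source $\mathbf{X}=\{X^n\}$, is commonly written in information-spectrum notation as the sup-entropy rate (a sketch of the statement, with the limsup taken in probability):

```latex
\[
T(\mathbf{X}) \;=\; \overline{H}(\mathbf{X})
\;=\; \operatorname*{p\text{-}limsup}_{n\to\infty}\;
\frac{1}{n}\log\frac{1}{P_{X^n}(X^n)} .
\]
```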


IEEE Transactions on Information Theory | 1998

Statistical inference under multiterminal data compression

Te Sun Han; Shun-ichi Amari

This paper surveys the literature on information-theoretic problems of statistical inference under multiterminal data compression with rate constraints. Significant emphasis is placed on three problems: (1) multiterminal hypothesis testing, (2) multiterminal parameter estimation, and (3) multiterminal pattern classification, in the cases of both positive rates and zero rates. In addition, the paper includes three new results, namely, the zero-rate converse theorems for multiterminal hypothesis testing, multiterminal parameter estimation, and multiterminal pattern classification.


IEEE Transactions on Information Theory | 1983

On source coding with side information via a multiple-access channel and related problems in multi-user information theory

Rudolf Ahlswede; Te Sun Han

A simple proof is first given of the coding theorem of Cover-El Gamal-Salehi for the multiple-access channel (MAC) with arbitrarily correlated sources (DMCS), which includes the results of Ahlswede for the MAC and of Slepian-Wolf for the DMCS and the MAC as special cases. A coding theorem is then introduced and established for another type of source-channel matching problem, namely, a system of source coding with side information via a MAC, which can be regarded as an extension of the Ahlswede-Korner-Wyner type noiseless coding system. This result is extended to a more general system with several principal sources and several side information sources subject to cross observation at the encoders in the sense of Han. The regions are shown to be optimal in special situations. Dueck's example shows that this is in general not the case for the result of Cover-El Gamal-Salehi and the present work. In another direction, the achievable rate region for the modulo-two sum source network found by Korner-Marton is improved. Finally, some ideas about a new approach to the source-channel matching problem in multi-user communication theory are presented. The basic concept is that of a correlated channel code. The approach leads to several new coding problems.


IEEE Transactions on Information Theory | 1980

A unified achievable rate region for a general class of multiterminal source coding systems

Te Sun Han; Kingo Kobayashi

A unified treatment of a large class of multiterminal noiseless source coding problems including all previously studied situations is presented. A unified achievable rate region is established for this class by a coding technique based on the typical sequence criterion. This region is tight for all the previously studied situations.


IEEE Transactions on Information Theory | 1994

Generalizing the Fano inequality

Te Sun Han; Sergio Verdú

The Fano inequality gives a lower bound on the mutual information between two random variables that take values on an M-element set, provided at least one of the random variables is equiprobable. The authors show several simple lower bounds on mutual information which do not assume such a restriction. In particular, this can be accomplished by replacing log M with the infinite-order Renyi entropy in the Fano inequality. Applications to hypothesis testing are exhibited along with bounds on mutual information in terms of the a priori and a posteriori error probabilities.
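For reference, the classical Fano-type bound for an equiprobable X on an M-element set, with error probability $P_e$ of estimating X from Y and binary entropy $h(\cdot)$, reads

```latex
\[
I(X;Y) \;\ge\; \log M \;-\; h(P_e) \;-\; P_e \log(M-1).
\]
```

Per the abstract, the equiprobability restriction can be dropped by replacing $\log M$ with the infinite-order Renyi entropy $H_\infty(X) = -\log \max_x P_X(x)$.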


IEEE Transactions on Information Theory | 1994

Universal coding for the Slepian-Wolf data compression system and the strong converse theorem

Yasutada Oohama; Te Sun Han

Universal coding for the Slepian-Wolf (1973) data compression system is considered. We demonstrate, based on a simple observation, that the error exponent given by Csiszar and Korner (1980) for the universal coding system can strictly be sharpened in general for a region of relatively higher rates. This observation also carries over to the case of lower rates outside the Slepian-Wolf region, which establishes the strong converse along with the optimal exponent.
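The Slepian-Wolf achievable-rate region referred to above is easy to evaluate numerically for a finite joint distribution. A minimal sketch (the joint pmf and rate pairs below are illustrative assumptions):

```python
from math import log2

def entropy(p):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(x * log2(x) for x in p if x > 0)

def slepian_wolf_achievable(joint, r1, r2):
    """Check a rate pair (r1, r2) against the Slepian-Wolf region for a 2D
    joint pmf: r1 >= H(X|Y), r2 >= H(Y|X), r1 + r2 >= H(X,Y)."""
    h_xy = entropy([p for row in joint for p in row])  # joint entropy H(X,Y)
    h_x = entropy([sum(row) for row in joint])          # marginal H(X)
    h_y = entropy([sum(col) for col in zip(*joint)])    # marginal H(Y)
    return (r1 >= h_xy - h_y) and (r2 >= h_xy - h_x) and (r1 + r2 >= h_xy)

# Doubly symmetric binary source with crossover 0.2 (illustrative);
# here H(X|Y) = H(Y|X) ≈ 0.722 bits and H(X,Y) ≈ 1.722 bits.
print(slepian_wolf_achievable([[0.4, 0.1], [0.1, 0.4]], 1.0, 1.0))  # True
print(slepian_wolf_achievable([[0.4, 0.1], [0.1, 0.4]], 0.5, 1.0))  # False
```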


IEEE Transactions on Information Theory | 1989

Statistical inference under multiterminal rate restrictions: a differential geometric approach

Shun-ichi Amari; Te Sun Han

A statistical inference problem for a two-terminal information source emitting mutually correlated signals X and Y is treated. Let sequences X^n and Y^n of n independent observations be encoded independently of each other into message sets M_X and M_Y at rates R_1 and R_2 per letter, respectively. This compression causes a loss of the statistical information available for testing hypotheses concerning X and Y. The loss of statistical information is evaluated as a function of the amounts R_1 and R_2 of the Shannon information. A complete solution is given in the case of asymptotically complete data compression, R_1, R_2 → 0 as n → ∞. It is shown that the differential geometry of the manifold of all probability distributions plays a fundamental role in this type of multiterminal problem connecting Shannon information and statistical information. A brief introduction to the dually coupled e-affine and m-affine connections together with e-flatness and m-flatness is given.


IEEE Transactions on Information Theory | 1995

Parameter estimation with multiterminal data compression

Te Sun Han; Shun-ichi Amari

Multiterminal estimation theory deals with a novel problem at the boundary between information theory and statistics: what amount of Fisher information can be attained under a restriction on the amount of Shannon information. The key idea is the fusion of the information-theoretic universal coding problem with the statistical maximum-likelihood parameter estimation problem. The main result is the explicit construction of maximum-likelihood estimators attainable under the rate-constrained universal coding scheme, which are shown to have a variance equal to the inverse of the Fisher information. This may be regarded as a multiterminal generalization of the usual Cramer-Rao bound. Relevant properties and examples of these maximum-likelihood estimators are also given.
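The classical benchmark being generalized here is the Cramer-Rao bound: for an unbiased estimator $\hat{\theta}$ built from n i.i.d. observations with per-sample Fisher information $J(\theta)$,

```latex
\[
\operatorname{Var}_\theta\!\bigl(\hat{\theta}\bigr) \;\ge\; \frac{1}{n\,J(\theta)} .
\]
```

Per the abstract, the multiterminal version replaces $J(\theta)$ with a Fisher information appropriate to the rate-constrained (compressed) data.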

Collaboration


Dive into Te Sun Han's collaborations.

Top Co-Authors

Shun-ichi Amari

RIKEN Brain Science Institute

Kingo Kobayashi

University of Electro-Communications

Marat V. Burnashev

Russian Academy of Sciences

Fumio Kanaya

Shonan Institute of Technology

Hiroshi Nagaoka

University of Electro-Communications

Yasutada Oohama

University of Electro-Communications
