Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Brian L. Hughes is active.

Publication


Featured research published by Brian L. Hughes.


International Symposium on Information Theory | 1995

A new universal random coding bound for the multiple-access channel

Yu-Sun Liu; Brian L. Hughes

The minimum average error probability achievable by block codes on the two-user multiple-access channel is investigated. A new exponential upper bound is found which can be achieved universally for all discrete memoryless multiple-access channels with given input and output alphabets. It is shown that the exponent of this bound is greater than or equal to those of previously known bounds. Moreover, examples are given where the new exponent is strictly larger.


IEEE Transactions on Information Theory | 1991

On the error probability of signals in additive white Gaussian noise

Brian L. Hughes

A new upper bound to the probability of error in detecting one of M equally probable signals in additive white Gaussian noise is presented. This bound is easy to calculate and can be applied to any signal set. It is always better than the union and minimum-distance bounds. Examples demonstrate the use of the bound.
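
For context on the two baseline bounds named above, here is a minimal Python sketch of the classical union bound, Pe <= (1/M) * sum_i sum_{j != i} Q(d_ij / (2*sigma)), and the minimum-distance bound, Pe <= (M - 1) * Q(d_min / (2*sigma)). The signal set and noise level below are illustrative assumptions, not parameters from the paper, and the paper's new bound is not reproduced here.

```python
# Sketch of the classical baseline bounds for ML detection of M equally
# probable signals in AWGN (illustrative signal set; not from the paper).
import numpy as np
from math import erfc, sqrt

def Q(x):
    """Gaussian tail function Q(x) = P(N(0,1) > x)."""
    return 0.5 * erfc(x / sqrt(2.0))

def union_bound(signals, sigma):
    """Pe <= (1/M) * sum_i sum_{j != i} Q(d_ij / (2*sigma))."""
    M = len(signals)
    total = 0.0
    for i in range(M):
        for j in range(M):
            if i != j:
                d = np.linalg.norm(signals[i] - signals[j])
                total += Q(d / (2.0 * sigma))
    return total / M

def min_distance_bound(signals, sigma):
    """Pe <= (M - 1) * Q(d_min / (2*sigma))."""
    M = len(signals)
    dmin = min(np.linalg.norm(signals[i] - signals[j])
               for i in range(M) for j in range(M) if i != j)
    return (M - 1) * Q(dmin / (2.0 * sigma))

# Example: 4-ary orthogonal signaling with unit energy (an assumed test case).
signals = [np.eye(4)[i] for i in range(4)]
print(union_bound(signals, sigma=0.3), min_distance_bound(signals, sigma=0.3))
```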


IEEE Transactions on Information Theory | 1988

The capacity of a vector Gaussian arbitrarily varying channel

Brian L. Hughes; Prakash Narayan

The random coding capacity of a vector Gaussian arbitrarily varying channel (VGAVC) is determined, along with a simple general method for computing this capacity. The VGAVC is a discrete-time memoryless vector channel with an input power constraint and additive Gaussian noise that is further corrupted by an additive jamming signal. The statistics of this jamming signal are unknown and can be arbitrary, subject only to a power constraint.
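
As a rough numerical illustration of the scalar (non-vector) special case, the sketch below assumes the familiar scalar Gaussian AVC random-coding capacity expression, (1/2) log2(1 + P / (sigma^2 + Lambda)), in which the worst-case jammer behaves like additional full-power Gaussian noise; the paper's general method for the vector channel is not reproduced here.

```python
# Illustrative sketch of the scalar special case only. Assumption: the
# random-coding capacity of a scalar Gaussian AVC with input power P,
# jammer power Lambda, and noise variance sigma2 is
# 0.5 * log2(1 + P / (sigma2 + Lambda)).
from math import log2

def scalar_gavc_capacity(P, Lambda, sigma2):
    """Random-coding capacity (bits per channel use), scalar Gaussian AVC."""
    return 0.5 * log2(1.0 + P / (sigma2 + Lambda))

print(scalar_gavc_capacity(P=10.0, Lambda=2.0, sigma2=1.0))  # about 1.06 bits
```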


IEEE Transactions on Information Theory | 1996

Nearly optimal multiuser codes for the binary adder channel

Brian L. Hughes; A.B. Cooper

Coding schemes for the T-user binary adder channel are investigated. Recursive constructions are given for two families of mixed-rate, multiuser codes. It is shown that these basic codes can be combined by time-sharing to yield codes approaching most rates in the T-user capacity region. In particular, the best codes constructed herein achieve a sum rate, R_1 + ... + R_T, which is higher than that of all previously reported codes for almost every T and is within 0.547 bits per channel use of the information-theoretic limit. Extensions to a T-user, Q-frequency adder channel are also discussed.
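
To make the "information-theoretic limit" concrete, the sketch below assumes it refers to the sum capacity of the noiseless T-user binary adder channel, i.e. the output entropy H(X_1 + ... + X_T) under independent, uniform binary inputs; the codes constructed in the paper are not reproduced here.

```python
# Assumed interpretation: sum capacity of the noiseless T-user binary adder
# channel = H(Binomial(T, 1/2)), the entropy of the output under i.i.d.
# uniform binary inputs.
from math import comb, log2

def adder_channel_sum_capacity(T):
    """H(Y) in bits, where Y = X_1 + ... + X_T with i.i.d. uniform binary inputs."""
    probs = [comb(T, k) / 2**T for k in range(T + 1)]
    return -sum(p * log2(p) for p in probs if p > 0)

for T in (2, 3, 10):
    print(T, adder_channel_sum_capacity(T))
# T = 2 gives 1.5 bits; the sum capacity grows only logarithmically in T.
```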


International Symposium on Information Theory | 1993

The smallest list for the arbitrarily varying channel

Brian L. Hughes

The capacity of the discrete memoryless arbitrarily varying channel (AVC) is investigated for deterministic list codes with fixed list size L. For every AVC with positive random code capacity C_r, a nonnegative integer M, called the symmetrizability, is defined. For the average probability of error criterion, it is shown that the list capacity is given by C(L) = C_r for L > M and C(L) = 0 otherwise. Bounds are given which relate C_r and M. Also, explicit formulas for C(L) are given for a family of noiseless, additive AVCs.


IEEE Transactions on Information Theory | 1996

On error exponents for arbitrarily varying channels

Brian L. Hughes; Tony G. Thomas

The minimum probability of error achievable by random codes on the arbitrarily varying channel (AVC) is investigated. New exponential error bounds are found and applied to the AVC with and without input and state constraints. Also considered is a simple subclass of random codes, called randomly modulated codes, in which encoding and decoding operations are separate from code randomization. A universal coding theorem is proved which shows the existence of randomly modulated codes that achieve the same error bounds as fully random codes for all AVCs.


IEEE Transactions on Information Theory | 1991

Exponential error bounds for random codes on Gaussian arbitrarily varying channels

Tony G. Thomas; Brian L. Hughes

The main objective is to develop exponential bounds to the best error probability achievable with random coding on the Gaussian arbitrarily varying channel (GAVC) in the only case where a (strong) capacity exists (i.e., with peak time-averaged power constraints on both the transmitter and the interference). The GAVC models a channel corrupted by thermal noise and by an unknown interfering signal of bounded power. Upper and lower bounds to the best error probability achievable on this channel with random coding are presented. The asymptotic exponents of these bounds agree in a range of rates near capacity. The exponents are universally larger than the corresponding exponents for the discrete-time Gaussian channel with the same capacity. It is further shown that the decoder can be taken to be the minimum Euclidean distance rule at all rates less than capacity.


Information Theory Workshop | 1989

An asymptotically optimal random modem and detector for robust communication

Brian L. Hughes; Murad Hizlan

Coherent communication over a waveform channel corrupted by thermal noise and by an unknown and arbitrary interfering signal of bounded power is considered. For a fixed encoder, a random modulator/demodulator (modem) and detector are derived. They asymptotically minimize the worst-case error probability as the blocklength of the encoder becomes large. This optimal modem is independent of the encoder, and the optimal detector is the standard correlation receiver. A simple upper bound to the performance of any encoder when used with the optimal modem and detector is presented. These results provide a benchmark with which the performance of spread-spectrum modems and robust detection rules can be compared.
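
Since the abstract identifies the optimal detector as the standard correlation receiver, a minimal Python sketch of that receiver follows; the signal set, noise, and interference below are illustrative assumptions, not the waveform model from the paper.

```python
# Sketch of a standard correlation receiver: pick the candidate signal that
# maximizes <received, s_i> - ||s_i||^2 / 2 (illustrative signals and noise).
import numpy as np

def correlation_receiver(received, signals):
    """Return the index of the signal maximizing the correlation metric."""
    metrics = [np.dot(received, s) - 0.5 * np.dot(s, s) for s in signals]
    return int(np.argmax(metrics))

rng = np.random.default_rng(0)
signals = [np.array([1.0, 1.0, 1.0, 1.0]), np.array([1.0, -1.0, 1.0, -1.0])]
sent = signals[1]
# Received waveform: transmitted signal plus noise plus a crude constant
# offset standing in for bounded-power interference (an assumption).
received = sent + 0.5 * rng.standard_normal(4) + 0.3
print(correlation_receiver(received, signals))  # usually decodes index 1
```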


IEEE Transactions on Communications | 1991

On the optimality of direct sequence for arbitrary interference rejection

Murad Hizlan; Brian L. Hughes

Communication over a waveform channel corrupted by additive white Gaussian noise and by an unknown and arbitrary interfering signal of bounded power is considered. For this channel, the authors derive an upper bound to the worst-case error probability of direct-sequence spread-spectrum communication with a correlation receiver, and also a lower bound applicable to any binary signaling technique and any receiver. By comparing these two bounds, it is shown that, if a small error probability is required, then no other binary signaling scheme or receiver can substantially improve upon the performance of direct sequence with a correlation receiver for the same power and bandwidth.


International Symposium on Information Theory | 1993

Capacity and Coding for T Active Users Out of M on the Collision Channel

Brian L. Hughes

The problem of designing codes for M users that permit any T < M users to transmit at the same time is investigated for the collision channel. Twelve communication problems are considered that vary according to the degree of synchronization among users, the receiver's knowledge of the active users, and the desired reliability of the code. For each problem, the T-of-M user capacity region is determined and constructive coding schemes that approach any rate in this region are presented. Applications to random-access communications are discussed.

Collaboration


Dive into Brian L. Hughes's collaborations.

Top Co-Authors

A.B. Cooper
Johns Hopkins University

Tony G. Thomas
Johns Hopkins University

Murad Hizlan
Cleveland State University

Yu-Sun Liu
National Taipei University of Technology