
Publications


Featured research published by Marc Handlery.


IEEE Transactions on Information Theory | 2004

A BEAST for prowling in trees

Irina E. Bocharova; Marc Handlery; Rolf Johannesson; Boris D. Kudryashov

When searching for convolutional codes and tailbiting codes of high complexity it is of vital importance to use fast algorithms for computing their weight spectra, which corresponds to finding low-weight paths in their code trellises. This can be efficiently done by a combined search in both forward and backward code trees. A bidirectional efficient algorithm for searching such code trees (BEAST) is presented. For large encoder memories, it is shown that BEAST is significantly more efficient than comparable algorithms. BEAST made it possible to find new convolutional and tailbiting codes that have larger free (minimum) distances than the previously best known codes with the same parameters. Tables of such codes are presented.
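
The core task the abstract points to, finding the lowest-weight path that leaves the zero state and later remerges with it, can be illustrated by a small meet-in-the-middle computation. The sketch below is not BEAST itself (which grows and prunes the forward and backward trees with more refined rules); it only shows a bidirectional free-distance search for an assumed example encoder, the standard rate R = 1/2, memory 2 convolutional encoder with generators (7, 5) in octal.

```python
from heapq import heapify, heappop, heappush

# Assumed example encoder: rate R = 1/2, memory M = 2, generators (7, 5) octal.
G = (0b111, 0b101)
M = 2


def step(state, u):
    """One trellis branch: return (next_state, branch_weight) for input bit u."""
    reg = (u << M) | state                                 # shift the new bit in
    weight = sum(bin(g & reg).count("1") % 2 for g in G)   # weight of the output pair
    return reg >> 1, weight                                 # drop the oldest register bit


def shortest_weights(seeds, neighbors):
    """Dijkstra-style search: minimum accumulated weight needed to reach each state."""
    heap = list(seeds)                                      # heap entries are (weight, state)
    heapify(heap)
    dist = {}
    while heap:
        w, s = heappop(heap)
        if s in dist:
            continue
        dist[s] = w
        for t, bw in neighbors(s):
            if t not in dist:
                heappush(heap, (w + bw, t))
    return dist


# Forward tree: paths that leave the zero state on the nonzero-input branch.
start_state, start_weight = step(0, 1)
fwd = shortest_weights([(start_weight, start_state)],
                       lambda s: [step(s, u) for u in (0, 1)])

# Backward tree: minimum weight still needed to merge back into the zero state.
reverse = {}
for s in range(1 << M):
    for u in (0, 1):
        t, w = step(s, u)
        reverse.setdefault(t, []).append((s, w))
bwd = shortest_weights([(0, 0)], lambda s: reverse.get(s, []))

# The two trees meet in an intermediate state; the cheapest meeting point
# gives the free distance (5 for this encoder).
print("d_free =", min(fwd[s] + bwd[s] for s in fwd if s in bwd))
```

The gain of a genuinely bidirectional search comes from each tree only having to account for roughly half of the weight under test, which is what keeps the trees small when the encoder memory is large.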


IEEE Transactions on Information Theory | 2002

Tailbiting codes obtained via convolutional codes with large active distance-slopes

Irina E. Bocharova; Marc Handlery; Rolf Johannesson; Boris D. Kudryashov

The slope of the active distances is an important parameter when investigating the error-correcting capability of convolutional codes and the distance behavior of concatenated convolutional codes. The slope of the active distances is equal to the minimum average weight cycle in the state-transition diagram of the encoder. A general upper bound on the slope depending on the free distance of the convolutional code and new upper bounds on the slope of special classes of binary convolutional codes are derived. Moreover, a search technique, resulting in new tables of rate R=1/2 and rate R=1/3 convolutional encoders with large memories and large active distance-slopes, is presented. Furthermore, we show that convolutional codes with large slopes can be used to obtain new tailbiting block codes with large minimum distances. Tables of rate R=1/2 and rate R=1/3 tailbiting codes with larger minimum distances than the best previously known quasi-cyclic codes are given. Two new tailbiting codes also have larger minimum distances than the best previously known binary linear block codes with the same size and length. One of them is also superior in terms of minimum distance to any previously known binary nonlinear block code with the same set of parameters.
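
Since the abstract identifies the slope with the minimum average cycle weight in the encoder's state-transition diagram, that quantity is easy to compute directly for a small encoder. The sketch below is an illustration under assumptions not stated in the abstract: it uses the rate R = 1/2 (7, 5) encoder as an example, drops the weight-0 self-loop at the zero state (whose average weight would trivially be zero), and applies Karp's minimum mean cycle algorithm to what remains.

```python
import math

# Assumed example encoder: rate R = 1/2, memory M = 2, generators (7, 5) octal.
G = (0b111, 0b101)
M = 2
N_STATES = 1 << M

# State-transition diagram as weighted edges (state, next_state, branch_weight),
# excluding the weight-0 self-loop at the zero state.
edges = []
for s in range(N_STATES):
    for u in (0, 1):
        reg = (u << M) | s
        weight = sum(bin(g & reg).count("1") % 2 for g in G)
        if not (s == 0 and u == 0):
            edges.append((s, reg >> 1, weight))


def min_cycle_mean(n, edges, source=0):
    """Karp's algorithm: minimum average edge weight over all directed cycles.
    Assumes the optimal cycle is reachable from `source`, which holds here
    because the encoder state diagram is strongly connected."""
    INF = math.inf
    # d[k][v] = minimum weight of a walk with exactly k edges from source to v
    d = [[INF] * n for _ in range(n + 1)]
    d[0][source] = 0.0
    for k in range(1, n + 1):
        for u, v, w in edges:
            if d[k - 1][u] + w < d[k][v]:
                d[k][v] = d[k - 1][u] + w
    best = INF
    for v in range(n):
        if d[n][v] == INF:
            continue
        ratios = [(d[n][v] - d[k][v]) / (n - k) for k in range(n) if d[k][v] < INF]
        if ratios:
            best = min(best, max(ratios))
    return best


# Minimum average weight per trellis section over all nontrivial cycles;
# this script prints 0.5 for the (7, 5) example.
print("slope =", min_cycle_mean(N_STATES, edges))
```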


IEEE Transactions on Information Theory | 2005

BEAST decoding of block codes obtained via convolutional codes

Irina E. Bocharova; Marc Handlery; Rolf Johannesson; Boris D. Kudryashov

BEAST is a bidirectional efficient algorithm for searching trees. In this correspondence, BEAST is extended to maximum-likelihood (ML) decoding of block codes obtained via convolutional codes. First, it is shown by simulations that the decoding complexity of BEAST is significantly less than that of the Viterbi algorithm. Then asymptotic upper bounds on the BEAST decoding complexity for three important ensembles of codes are derived. They verify BEAST's high efficiency compared to other algorithms. For high rates, the new asymptotic bound for the best ensemble is in fact better than previously known bounds.


IEEE Transactions on Communications | 2003

Boosting the error performance of suboptimal tailbiting decoders

Marc Handlery; Rolf Johannesson; Victor V. Zyablov

Tailbiting is an attractive method to terminate convolutional codes without reducing the code rate. Maximum-likelihood and exact a posteriori probability decoding of tailbiting codes implies, however, a large computational complexity. Therefore, suboptimal decoding methods are often used in practical coding schemes. It is shown that suboptimal decoding methods work better when the slope of the active distances of the generating convolutional encoder is large. Moreover, it is shown that considering quasi-cyclic shifts of the received channel output can improve the performance of suboptimal tailbiting decoders. The findings are most relevant to tailbiting codes where the number of states is not small relative to the block length.
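
The second observation, that re-decoding quasi-cyclic shifts of the received word can help a suboptimal tailbiting decoder, can be phrased as a generic wrapper. The outline below is hypothetical and deliberately decoder-agnostic: `decode` and `metric` stand for whatever suboptimal tailbiting decoder and path metric are in use (neither is specified by the abstract), and `n_out` is the number of code symbols produced per trellis section.

```python
def decode_with_shifts(received, decode, metric, n_out):
    """Try every quasi-cyclic shift of the received word (in steps of one
    trellis section), decode each shift with the supplied suboptimal tailbiting
    decoder, keep the candidate whose metric against the shifted word is best,
    and rotate that candidate back into the original symbol positions."""
    sections = len(received) // n_out          # tailbiting length in trellis sections
    best_score, best_codeword = None, None
    for t in range(sections):
        k = t * n_out
        shifted = received[k:] + received[:k]  # cyclic left shift by t sections
        candidate = decode(shifted)
        score = metric(shifted, candidate)
        if best_score is None or score > best_score:
            best_score = score
            # undo the shift so the codeword lines up with the received word
            best_codeword = candidate[-k:] + candidate[:-k] if k else candidate
    return best_codeword
```

Because a tailbiting trellis has no distinguished starting section, a cyclic shift of the received word by whole sections again corresponds to a codeword of the same code, which is what makes this trick available at all.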


International Symposium on Information Theory | 2002

How to efficiently find the minimum distance of tailbiting codes

Irina E. Bocharova; Boris D. Kudryashov; Marc Handlery; Rolf Johannesson

A bidirectional algorithm for computing spectral coefficients of convolutional and tailbiting codes is presented and used to obtain new codes.
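
The bidirectional algorithm itself is not reproduced here, but the quantity it computes can be cross-checked by brute force when the parameters are tiny. The sketch below makes assumptions purely for illustration (the rate R = 1/2 (7, 5) feedforward encoder and a short tailbiting length L = 8): it encodes every nonzero information word with the tailbiting start state set to the last M information bits and takes the minimum codeword weight.

```python
from itertools import product

G = (0b111, 0b101)   # assumed example: rate R = 1/2 (7, 5) feedforward encoder
M = 2                # encoder memory
L = 8                # tailbiting length (information bits / trellis sections)


def branch(state, u):
    """One trellis branch: (next_state, list of output bits) for input bit u."""
    reg = (u << M) | state
    out = [bin(g & reg).count("1") % 2 for g in G]
    return reg >> 1, out


def tb_encode(info):
    """Tailbiting encoding for a feedforward encoder: start in the state formed
    by the last M information bits so that the start and end states agree."""
    state = 0
    for j in range(M):
        state |= info[-1 - j] << (M - 1 - j)
    code = []
    for u in info:
        state, out = branch(state, u)
        code.extend(out)
    return code


# Exhaustive search over all nonzero information words; feasible only for tiny L,
# which is exactly why fast spectral-search algorithms are needed in practice.
d_min = min(sum(tb_encode(info)) for info in product((0, 1), repeat=L) if any(info))
print("tailbiting minimum distance:", d_min)
```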


Problems of Information Transmission | 2002

Encoder and Distance Properties of Woven Convolutional Codes with One Tailbiting Component Code

Marc Handlery; Rolf Johannesson; Viktor V. Zyablov

Woven convolutional codes with one tailbiting component code are studied and their generator matrices are given. It is shown that, if the constituent encoders are identical, a woven convolutional encoder with an outer convolutional warp and one inner tailbiting encoder (WIT) generates the same code as a woven convolutional encoder with one outer tailbiting encoder and an inner convolutional warp (WOT). However, for rate R_tb < 1 tailbiting encoders, the WOT cannot be an encoder realization with a minimum number of delay elements. Lower bounds on the free distance and active distances of woven convolutional codes with a tailbiting component code are given. These bounds are equal to those for woven codes consisting exclusively of unterminated convolutional codes. However, for woven convolutional codes with one tailbiting component code, the conditions for the bounds to hold are less strict.


Problems of Information Transmission | 2002

A Distance Measure Tailored to Tailbiting Codes

Marc Handlery; Stefan Höst; Rolf Johannesson; Victor V. Zyablov

The error-correcting capability of tailbiting codes generated by convolutional encoders is described. In order to obtain a description beyond what the minimum distance d_min of the tailbiting code implies, the active tailbiting segment distance is introduced. The description of correctable error patterns via active distances leads to an upper bound on the decoding block error probability of tailbiting codes. The necessary length of a tailbiting code so that its minimum distance is equal to the free distance d_free of the convolutional code encoded by the same encoder is easily obtained from the active tailbiting segment distance. This is useful when designing and analyzing concatenated convolutional codes with component codes that are terminated using the tailbiting method. Lower bounds on the active tailbiting segment distance and an upper bound on the ratio between the tailbiting length and memory of the convolutional generator matrix such that d_min equals d_free are derived. Furthermore, affine lower bounds on the active tailbiting segment distance suggest that good tailbiting codes are generated by convolutional encoders with large active-distance slopes.


Problems of Information Transmission | 2002

Distance Approach to Window Decoding

Marc Handlery; Rolf Johannesson; Viktor V. Zyablov

In convolutional coding, code sequences have infinite length; thus, a maximum-likelihood decoder implies an infinite delay. Due to memory and delay constraints in practical coding schemes, convolutional codes are often either terminated or decoded by a window decoder. When a window decoder is used, the convolutional code sequence is not terminated; instead, the window decoder estimates information digits after receiving a finite number of noise-corrupted code symbols, thereby keeping the decoding delay short. An exact characterization of the error-correcting capability of window decoded convolutional codes is given by using active distances of convolutional codes.
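
As an illustration of the fixed decoding delay the abstract describes, here is a minimal hard-decision Viterbi window decoder for an assumed example code, the rate R = 1/2, memory 2 (7, 5) encoder, with an illustrative window of 10 trellis sections. The generic mechanism is what matters: after each newly received pair of code bits, the decoder traces back through its survivor memory and releases the information bit that is now a full window old, so the delay never grows with the length of the unterminated sequence. This sketches window decoding in general, not the error-pattern characterization derived in the paper.

```python
G = (0b111, 0b101)   # assumed example: rate R = 1/2, memory 2, generators (7, 5) octal
M = 2
TAU = 10             # window length in trellis sections, an illustrative choice


def branch(state, u):
    """One trellis branch: (next_state, (c1, c2)) for input bit u."""
    reg = (u << M) | state
    out = tuple(bin(g & reg).count("1") % 2 for g in G)
    return reg >> 1, out


def encode(bits):
    state, code = 0, []
    for u in bits:
        state, out = branch(state, u)
        code.extend(out)
    return code


def window_decode(received):
    """Hard-decision Viterbi window decoding of an unterminated code sequence:
    information bit t is released once section t + TAU - 1 has been processed."""
    n_states = 1 << M
    metric = [0] + [10**9] * (n_states - 1)          # the encoder starts in state 0
    history, decided = [], []
    sections = [received[i:i + 2] for i in range(0, len(received), 2)]
    for t, r in enumerate(sections):
        new_metric = [10**9] * n_states
        survivors = [None] * n_states
        for s in range(n_states):
            for u in (0, 1):
                ns, out = branch(s, u)
                m = metric[s] + (out[0] != r[0]) + (out[1] != r[1])
                if m < new_metric[ns]:
                    new_metric[ns], survivors[ns] = m, (s, u)
        metric = new_metric
        history.append(survivors)
        if t + 1 >= TAU:                              # window is full: release the oldest bit
            s = min(range(n_states), key=lambda v: metric[v])
            for k in range(t, t - TAU, -1):           # trace back TAU sections
                s, u = history[k][s]
            decided.append(u)
    return decided


info = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1]
received = encode(info)
received[5] ^= 1                                      # a single channel error
# The decoder releases len(info) - TAU + 1 bits; with one channel error they
# should match the corresponding prefix of the transmitted information bits.
print("sent   :", info[:len(info) - TAU + 1])
print("decoded:", window_decode(received))
```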


IEEE Transactions on Information Theory | 2004

On error exponents for woven convolutional codes with one tailbiting component code

Marc Handlery; Rolf Johannesson; Victor V. Zyablov

An error exponent for woven convolutional codes (WCC) with one tailbiting component code is derived. This error exponent is compared with that of the original WCC. It is shown that for WCC with outer warp, a better error exponent is obtained if the inner code is terminated with the tailbiting method. Furthermore, it is shown that the decoding error probability decreases exponentially with the square of the memory of the constituent convolutional encoders, while the decoding complexity grows exponentially only with the memory.


International Symposium on Information Theory | 2002

Convolutional codes with large slopes yield better tailbiting codes

Irina E. Bocharova; Boris D. Kudryashov; Marc Handlery; Rolf Johannesson

Upper bounds on the slope of the active distances for convolutional codes are given. Convolutional codes with large slopes are used to obtain tables of new tailbiting block codes.

Collaboration


Dive into Marc Handlery's collaborations.

Top Co-Authors

Boris D. Kudryashov
Saint Petersburg State University

Irina E. Bocharova
Saint Petersburg State University

Victor V. Zyablov
Russian Academy of Sciences

James L. Massey
École Polytechnique Fédérale de Lausanne