Publication


Featured research published by Rolf Johannesson.


IEEE Transactions on Information Theory | 1993

A linear algebra approach to minimal convolutional encoders

Rolf Johannesson; Zhe-Xian Wan

The authors review the work of G.D. Forney, Jr., on the algebraic structure of convolutional encoders, upon which some new results regarding minimal convolutional encoders rest. An example is given of a basic convolutional encoding matrix whose number of abstract states is minimal over all equivalent encoding matrices. However, this encoding matrix can be realized with a minimal number of memory elements neither in controller canonical form nor in observer canonical form. Thus, this encoding matrix is not minimal according to Forney's definition of a minimal encoder. To resolve this difficulty, the following three minimality criteria are introduced: minimal-basic encoding matrix, minimal encoding matrix, and minimal encoder. It is shown that all minimal-basic encoding matrices are minimal and that there exist minimal encoding matrices that are not minimal-basic. Several equivalent conditions are given for an encoding matrix to be minimal. It is proven that the constraint lengths of two equivalent minimal-basic encoding matrices are equal up to a rearrangement. All results are proven using only elementary linear algebra.
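
To make the realization terminology concrete, here is a minimal sketch (my own example, not taken from the paper) of a rate-1/2 feedforward encoding matrix G(D) = (1 + D + D^2, 1 + D^2) realized in controller canonical form; the number of delay elements in this realization equals the constraint length of the row.

# Hedged sketch: rate-1/2 feedforward convolutional encoder in controller
# canonical form for G(D) = (1 + D + D^2, 1 + D^2); illustrative only.
def encode(bits, g=((1, 1, 1), (1, 0, 1))):
    """Encode a binary sequence; each generator lists its taps, lowest degree first."""
    m = max(len(gi) for gi in g) - 1      # memory = number of delay elements
    state = [0] * m                       # shift register of the controller canonical form
    out = []
    for u in bits:
        reg = [u] + state                 # current input followed by the stored past inputs
        for gi in g:
            out.append(sum(c & x for c, x in zip(gi, reg)) % 2)
        state = reg[:m]                   # shift
    return out

print(encode([1, 0, 1, 1]))               # interleaved output bits of the two generators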


IEEE Transactions on Information Theory | 1999

Active distances for convolutional codes

Stefan Höst; Rolf Johannesson; K.Sh. Zigangirov; Victor V. Zyablov

A family of active distance measures for general convolutional codes is defined. These distances are generalizations of the extended distances introduced by Thommesen and Justesen (1983) for unit-memory convolutional codes. It is shown that the error-correcting capability of a convolutional code is determined by the active distances. The ensemble of periodically time-varying convolutional codes is defined and lower bounds on the active distances are derived for this ensemble. The active distances are very useful in the analysis of concatenated convolutional encoders.
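
As a rough numeric illustration (this is not the paper's definition of the active distances, only a related trellis quantity), the sketch below computes the minimum Hamming weight of detours of a given length, i.e. paths that leave the all-zero state and return to it for the first time after exactly that many branches, for a small rate-1/2 encoder chosen arbitrarily.

# Hedged sketch: minimum weight of first-return detours of a given length in the
# trellis of the memory-2, rate-1/2 encoder with generators 1 + D + D^2 and 1 + D^2.
G = ((1, 1, 1), (1, 0, 1))
M = max(len(g) for g in G) - 1

def step(state, u):
    """One trellis branch: returns (next_state, branch_weight)."""
    reg = (u,) + state
    w = sum(sum(c & x for c, x in zip(g, reg)) % 2 for g in G)
    return reg[:M], w

def first_return_weight(length):
    """Minimum weight over paths leaving the zero state at t = 0 and first returning at t = length."""
    zero = (0,) * M
    s, w = step(zero, 1)                  # a detour must start with a nonzero input
    frontier = {s: w}
    for t in range(2, length + 1):
        nxt = {}
        for state, w in frontier.items():
            for u in (0, 1):
                s2, bw = step(state, u)
                if (s2 == zero) != (t == length):
                    continue              # return exactly at t = length, not before
                if w + bw < nxt.get(s2, float("inf")):
                    nxt[s2] = w + bw
        frontier = nxt
    return frontier.get(zero)

for L in range(3, 8):
    print(L, first_return_weight(L))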


IEEE Transactions on Information Theory | 2002

Woven convolutional codes I: Encoder properties

Stefan Höst; Rolf Johannesson; Victor V. Zyablov

Encoders for convolutional codes with large free distances can be constructed by combining several less powerful convolutional encoders. This paper is devoted to constructions in which the constituent convolutional codes are woven together in a manner that resembles the structure of a fabric. The general construction is called a twill, and it is described together with two special cases, viz., woven convolutional encoders with outer warp and with inner warp. The woven convolutional encoders inherit many of their structural properties, such as minimality and catastrophicity, from their constituent encoders. For all three types of woven convolutional codes, upper and lower bounds on their free distances, as well as lower bounds on the active distances of their encoders, are derived.
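
The sketch below is only a toy serial concatenation in the spirit of the fabric picture, with parameters and interleaving chosen by me; it makes no claim to reproduce the paper's definitions. Several identical outer encoders work in parallel on separate information subsequences ("warp threads"), their output rows are read column by column, and the resulting sequence is re-encoded by an inner encoder.

# Toy sketch only (not the paper's construction): parallel outer encoders whose
# outputs are interleaved column-wise and then re-encoded by an inner encoder.
def conv_encode(bits, g):
    """Feedforward convolutional encoding; g is a tuple of generators, taps lowest degree first."""
    m = max(len(gi) for gi in g) - 1
    state = [0] * m
    out = []
    for u in bits:
        reg = [u] + state
        out.extend(sum(c & x for c, x in zip(gi, reg)) % 2 for gi in g)
        state = reg[:m]
    return out

G_OUTER = ((1, 1, 1), (1, 0, 1))      # rate-1/2 outer code (arbitrary choice)
G_INNER = ((1, 1), (1, 0, 1))         # rate-1/2 inner code (arbitrary choice)
L_O = 3                               # number of parallel outer encoders

info = [1, 0, 1, 1, 0, 0, 1, 0, 1]
rows = [conv_encode(info[i::L_O], G_OUTER) for i in range(L_O)]   # one output row per outer encoder
warp = [bit for col in zip(*rows) for bit in col]                 # read the "fabric" column-wise
print(conv_encode(warp, G_INNER))                                 # inner encoding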


IEEE Transactions on Information Theory | 2004

A BEAST for prowling in trees

Irina E. Bocharova; Marc Handlery; Rolf Johannesson; Boris D. Kudryashov

When searching for convolutional codes and tailbiting codes of high complexity it is of vital importance to use fast algorithms for computing their weight spectra, which corresponds to finding low-weight paths in their code trellises. This can be efficiently done by a combined search in both forward and backward code trees. A bidirectional efficient algorithm for searching such code trees (BEAST) is presented. For large encoder memories, it is shown that BEAST is significantly more efficient than comparable algorithms. BEAST made it possible to find new convolutional and tailbiting codes that have larger free (minimum) distances than the previously best known codes with the same parameters. Tables of such codes are presented.
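
For orientation, the sketch below computes the free distance of a small convolutional code by a plain one-directional lowest-weight-detour search in the trellis; this is the brute-force baseline for the quantity that BEAST finds far more efficiently by growing forward and backward code trees and matching them, and it is not an implementation of BEAST itself.

# Hedged baseline sketch: free distance of the memory-2, rate-1/2 code with
# generators 1 + D + D^2 and 1 + D^2, found by a Dijkstra search over the trellis.
import heapq

G = ((1, 1, 1), (1, 0, 1))
M = max(len(g) for g in G) - 1

def step(state, u):
    reg = (u,) + state
    w = sum(sum(c & x for c, x in zip(g, reg)) % 2 for g in G)
    return reg[:M], w

def free_distance():
    """Minimum weight of a path that leaves the all-zero state and later returns to it."""
    zero = (0,) * M
    start, w0 = step(zero, 1)             # force the path to diverge from the zero state
    dist = {start: w0}
    heap = [(w0, start)]
    while heap:
        w, s = heapq.heappop(heap)
        if s == zero:
            return w                      # zero state popped: lowest-weight return found
        if w > dist.get(s, w):
            continue                      # stale heap entry
        for u in (0, 1):
            s2, bw = step(s, u)
            if w + bw < dist.get(s2, float("inf")):
                dist[s2] = w + bw
                heapq.heappush(heap, (w + bw, s2))

print(free_distance())                    # 5 for this code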


IEEE Transactions on Information Theory | 2012

Searching for Voltage Graph-Based LDPC Tailbiting Codes With Large Girth

Irina E. Bocharova; Florian Hug; Rolf Johannesson; Boris D. Kudryashov; Roman V. Satyukov

The relation between parity-check matrices of quasi-cyclic (QC) low-density parity-check (LDPC) codes and biadjacency matrices of bipartite graphs supports searching for powerful LDPC block codes. Using the principle of tailbiting, compact representations of bipartite graphs based on convolutional codes can be found. Bounds on the girth and the minimum distance of LDPC block codes constructed in such a way are discussed. Algorithms for searching iteratively for LDPC block codes with large girth and for determining their minimum distance are presented. Constructions based on all-one matrices, Steiner Triple Systems, and QC block codes are introduced. Finally, new QC regular LDPC block codes with girth up to 24 are given.
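
To make the objects concrete, here is a small hedged sketch with arbitrarily chosen parameters (not taken from the paper): a quasi-cyclic parity-check matrix is assembled from circulant permutation matrices specified by an exponent matrix of shifts, and the girth of the corresponding Tanner graph is found by breadth-first search from every node.

# Hedged sketch: build a QC-LDPC parity-check matrix from circulant shifts and
# compute the girth of its Tanner graph (parameters are illustrative only).
import numpy as np
from collections import deque

def circulant(shift, size):
    """The size x size identity matrix with its columns cyclically shifted by `shift`."""
    return np.roll(np.eye(size, dtype=int), shift, axis=1)

def qc_parity_check(shifts, size):
    """Assemble a QC parity-check matrix from a matrix of circulant shifts."""
    return np.block([[circulant(s, size) for s in row] for row in shifts])

def tanner_girth(H):
    """Girth of the Tanner graph of H (check nodes 0..r-1, variable nodes r..r+n-1)."""
    r, n = H.shape
    adj = [[] for _ in range(r + n)]
    for i in range(r):
        for j in range(n):
            if H[i, j]:
                adj[i].append(r + j)
                adj[r + j].append(i)
    girth = float("inf")
    for start in range(r + n):                 # BFS from every node
        dist, parent = {start: 0}, {start: -1}
        queue = deque([start])
        while queue:
            v = queue.popleft()
            for w in adj[v]:
                if w not in dist:
                    dist[w], parent[w] = dist[v] + 1, v
                    queue.append(w)
                elif w != parent[v]:           # non-tree edge: dist[v] + dist[w] + 1 bounds a cycle length
                    girth = min(girth, dist[v] + dist[w] + 1)
    return girth

shifts = [[0, 1, 2],                           # hypothetical 2 x 3 exponent matrix
          [0, 2, 4]]
H = qc_parity_check(shifts, 7)                 # circulant size 7
print(H.shape, tanner_girth(H))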


IEEE Transactions on Information Theory | 1998

Some structural properties of convolutional codes over rings

Rolf Johannesson; Zhe-Xian Wan; Emma Wittenmark

Convolutional codes over rings have been motivated by phase-modulated signals. Some structural properties of the generator matrices of such codes are presented. Successively stronger notions of the invertibility of generator matrices are studied, and a new condition for a convolutional code over a ring to be systematic is given and shown to be equivalent to a condition given by Massey and Mittelholzer (1990). It is shown that a generator matrix that can be decomposed into a direct sum is basic, minimal, and noncatastrophic if and only if all generator matrices for the constituent codes are basic, minimal, and noncatastrophic, respectively. It is also shown that if a systematic generator matrix can be decomposed into a direct sum, then all generator matrices of the constituent codes are systematic, but that the converse does not hold. Some results on convolutional codes over Z_{p^e} are obtained.
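
For notation only (my example, not the paper's): a rate-1/2 convolutional encoder over the ring Z_4 = Z_{2^2} maps an information sequence with symbols in {0, 1, 2, 3} through two generator polynomials, with all arithmetic taken modulo 4.

# Hedged sketch: rate-1/2 convolutional encoding over Z_4 with arbitrarily chosen generators.
def ring_conv_encode(seq, g, q=4):
    """Encode over Z_q; g is a tuple of generator polynomials, taps lowest degree first.
    The output is truncated to the length of the input sequence."""
    m = max(len(gi) for gi in g) - 1
    state = [0] * m
    out = []
    for u in seq:
        reg = [u] + state
        out.append(tuple(sum(c * x for c, x in zip(gi, reg)) % q for gi in g))
        state = reg[:m]
    return out

G_RING = ((1, 2), (3, 0, 1))          # g1(D) = 1 + 2D, g2(D) = 3 + D^2 over Z_4
print(ring_conv_encode([1, 3, 2, 0], G_RING))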


IEEE Transactions on Information Theory | 1999

Optimal and near-optimal encoders for short and moderate-length tail-biting trellises

Per Ståhl; John B. Anderson; Rolf Johannesson

The results of an extensive search for short and moderate-length polynomial convolutional encoders for time-invariant tail-biting representations of block codes at rates R = 1/4, 1/3, 1/2, and 2/3 are reported. The tail-biting representations found are typically as good as the best known block codes.
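
As background on how a polynomial (feedforward) convolutional encoder yields a tail-biting representation of a block code, here is a minimal sketch with an arbitrarily chosen encoder: the shift register is preloaded with the last m information bits, so the encoder ends in the state it started from and the trellis path bites its own tail.

# Hedged sketch: tailbiting encoding with a feedforward convolutional encoder.
def tailbiting_encode(info, g):
    """Tailbiting encoding; g is a tuple of generators, taps lowest degree first."""
    m = max(len(gi) for gi in g) - 1
    state = list(reversed(info[-m:]))        # preload with the last m information bits
    start = list(state)
    out = []
    for u in info:
        reg = [u] + state
        out.extend(sum(c & x for c, x in zip(gi, reg)) % 2 for gi in g)
        state = reg[:m]
    assert state == start                    # the path ends where it began
    return out

G = ((1, 1, 1), (1, 0, 1))                   # memory-2, rate-1/2 encoder
print(tailbiting_encode([1, 0, 1, 1, 0, 1], G))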


IEEE Transactions on Information Theory | 2002

A note on tailbiting codes and their feedback encoders

Per Ståhl; John B. Anderson; Rolf Johannesson

Tailbiting codes encoded by feedback convolutional encoders are studied. A condition for when tailbiting works is given, and it is described how the encoder starting state can be obtained for feedback encoders in both controller and observer canonical forms. Finally, results from a search for systematic feedback encoders that encode tailbiting codes with good decoding bit error probabilities are presented.
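
One standard way to make the starting-state computation concrete (a textbook state-space argument, not necessarily the paper's exact formulation) is to write the feedback encoder in state-space form over GF(2) and solve for the state that reproduces itself after the tailbiting length N:

\sigma_{t+1} = A\sigma_t + Bu_t, \qquad \sigma_N = A^N\sigma_0 + \sum_{t=0}^{N-1} A^{N-1-t}Bu_t .

Tailbiting requires \sigma_0 = \sigma_N, i.e.

(A^N + I)\,\sigma_0 = \sum_{t=0}^{N-1} A^{N-1-t}Bu_t \quad \text{over GF(2)},

which has a unique solution \sigma_0 for every information sequence exactly when A^N + I is nonsingular. For a feedforward encoder A is nilpotent, so A^N + I is always invertible; for feedback encoders the invertibility depends on N, which is the kind of condition the abstract refers to.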


IEEE Transactions on Information Theory | 2002

Tailbiting codes: bounds and search results

Irina E. Bocharova; Rolf Johannesson; Boris D. Kudryashov; Per Ståhl

Tailbiting trellis representations of linear block codes with an arbitrary sectionalization of the time axis are studied. The notions of regular and irregular tailbiting codes are introduced and their maximal state complexities are lower-bounded. The asymptotic behavior of the derived bound is investigated. Furthermore, for regular tailbiting codes the product state complexity is lower-bounded. Tables of new tailbiting trellis representations of linear block codes of rates 1/2, 1/3, and 1/4 are presented. Almost all of the trellises found are optimal in the sense of the new bound on the state complexity, and for most codes with nonoptimal trellises there exist time-varying trellises that are optimal. Five of the newly found tailbiting codes are better than the previously known linear codes with the same parameters. Four of them are also superior to any previously known nonlinear code with the same parameters. In addition, more than 40 other quasi-cyclic codes have been found that improve on the parameters of previously known quasi-cyclic codes.


Transactions on Emerging Telecommunications Technologies | 2004

BEAST decoding for block codes

Irina E. Bocharova; Rolf Johannesson; Boris D. Kudryashov; Maja Loncar

BEAST is a Bidirectional Efficient Algorithm for Searching code Trees. In this paper, it is used for decoding block codes over a binary-input memoryless channel. If no constraints are imposed on the decoding complexity (in terms of the number of visited nodes during the search), BEAST performs maximum-likelihood (ML) decoding. At the cost of a negligible performance degradation, BEAST can be constrained to perform almost-ML decoding with significantly reduced complexity. The benchmark for the complexity assessment is the number of nodes visited by the Viterbi algorithm operating on the minimal trellis of the code. The decoding complexity depends on the trellis structure of a given code, which is illustrated by three different forms of the generator matrix for the (24, 12, 8) Golay code. Simulation results that assess the error-rate performance and the decoding complexity of BEAST are presented for two longer codes.

Collaboration


Dive into Rolf Johannesson's collaborations.

Top Co-Authors


Boris D. Kudryashov

Saint Petersburg State University


Irina E. Bocharova

Saint Petersburg State University


Victor V. Zyablov

Russian Academy of Sciences
