Network


Latest external collaborations at the country level. Click on a dot to dive into the details.

Hotspot


Dive into the research topics where Frank Kienle is active.

Publication


Featured research published by Frank Kienle.


Design, Automation and Test in Europe (DATE) | 2005

A Synthesizable IP Core for DVB-S2 LDPC Code Decoding

Frank Kienle; Torben Brack; Norbert Wehn

The new standard for digital video broadcast, DVB-S2, features low-density parity-check (LDPC) codes as its channel coding scheme. The codes are defined for various code rates with a block size of 64800 bits, which allows transmission close to the theoretical limits. The decoding of LDPC codes is an iterative process. For DVB-S2, about 300000 messages are processed and reordered in each of the 30 iterations. These huge data processing and storage requirements are a real challenge for the decoder hardware realization, which has to fulfill the specified throughput of 255 Mbit/s for base station applications. In this paper we show, to the best of our knowledge, the first published LDPC decoder IP core for the DVB-S2 standard. We present a synthesizable IP block based on ST Microelectronics 0.13 µm CMOS technology.
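The iterative parity-check decoding described above can be sketched in miniature. The toy bit-flipping decoder below is a hypothetical example on a (7,4) Hamming parity-check matrix, not the DVB-S2 code or the paper's architecture; it only illustrates the core loop: compute the syndrome, stop when all checks are satisfied, otherwise flip the bit involved in the most unsatisfied checks.

```python
# Toy bit-flipping decoder (hypothetical example; not the DVB-S2 code).
H = [  # parity-check matrix of the (7,4) Hamming code
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def syndrome(H, c):
    """Parity of each check for hard-decision vector c."""
    return [sum(h * b for h, b in zip(row, c)) % 2 for row in H]

def bit_flip_decode(H, c, max_iters=30):
    c = list(c)
    for _ in range(max_iters):
        s = syndrome(H, c)
        if not any(s):            # all checks satisfied: stop early
            break
        # count unsatisfied checks touching each bit
        counts = [sum(s[j] for j, row in enumerate(H) if row[i])
                  for i in range(len(c))]
        c[counts.index(max(counts))] ^= 1   # flip the most suspicious bit
    return c

received = [0, 0, 1, 0, 0, 0, 0]      # all-zero codeword with one bit flipped
print(bit_flip_decode(H, received))   # prints [0, 0, 0, 0, 0, 0, 0]
```

A production DVB-S2 decoder exchanges soft messages instead of hard decisions (around 300000 per iteration, as the abstract notes), but the stop-on-zero-syndrome structure is the same.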


Personal, Indoor and Mobile Radio Communications (PIMRC) | 2006

A Synthesizable IP Core for WiMax 802.16e LDPC Code Decoding

Torben Brack; Matthias Alles; Frank Kienle; Norbert Wehn

The upcoming IEEE WiMax 802.16e standard, also referred to as WirelessMAN (2005), is the next step toward very high-throughput wireless backbone architectures, supporting up to 500 Mbps. It features low-density parity-check (LDPC) codes as an advanced channel coding scheme. The decoding of LDPC codes is an iterative process in which large amounts of data have to be exchanged between processing units within each iteration. The variety of the specified codes and the prospect of different decoding schedules for different codes pose significant challenges to an LDPC decoder hardware realization. In this paper, we present, to the best of our knowledge, the first published LDPC decoder architecture capable of processing all specified WiMax LDPC codes. Detailed synthesis and communications performance results are presented as well.


Design, Automation and Test in Europe (DATE) | 2007

Low complexity LDPC code decoders for next generation standards

Torben Brack; Matthias Alles; Timo Lehnigk-Emden; Frank Kienle; Norbert Wehn; Nicola E. L'Insalata; Francesco Rossi; Massimo Rovini; Luca Fanucci

This paper presents the design of low-complexity LDPC code decoders for the upcoming WiFi (IEEE 802.11n), WiMax (IEEE 802.16e), and DVB-S2 standards. A complete exploration of the design space, spanning from the decoding schedules and node processing approximations up to the top-level decoder architecture, is detailed. Based on this exploration, state-of-the-art techniques for low-complexity design have been adopted in order to arrive at feasible high-throughput decoder implementations. An analysis of the standardized codes from a decoder-aware point of view is also given, presenting, for each standard, the implementation challenges (multiple code rates and lengths) and the bottlenecks related to complete coverage of the standard. Synthesis results on a current 65 nm CMOS technology are provided for a generic decoder architecture.


Vehicular Technology Conference (VTC) | 2005

Low complexity stopping criterion for LDPC code decoders

Frank Kienle; Norbert Wehn

Low-density parity-check (LDPC) codes are amongst the most powerful codes known today. They are decoded iteratively by a message-passing algorithm. For decodable blocks, an inherent stopping criterion based on parity checks exists. For undecodable blocks, typically a fixed number of iterations is carried out before the decoding procedure is terminated, which is a waste of energy and time. We present a stopping criterion which detects undecodable blocks at an early stage of the decoding process. The novel stopping criterion works for floating-point and fixed-point implementations. It greatly reduces the average number of required iterations without any loss in communications performance.
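The "inherent" criterion for decodable blocks that the abstract refers to can be sketched as follows: take hard decisions on the soft values after each iteration and stop as soon as every parity check is satisfied. The paper's additional criterion for *undecodable* blocks is not reproduced here; the sign convention below (negative LLR means bit 1) is an assumption for illustration.

```python
# Hard-decision syndrome check used as an early-stopping test
# (sketch of the standard criterion, not the paper's novel one).

def hard_decisions(llrs):
    """Map soft values to bits; LLR < 0 -> bit 1 (assumed convention)."""
    return [1 if l < 0 else 0 for l in llrs]

def checks_satisfied(H, llrs):
    """True if the hard-decision vector satisfies all parity checks of H."""
    c = hard_decisions(llrs)
    return all(sum(h * b for h, b in zip(row, c)) % 2 == 0 for row in H)

# Toy single-parity-check code over 4 bits:
H = [[1, 1, 1, 1]]
print(checks_satisfied(H, [2.3, -1.1, 0.7, -0.4]))   # prints True (two bits are 1)
```

A decoder polls this test after every iteration; when it returns True, further iterations would be wasted energy, which is exactly the overhead the paper attacks for the undecodable case as well.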


Design, Automation and Test in Europe (DATE) | 2009

A novel LDPC decoder for DVB-S2 IP

Stefan Müller; Manuel Schreger; Marten Kabutz; Matthias Alles; Frank Kienle; Norbert Wehn

In this paper a programmable Forward Error Correction (FEC) IP for a DVB-S2 receiver is presented. It is composed of a Low-Density Parity-Check (LDPC) decoder, a Bose-Chaudhuri-Hocquenghem (BCH) decoder, and pre- and postprocessing units. Special emphasis is put on LDPC decoding, since it accounts for by far the largest share of the IP core's complexity.


Reconfigurable Computing and FPGAs (ReConFig) | 2011

An Energy Efficient FPGA Accelerator for Monte Carlo Option Pricing with the Heston Model

Christian de Schryver; Ivan Shcherbakov; Frank Kienle; Norbert Wehn; Henning Marxen; Anton Kostiuk; Ralf Korn

Today, pricing of derivatives (particularly options) in financial institutions is a challenge. Besides the increasing complexity of the products, obtaining fair prices requires more realistic (and therefore more complex) models of the underlying asset behavior. Not least due to increasing costs, energy-efficient and accurate pricing of these models is becoming more and more important. In this paper we present, to the best of our knowledge, the first FPGA-based accelerator for option pricing with the state-of-the-art Heston model. It is based on advanced Monte Carlo simulations. Compared to an 8-core Intel Xeon server running at 3.07 GHz, our hybrid FPGA-CPU system saves 89% of the energy and provides around twice the speed. The same system reduces the energy consumption per simulation to around 40% of that of a fully loaded Nvidia Tesla C2050 GPU. For an accelerator consisting only of three Virtex-5 chips, we expect to achieve the same simulation speed as an Nvidia Tesla C2050 GPU while consuming less than 3% of the energy.
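A software sketch of what such an accelerator computes, assuming a full-truncation Euler discretization of the Heston dynamics (the paper does not specify its exact scheme, and all model parameters below are illustrative):

```python
import math
import random

def heston_call_mc(s0, k, r, t, v0, kappa, theta, xi, rho,
                   n_paths=5000, n_steps=50, seed=42):
    """Monte Carlo price of a European call under Heston dynamics
    (full-truncation Euler sketch; not the paper's implementation)."""
    rng = random.Random(seed)
    dt = t / n_steps
    payoff_sum = 0.0
    for _ in range(n_paths):
        s, v = s0, v0
        for _ in range(n_steps):
            z1 = rng.gauss(0.0, 1.0)
            # correlate the variance shock with the asset shock
            z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
            vp = max(v, 0.0)           # truncate negative variance
            s *= math.exp((r - 0.5 * vp) * dt + math.sqrt(vp * dt) * z1)
            v += kappa * (theta - vp) * dt + xi * math.sqrt(vp * dt) * z2
        payoff_sum += max(s - k, 0.0)
    return math.exp(-r * t) * payoff_sum / n_paths

# Illustrative parameters: at-the-money call, 20% initial volatility
price = heston_call_mc(s0=100.0, k=100.0, r=0.02, t=1.0,
                       v0=0.04, kappa=2.0, theta=0.04, xi=0.3, rho=-0.7)
```

The inner loop, two Gaussian draws and a handful of multiply-adds per step, is the part that hardware parallelizes across independent paths, which is why Monte Carlo pricing maps so well onto an FPGA pipeline.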


IEEE Transactions on Communications | 2011

On Complexity, Energy- and Implementation-Efficiency of Channel Decoders

Frank Kienle; Norbert Wehn; Heinrich Meyr

Future wireless communication systems require efficient and flexible baseband receivers. Meaningful efficiency metrics are key for design space exploration to quantify the algorithmic and implementation complexity of a receiver. Most of the currently established efficiency metrics are based on counting operations, thus neglecting important issues like data and storage complexity. In this paper we introduce suitable energy and area efficiency metrics which resolve the aforementioned disadvantages: decoded information bits per unit energy and throughput per unit area. The efficiency metrics are assessed by various implementations of turbo decoders, LDPC decoders, and convolutional decoders. An exploration approach is presented which permits an appropriate benchmarking of implementation efficiency, communications performance, and flexibility trade-offs. Two case studies demonstrate this approach and show that design space exploration should result in various efficiency evaluations rather than the single snapshot metric often used in state-of-the-art approaches.
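The two proposed metrics reduce to simple ratios; the decoder figures below are made up purely for illustration and are not taken from the paper:

```python
# Energy and area efficiency as ratios (illustrative figures only).

def energy_efficiency(info_bits, energy_nj):
    """Decoded information bits per nanojoule."""
    return info_bits / energy_nj

def area_efficiency(throughput_mbps, area_mm2):
    """Throughput (Mbit/s) per mm^2 of silicon."""
    return throughput_mbps / area_mm2

# Hypothetical decoder: 6144-bit block decoded using 40 uJ,
# 150 Mbit/s sustained throughput on 2.1 mm^2 of silicon
e = energy_efficiency(6144, 40_000)   # bits per nJ
a = area_efficiency(150, 2.1)         # Mbit/s per mm^2
```

Unlike operation counts, both ratios automatically charge the decoder for memory accesses and data transfers, since those show up in the measured energy and occupied area.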


Design, Automation and Test in Europe (DATE) | 2006

Disclosing the LDPC code decoder design space

Torben Brack; Frank Kienle; Norbert Wehn

The design of future communication systems with high throughput demands will become a critical task, especially when sophisticated channel coding schemes have to be applied. LDPC codes are one of the most promising candidates because of their outstanding communications performance. One major problem for a decoder hardware realization is the huge design space composed of many interrelated parameters, which enforces drastic design trade-offs. Another important issue is the need for flexibility of such systems. In this paper we illuminate this design space with special emphasis on the strong interrelations of these parameters. Three design studies are presented to highlight the effects on a generic architecture when some parameters are constrained by a given standard, a given technology, or a given area budget.


IEEE Transactions on Information Theory | 2010

A Separation Algorithm for Improved LP-Decoding of Linear Block Codes

Akin Tanatmis; Stefan Ruzika; Horst W. Hamacher; Mayur Punekar; Frank Kienle; Norbert Wehn

Maximum likelihood (ML) decoding is the optimal decoding algorithm for arbitrary linear block codes and can be written as an integer programming (IP) problem. Feldman relaxed this IP problem and presented linear programming (LP) based decoding. In this paper, we propose a new separation algorithm to improve the error-correcting performance of LP decoding for binary linear block codes. We use an IP formulation with indicator variables that help in detecting the violated parity checks. We derive Gomory cuts from the IP and use them in our separation algorithm. An efficient method of finding cuts induced by redundant parity checks (RPC) is also proposed. Under certain circumstances we can guarantee that these RPC cuts are valid and cut off the fractional optimal solutions of LP decoding. It is demonstrated on three LDPC codes and two BCH codes that our separation algorithm performs significantly better than LP decoding and belief propagation (BP) decoding.
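As background, the basic separation step for the parity-check ("forbidden set") inequalities of LP decoding can be sketched as below; the paper's Gomory and RPC cuts go beyond this sketch. For a check with support N, every codeword satisfies sum_{i in S} x_i - sum_{i in N\S} x_i <= |S| - 1 for all odd-sized S within N, and a most-violated such inequality can be found by thresholding the fractional solution at 1/2:

```python
# Separation for the odd-set parity-check inequalities
# (textbook step sketched for illustration; not the paper's full algorithm).

def most_violated_cut(x, support):
    """For one parity check over `support`, return (S, violation) for the
    most violated inequality sum_S x_i - sum_{N\\S} x_i <= |S| - 1 with
    |S| odd; violation <= 0 means the point satisfies all of them."""
    S = [i for i in support if x[i] > 0.5]
    if len(S) % 2 == 0:
        # make |S| odd by moving the index closest to 1/2 across the boundary
        flip = min(support, key=lambda i: abs(x[i] - 0.5))
        if flip in S:
            S.remove(flip)
        else:
            S.append(flip)
    lhs = sum(x[i] for i in S) - sum(x[i] for i in support if i not in S)
    return S, lhs - (len(S) - 1)

# A fractional LP solution with all three bits near 1 violates even parity:
S, violation = most_violated_cut({0: 0.9, 1: 0.9, 2: 0.9}, [0, 1, 2])
# violation > 0, so the inequality for S is a valid cut to add to the LP
```

A separation algorithm repeats this over all checks, adds the violated inequalities to the LP, and re-solves; the paper strengthens this loop with Gomory cuts and cuts from redundant parity checks.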


Vehicular Technology Conference (VTC) | 2003

Low complexity stopping criteria for UMTS turbo-decoders

Frank Gilbert; Frank Kienle; Norbert Wehn

Turbo codes are part of the third-generation wireless communications system (UMTS). A turbo decoder consists of two soft-in soft-out component decoders which exchange information (soft values) in an iterative process. The number of iterations needed for decoding strongly depends on the channel characteristic, which can change from block to block due to fading. In this paper, we present two new stopping criteria which can be implemented on dedicated hardware or a DSP with negligible overhead. The new criteria operate on the sum of the absolute soft-output values, calculated after each component decoder, which is referred to as the sum reliability. We compare the communications performance and average number of iterations of our proposed criteria to other criteria in the literature using a fixed-point 8-state turbo decoder implementation in a UMTS FDD-downlink chain. An analysis of the arithmetic complexity and memory demand shows minimal overhead with excellent performance compared to other stopping criteria.
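The sum-reliability quantity itself is easy to state in code. The stopping rule sketched below (terminate once the sum of absolute soft outputs stops growing between component-decoder passes) is an illustrative reading of the criterion, not necessarily the paper's exact condition:

```python
# Sum reliability and an illustrative stop-on-stagnation rule
# (hedged sketch; the paper's precise criteria may differ).

def sum_reliability(soft_values):
    """Sum of absolute soft-output values after a component decoder."""
    return sum(abs(v) for v in soft_values)

def iterate_with_stop(component_decode, soft, max_half_iters=16):
    """Run half-iterations until the sum reliability stops increasing."""
    prev = sum_reliability(soft)
    for half_iter in range(1, max_half_iters + 1):
        soft = component_decode(soft)
        cur = sum_reliability(soft)
        if cur <= prev:          # reliability no longer improving: stop
            return soft, half_iter
        prev = cur
    return soft, max_half_iters
```

Only one accumulator and one comparison per half-iteration are needed, which matches the abstract's claim of negligible hardware or DSP overhead.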

Collaboration


Dive into Frank Kienle's collaborations.

Top Co-Authors

Norbert Wehn, Kaiserslautern University of Technology
Torben Brack, Kaiserslautern University of Technology
Matthias Alles, Kaiserslautern University of Technology
Stefan Ruzika, University of Koblenz and Landau
Akin Tanatmis, Kaiserslautern University of Technology
Mayur Punekar, Kaiserslautern University of Technology
Horst W. Hamacher, Kaiserslautern University of Technology
Timo Lehnigk-Emden, Kaiserslautern University of Technology
Admir Burnic, University of Duisburg-Essen