Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Roger Labahn is active.

Publication


Featured research published by Roger Labahn.


Neural Computation | 2012

Design strategies for weight matrices of echo state networks

Tobias Strauss; Welf Wustlich; Roger Labahn

This article develops approaches to generate dynamical reservoirs of echo state networks with desired properties, reducing the amount of randomness. It is possible to create weight matrices with a predefined singular value spectrum, and the procedure guarantees stability (the echo state property). We prove that the construction minimizes the impact of noise on the training process. The resulting reservoir types are strongly related to reservoirs already known in the literature. Our experiments show that well-chosen input weights can improve performance.
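
The paper's own constructions are specific, but the core idea of prescribing a singular value spectrum can be illustrated generically. The minimal sketch below (Python/NumPy; the function name and the chosen spectrum are illustrative assumptions, not taken from the paper) builds a reservoir matrix W = U diag(s) V^T from random orthogonal factors; keeping the largest singular value below 1 is a well-known sufficient condition for the echo state property of a tanh reservoir.

import numpy as np

def reservoir_with_singular_spectrum(n, singular_values, seed=0):
    """Sketch: build an n x n reservoir matrix W = U diag(s) V^T with a
    prescribed singular value spectrum (not the authors' specific design).
    Keeping max(s) < 1 suffices for the echo state property with tanh units."""
    rng = np.random.default_rng(seed)
    # Random orthogonal factors from QR decompositions of Gaussian matrices.
    U, _ = np.linalg.qr(rng.standard_normal((n, n)))
    V, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return U @ np.diag(singular_values) @ V.T

n = 200
s = np.linspace(0.95, 0.1, n)        # chosen spectrum, largest value below 1
W = reservoir_with_singular_spectrum(n, s)
print(np.allclose(np.linalg.svd(W, compute_uv=False), s))   # expected: True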


Discrete Mathematics | 1994

Quick gossiping by telegraphs

Roger Labahn; Ingo Warnke

We consider gossiping in the complete doubly directed graph, i.e., telegrams arranged in rounds are used to exchange information between n points, each having one initial item. It is proved that at least 1.44·log₂ n rounds are needed to inform everybody about those n distinct items of information.
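
For orientation, and as an assumption about the intended reading of the constant rather than a statement from the abstract: 1.44 is, up to rounding, the reciprocal of log₂ of the golden ratio, the growth rate obtained when knowledge can spread at most Fibonacci-fashion under one-way telegrams.

% Assumed reading of the constant 1.44 (Fibonacci-growth argument for one-way calls):
R(n) \;\ge\; \log_{\varphi} n \;=\; \frac{\log_2 n}{\log_2 \varphi} \;\approx\; 1.44\,\log_2 n,
\qquad \varphi = \frac{1+\sqrt{5}}{2}.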


Discrete Mathematics | 1995

Kernels of minimum size gossip schemes

Roger Labahn

The main parts of gossip schemes are the kernels of their minimal orders. We give a complete characterization of all kernels that may appear in gossip schemes on simple graphs with a minimum number of calls. As consequences we prove several results on gossip schemes; e.g., the minimum number of rounds of a gossip scheme with a minimum number of calls is computed. Moreover, in the new context we give proofs of known results, e.g. the well-known four-cycle theorem. In the last part, we deal with order-theoretic questions for such kernel posets. After describing all p-grid-kernels in terms of permutations and subsets, isomorphism is investigated and they are enumerated. Then we compute the order dimension and the jump number of all possible kernels, and finally, we show how to determine the numbers of their linear extensions.
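
As a generic illustration of the last point only: counting linear extensions of a small poset can be done by brute force. The snippet below is a hypothetical toy example (the diamond poset shown is not one of the kernel posets, and brute-force enumeration is not the paper's method).

from itertools import permutations

def count_linear_extensions(elements, relations):
    """Brute-force count of linear extensions of a finite poset.
    relations contains pairs (a, b) meaning a < b; feasible only for
    very small posets."""
    return sum(
        all(perm.index(a) < perm.index(b) for a, b in relations)
        for perm in permutations(elements)
    )

# Toy diamond poset: bot < l, bot < r, l < top, r < top
print(count_linear_extensions(
    ["bot", "l", "r", "top"],
    {("bot", "l"), ("bot", "r"), ("l", "top"), ("r", "top")},
))   # prints 2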


Discrete Mathematics | 1993

Information flows on hypergraphs

Roger Labahn

We consider a generalization of the well-known gossip problem for hypergraphs: in a given set V of points, let each point of a subset X ⊆ V know a unit of information which is not known by any other point. The points of V communicate by a sequence of k-party conference calls, during which the k participating points exchange all the information known to them, and no call in the sequence is redundant. This exchange of information stops when every point of a subset Y ⊆ V knows all the initial units of information. We give both lower and upper bounds on the number of calls in such a sequence, and we present a structural description of such processes.
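
To make the setting concrete, here is a toy sketch (an assumption-laden illustration, not the paper's construction): it takes X = Y = V, uses a simple greedy schedule, and therefore only produces an upper bound on the number of k-party calls needed.

def greedy_gossip(n=9, k=3):
    """Toy simulation of k-party gossip: each of n points starts with one
    item; a conference call among k points pools everything its participants
    know.  A greedy schedule is used, so the returned call count is an upper
    bound, not the optimum studied in the paper."""
    knowledge = [{i} for i in range(n)]
    calls = 0
    while any(len(kn) < n for kn in knowledge):
        # start the call with a least-informed point ...
        p = min(range(n), key=lambda i: len(knowledge[i]))
        group, pooled = {p}, set(knowledge[p])
        # ... and add the k-1 partners that enlarge the pooled knowledge most
        while len(group) < k:
            q = max((i for i in range(n) if i not in group),
                    key=lambda i: len(pooled | knowledge[i]))
            group.add(q)
            pooled |= knowledge[q]
        for i in group:
            knowledge[i] = set(pooled)
        calls += 1
    return calls

print(greedy_gossip(n=9, k=3))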


Neural Networks | 2016

Regular expressions for decoding of neural network outputs

Tobias Strauß; Gundram Leifert; Tobias Grüning; Roger Labahn

This article proposes a convenient tool for decoding the output of neural networks trained by Connectionist Temporal Classification (CTC) for handwritten text recognition. We use regular expressions to describe the complex structures expected in the writing, and the corresponding finite automata are employed to build a decoder. We analyze theoretically which calculations are relevant and which can be avoided; a great speed-up results from an approximation. We conclude that the approximation is most likely to fail only if the regular expression does not match the ground truth, which is not harmful for many applications since the already low probability is merely underestimated further. The proposed decoder is very efficient compared to other decoding methods. The variety of applications ranges from information retrieval to full text recognition, and we refer to applications where we integrated the proposed decoder successfully.
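
A minimal sketch of the idea follows (not the paper's implementation: the toy DFA, alphabet, and Viterbi-style best-path search are assumptions made for illustration). It applies the usual CTC collapsing rules while only allowing emissions that keep the partially decoded string inside the automaton.

import numpy as np

# Toy DFA for the pattern "ab+" over the alphabet {a, b}:
# state 0 = start, 1 = seen 'a', 2 = seen 'ab...b' (accepting).
DFA_TRANS = {(0, "a"): 1, (1, "b"): 2, (2, "b"): 2}
ACCEPTING = {2}

BLANK = "-"
ALPHABET = ["a", "b", BLANK]

def constrained_best_path(log_probs):
    """Best-path CTC decoding restricted to labelings accepted by the DFA.
    log_probs has shape (T, len(ALPHABET)), columns ordered as ALPHABET.
    Returns (log-probability, labeling) of the best accepted path."""
    beams = {(0, None): (0.0, "")}     # (dfa state, last symbol) -> (score, labeling)
    for t in range(log_probs.shape[0]):
        new_beams = {}
        for (q, last), (score, lab) in beams.items():
            for i, c in enumerate(ALPHABET):
                s = score + log_probs[t, i]
                if c == BLANK:
                    key, new_lab = (q, None), lab                 # blank keeps the DFA state
                elif c == last:
                    key, new_lab = (q, c), lab                    # repeated symbol collapses
                elif (q, c) in DFA_TRANS:
                    key, new_lab = (DFA_TRANS[(q, c)], c), lab + c  # allowed new emission
                else:
                    continue                                      # emission forbidden by the DFA
                if key not in new_beams or s > new_beams[key][0]:
                    new_beams[key] = (s, new_lab)
        beams = new_beams
    accepted = [v for (q, _), v in beams.items() if q in ACCEPTING]
    return max(accepted) if accepted else None

rng = np.random.default_rng(0)
frames = np.log(rng.dirichlet(np.ones(len(ALPHABET)), size=6))   # 6 fake CTC frames
print(constrained_best_path(frames))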


Journal of Machine Learning Research | 2016

Cells in multidimensional recurrent neural networks

Gundram Leifert; Tobias Strauß; Tobias Grüning; Welf Wustlich; Roger Labahn


arXiv: Computer Vision and Pattern Recognition | 2014

CITlab ARGUS for historical handwritten documents.

Gundram Leifert; Tobias Strauß; Tobias Grüning; Roger Labahn


document analysis systems | 2018

READ-BAD: A New Dataset and Evaluation Scheme for Baseline Detection in Archival Documents

Tobias Grüning; Roger Labahn; Markus Diem; Florian Kleber; Stefan Fiel


arXiv: Computer Vision and Pattern Recognition | 2014

CITlab ARGUS for historical data tables

Gundram Leifert; Tobias Grüning; Tobias Strauß; Roger Labahn


arXiv: Information Retrieval | 2018

System Description of CITlab's Recognition & Retrieval Engine for ICDAR2017 Competition on Information Extraction in Historical Handwritten Records.

Tobias Strauß; Max Weidemann; Johannes Michael; Gundram Leifert; Tobias Grüning; Roger Labahn

Collaboration


Dive into Roger Labahn's collaborations.

Top Co-Authors

Florian Kleber

Vienna University of Technology


Markus Diem

Vienna University of Technology


Stefan Fiel

Vienna University of Technology
