Falk Hüffner
Technical University of Berlin
Publications
Featured research published by Falk Hüffner.
Theory of Computing Systems / Mathematical Systems Theory | 2005
Jens Gramm; Jiong Guo; Falk Hüffner; Rolf Niedermeier
We present efficient fixed-parameter algorithms for the NP-complete edge modification problems Cluster Editing and Cluster Deletion. Here, the goal is to make the fewest changes to the edge set of an input graph such that the new graph is a vertex-disjoint union of cliques. Allowing up to k edge additions and deletions (Cluster Editing), we solve this problem in O(2.27^k + |V|^3) time; allowing only up to k edge deletions (Cluster Deletion), we solve this problem in O(1.77^k + |V|^3) time. The key ingredients of our algorithms are two easy-to-implement bounded search tree algorithms and an efficient polynomial-time reduction to a problem kernel of size O(k^3). This improves and complements previous work. Finally, we discuss further improvements on search tree sizes using computer-generated case distinctions.
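The bounded search tree behind these running times can be sketched compactly. A graph is a vertex-disjoint union of cliques exactly when it has no induced path on three vertices, so one can repeatedly locate such a "conflict triple" and branch on the three ways to resolve it. The Python sketch below (with an illustrative set-of-frozensets edge representation) shows only this naive O(3^k) branching; the paper's refined case distinctions are what bring the bound down to O(2.27^k).

    def find_conflict(vertices, edges):
        """Return an induced P3 (u, v, w): uv, vw edges, uw a non-edge."""
        for v in vertices:
            nbrs = [u for u in vertices if frozenset((u, v)) in edges]
            for i, u in enumerate(nbrs):
                for w in nbrs[i + 1:]:
                    if frozenset((u, w)) not in edges:
                        return u, v, w
        return None

    def cluster_editing(vertices, edges, k):
        """True if at most k edge edits make the graph a union of cliques."""
        conflict = find_conflict(vertices, edges)
        if conflict is None:    # already a vertex-disjoint union of cliques
            return True
        if k == 0:
            return False
        u, v, w = conflict
        # Branch three ways: delete uv, delete vw, or add uw.
        return (cluster_editing(vertices, edges - {frozenset((u, v))}, k - 1)
                or cluster_editing(vertices, edges - {frozenset((v, w))}, k - 1)
                or cluster_editing(vertices, edges | {frozenset((u, w))}, k - 1))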
european symposium on algorithms | 2003
Jens Gramm; Jiong Guo; Falk Hüffner; Rolf Niedermeier
We present a framework, seemingly the first, for the automated generation of exact search tree algorithms for NP-hard problems. The purpose of our approach is twofold: rapid development and improved upper bounds. Many search tree algorithms for various problems in the literature are based on complicated case distinctions. Our approach may lead to a much simpler process of developing and analyzing these algorithms. Moreover, using the sheer computing power of machines, it may also lead to improved upper bounds on search tree sizes (i.e., faster exact solving algorithms) in comparison with previously developed hand-made search trees.
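The "upper bounds on search tree sizes" being optimized here are determined by branching vectors: a case distinction that decreases the parameter by t_1, ..., t_r in its r branches yields a search tree of size O(alpha^k), where alpha is the unique root greater than 1 of sum_i x^(-t_i) = 1. A small Python sketch of that computation (the function name is illustrative; roughly speaking, an automated framework explores many case distinctions and keeps the one minimizing alpha):

    def branching_number(vector, eps=1e-9):
        """Root alpha > 1 of sum(alpha**-t for t in vector) == 1, by bisection."""
        lo, hi = 1.0, float(len(vector)) + 1.0
        while hi - lo > eps:
            mid = (lo + hi) / 2
            if sum(mid ** -t for t in vector) > 1:
                lo = mid      # branches still too cheap: alpha must be larger
            else:
                hi = mid
        return hi

    print(branching_number((1, 1, 1)))  # 3.0: naive three-way branching
    print(branching_number((1, 2)))     # ~1.618: the golden ratio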
algorithmic game theory | 2010
Falk Hüffner; Christian Komusiewicz; Hannes Moser; Rolf Niedermeier
We initiate the first systematic study of the NP-hard Cluster Vertex Deletion (CVD) problem (unweighted and weighted) in terms of fixed-parameter algorithmics. In the unweighted case, one searches for a minimum number of vertex deletions to transform a graph into a collection of disjoint cliques. The parameter is the number of vertex deletions. We present efficient fixed-parameter algorithms for CVD applying the fairly new iterative compression technique. Moreover, we study the variant of CVD where the maximum number of cliques to be generated is prespecified. Here, we exploit connections to fixed-parameter algorithms for (weighted) Vertex Cover.
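Iterative compression is easiest to see in its textbook form for Vertex Cover (to which, as noted above, the weighted variant connects); the Python sketch below illustrates only the technique, not the paper's more involved CVD compression routine. Vertices are added one at a time while a solution of size at most k+1 is maintained and compressed back to size at most k whenever it overflows. All names are illustrative.

    from itertools import combinations

    def compress(edges, cover, k):
        """Shrink a vertex cover of size k+1 to size <= k, or return None."""
        s = list(cover)
        for r in range(1, len(s) + 1):
            # Guess the part B of the old cover that the new cover avoids.
            for B in map(set, combinations(s, r)):
                new_cover, ok = set(cover) - B, True
                for u, v in edges:
                    if u in B and v in B:   # B must be independent
                        ok = False
                        break
                    if u in B:              # edges leaving B must be covered
                        new_cover.add(v)    # from the other endpoint
                    elif v in B:
                        new_cover.add(u)
                if ok and len(new_cover) <= k:
                    return new_cover
        return None

    def vertex_cover(vertices, edges, k):
        """Vertex cover of size <= k via iterative compression, or None."""
        cover, processed = set(), set()
        for v in vertices:
            processed.add(v)
            cover.add(v)        # still a cover, but may now have size k+1
            if len(cover) > k:
                sub = [(a, b) for a, b in edges
                       if a in processed and b in processed]
                cover = compress(sub, cover, k)
                if cover is None:
                    return None
        return cover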
Lecture Notes in Computer Science | 2004
Jiong Guo; Falk Hüffner; Rolf Niedermeier
Based on a series of known and new examples, we propose the generalized setting of “distance from triviality” measurement as a reasonable and promising way of determining useful structural problem parameters for analyzing computationally hard problems. The underlying idea is to consider tractable special cases of generally hard problems and to introduce parameters that measure the distance from these special cases. In this paper we present several case studies of distance-from-triviality parameterizations (concerning Clique, Power Dominating Set, Set Cover, and Longest Common Subsequence) that exhibit the versatility of this approach in developing important new views for computational complexity analysis.
Algorithmica | 2004
Jens Gramm; Jiong Guo; Falk Hüffner; Rolf Niedermeier
We present a framework for the automated generation of exact search tree algorithms for NP-hard problems. The purpose of our approach is twofold: rapid development and improved upper bounds. Many search tree algorithms for various problems in the literature are based on complicated case distinctions. Our approach may lead to a much simpler process of developing and analyzing these algorithms. Moreover, using the sheer computing power of machines, it may also lead to improved upper bounds on search tree sizes (i.e., faster exact solving algorithms) in comparison with previously developed “hand-made” search trees. One such example is the NP-complete Cluster Editing problem (also known as Correlation Clustering on complete unweighted graphs), which asks for the minimum number of edge additions and deletions needed to create a graph that is a disjoint union of cliques. The hand-made search tree for Cluster Editing had worst-case size O(2.27^k), which our new method improves to O(1.92^k). (Herein, k denotes the number of edge modifications allowed.)
international conference on algorithms and complexity | 2006
Michael Dom; Jiong Guo; Falk Hüffner; Rolf Niedermeier; Anke Truß
Complementing recent progress on classical complexity and polynomial-time approximability of feedback set problems in (bipartite) tournaments, we extend and partially improve fixed-parameter tractability results for these problems. We show that Feedback Vertex Set in tournaments is amenable to the novel iterative compression technique. Moreover, we provide data reductions and problem kernels for Feedback Vertex Set and Feedback Arc Set in tournaments, and a depth-bounded search tree for Feedback Arc Set in bipartite tournaments based on a new forbidden subgraph characterization.
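For plain tournaments the relevant forbidden subgraph is the directed triangle: a tournament is acyclic exactly when it contains none, so every solution must reverse (or delete) an arc of each triangle. The Python sketch below shows this three-way branching for Feedback Arc Set in tournaments; it illustrates only the depth-bounded search tree pattern, since the paper's bipartite-tournament algorithm branches on a different forbidden subgraph. The arc-set representation is illustrative.

    from itertools import combinations

    def find_triangle(vertices, arcs):
        """Return a directed triangle (a, b, c) with a->b->c->a, or None."""
        for a, b, c in combinations(vertices, 3):
            for x, y, z in ((a, b, c), (a, c, b)):
                if (x, y) in arcs and (y, z) in arcs and (z, x) in arcs:
                    return x, y, z
        return None

    def fas_tournament(vertices, arcs, k):
        """True if reversing at most k arcs makes the tournament acyclic."""
        triangle = find_triangle(vertices, arcs)
        if triangle is None:
            return True
        if k == 0:
            return False
        a, b, c = triangle
        # Branch on the three arcs of the triangle, reversing one each.
        for arc in ((a, b), (b, c), (c, a)):
            flipped = (arcs - {arc}) | {(arc[1], arc[0])}
            if fas_tournament(vertices, flipped, k - 1):
                return True
        return False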
international conference on algorithms and complexity | 2003
Jens Gramm; Jiong Guo; Falk Hüffner; Rolf Niedermeier
We present efficient fixed-parameter algorithms for the NP-complete edge modification problems CLUSTER EDITING and CLUSTER DELETION. Here, the goal is to make the fewest changes to the edge set of an input graph such that the new graph is a vertex-disjoint union of cliques. Allowing up to k edge additions and deletions (CLUSTER EDITING), we solve this problem in O(2.27^k + |V|^3) time; allowing only up to k edge deletions (CLUSTER DELETION), we solve this problem in O(1.77^k + |V|^3) time. The key ingredients of our algorithms are two easy-to-implement bounded search tree algorithms and a reduction to a problem kernel of size O(k^3). This improves and complements previous work.
ACM Journal of Experimental Algorithms | 2009
Jens Gramm; Jiong Guo; Falk Hüffner; Rolf Niedermeier
To cover the edges of a graph with a minimum number of cliques is an NP-hard problem with many applications. For this problem we develop efficient and effective polynomial-time data reduction rules that, combined with a search tree algorithm, allow for exact problem solutions in competitive time. This is confirmed by experiments with real-world and synthetic data. Moreover, we prove the fixed-parameter tractability of covering edges by cliques.
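One flavor of such data reduction rules is simple enough to sketch: two vertices with identical closed neighborhoods can always be covered by the same cliques, so one of them may be deleted without changing the number of cliques needed. The Python sketch below is a simplified rendering of this "twin" rule for illustration, not the paper's full rule set; the adjacency-dictionary representation is an assumption.

    def remove_twins(adj):
        """Delete one vertex of every true-twin pair, until none remain.

        adj maps each vertex to the set of its neighbors.
        """
        changed = True
        while changed:
            changed = False
            seen = {}  # closed neighborhood -> representative vertex
            for v in list(adj):
                closed = frozenset(adj[v] | {v})
                if closed in seen:
                    for u in adj[v]:        # v is a twin: remove it
                        adj[u].discard(v)
                    del adj[v]
                    changed = True
                else:
                    seen[closed] = v
        return adj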
Algorithmica | 2008
Falk Hüffner; Sebastian Wernicke; Thomas Zichner
Color-coding is a technique for designing fixed-parameter algorithms for several NP-complete subgraph isomorphism problems. Somewhat surprisingly, little work has so far been devoted to the actual implementation of algorithms based on color-coding, despite the elegance of this technique and its wide applicability to practically important problems. This work gives various novel algorithmic improvements for color-coding, both from a worst-case perspective and under practical considerations. We apply the resulting implementation to the identification of signaling pathways in protein interaction networks, demonstrating that our improvements speed up the color-coding algorithm by orders of magnitude over previous implementations. This allows more complex and larger structures to be identified in reasonable time; many biologically relevant instances can even be solved in seconds where hours were previously required.
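The basic color-coding scheme being engineered here is short enough to sketch: color the vertices uniformly at random with k colors, search for a "colorful" structure (each color used once) by dynamic programming over color subsets, and repeat with fresh colorings to boost the success probability (at least k!/k^k per trial for paths). The Python sketch below is a minimal, unoptimized illustration for simple paths on k vertices; all names are illustrative, and the paper's implementation is far more refined.

    import random

    def colorful_path(adj, k, trials=100):
        """True if some randomized trial finds a simple path on k vertices."""
        vertices = list(adj)
        for _ in range(trials):
            color = {v: random.randrange(k) for v in vertices}
            # table[v] = color sets of colorful paths ending at v
            table = {v: {frozenset([color[v]])} for v in vertices}
            for _ in range(k - 1):
                step = {v: set() for v in vertices}
                for v in vertices:
                    for used in table[v]:
                        for w in adj[v]:
                            if color[w] not in used:
                                step[w].add(used | {color[w]})
                table = step
            if any(len(used) == k for sets in table.values() for used in sets):
                return True
        return False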
The Computer Journal | 2007
Falk Hüffner; Rolf Niedermeier; Sebastian Wernicke
The fixed-parameter approach is an algorithm design technique for solving combinatorially hard (mostly NP-hard) problems. For some of these problems, it can lead to algorithms that are both efficient and guaranteed to find optimal solutions. Focusing on their application to solving NP-hard problems in practice, we survey three main techniques for developing fixed-parameter algorithms: kernelization (data reduction with provable performance guarantee), depth-bounded search trees, and a new technique called iterative compression. Our discussion is substantiated by several concrete case studies and provides pointers to various current challenges in the field.
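Of these techniques, kernelization is the quickest to illustrate. The classic Buss rule for Vertex Cover is the textbook example of data reduction with a provable performance guarantee: a vertex of degree greater than k must belong to every cover of size at most k, and once all such vertices are taken, a yes-instance can retain at most k^2 edges. A Python sketch under these assumptions (names are illustrative):

    def buss_kernel(edges, k):
        """Reduce a Vertex Cover instance; return (kernel_edges, k') or None."""
        edges = set(map(frozenset, edges))
        forced = True
        while forced and k >= 0:
            forced = False
            degree = {}
            for e in edges:
                for v in e:
                    degree[v] = degree.get(v, 0) + 1
            for v, d in degree.items():
                if d > k:
                    # v is in every small cover: take it, drop its edges.
                    edges = {e for e in edges if v not in e}
                    k -= 1
                    forced = True
                    break
        if k < 0 or len(edges) > k * k:
            return None         # provably a no-instance
        return edges, k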