Morgan Chopin
Paris Dauphine University
Publications
Featured research published by Morgan Chopin.
Journal of Computer and System Sciences | 2014
Cristina Bazgan; Morgan Chopin; Marek Cygan; Michael R. Fellows; Fedor V. Fomin; Erik Jan van Leeuwen
The Firefighter problem is to place firefighters on the vertices of a graph to prevent a fire with known starting point from lighting up the entire graph. In each time step, a firefighter may be placed on an unburned vertex, permanently protecting it, and the fire spreads to all neighboring unprotected vertices of burning vertices. The goal is to let as few vertices burn as possible. In this paper, we consider a generalization of this problem, where at each time step b ≥ 1 firefighters can be deployed. Our results answer several open questions raised by Cai et al. [8]. We show that this problem is W[1]-hard when parameterized by the number of saved vertices, protected vertices, and burned vertices. We also investigate several combined parameterizations for which the problem is fixed-parameter tractable. Some of our algorithms improve on previously known algorithms. We also establish lower bounds for polynomial kernelization.
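A short sketch may make the process concrete. The following Python snippet simulates one run of the generalized process in which a caller-supplied strategy places up to b firefighters per time step; the graph representation (an adjacency-list dict) and all names are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the generalized firefighter process, assuming an
# adjacency-list graph and a caller-supplied placement strategy.
# All names here are illustrative, not from the paper.

def simulate_firefighter(graph, source, b, strategy):
    """Run the process; return the set of burned vertices at the end.

    graph    -- dict mapping each vertex to an iterable of neighbors
    source   -- vertex where the fire starts
    b        -- number of firefighters deployable per time step
    strategy -- callable(burned, protected, graph, b) -> vertices to protect
    """
    burned, protected = {source}, set()
    while True:
        # Deploy at most b firefighters on unburned, unprotected vertices.
        picks = list(strategy(burned, protected, graph, b))[:b]
        protected |= {v for v in picks if v not in burned}
        # Fire spreads to every unprotected neighbor of a burning vertex.
        frontier = {w for v in burned for w in graph[v]
                    if w not in burned and w not in protected}
        if not frontier:  # fire can no longer spread
            return burned
        burned |= frontier
```

For instance, on the path 0-1-2-3 with the fire starting at vertex 0, a strategy that protects vertex 1 in the first step confines the fire to {0}.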
international symposium on algorithms and computation | 2011
Cristina Bazgan; Morgan Chopin; Michael R. Fellows
In this paper we study the parameterized complexity of the firefighter problem. More precisely, we show that Saving k-Vertices and its dual Saving All But k-Vertices are both W[1]-hard for parameter k, even for bipartite graphs. We also investigate several cases for which the firefighter problem is tractable. For instance, Saving k-Vertices is fixed-parameter tractable on planar graphs for parameter k. Moreover, we prove a lower bound for polynomial kernelization of Saving All But k-Vertices.
international conference on algorithms and complexity | 2013
René van Bevern; Robert Bredereck; Morgan Chopin; Sepp Hartung; Falk Hüffner; André Nichterlein; Ondřej Suchý
The goal of tracking the origin of short, distinctive phrases (memes) that propagate through the web in reaction to current events has been formalized as DAG Partitioning: given a directed acyclic graph, delete edges of minimum weight such that each resulting connected component of the underlying undirected graph contains only one sink. Motivated by NP-hardness and hardness of approximation results, we consider the parameterized complexity of this problem. We show that it can be solved in O(2^k · n^2) time, where k is the number of edge deletions, proving fixed-parameter tractability for parameter k. We then show that unless the Exponential Time Hypothesis (ETH) fails, this cannot be improved to 2^{o(k)} · n^{O(1)}; further, DAG Partitioning does not have a polynomial kernel unless NP ⊆ coNP/poly. Finally, given a tree decomposition of width w, we show how to solve DAG Partitioning in 2^{O(w^2)} · n time, improving a known algorithm for the parameter pathwidth.
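To illustrate the objective, here is a small Python check, with illustrative names not taken from the paper, that verifies whether a set of deleted edges is a feasible DAG Partitioning solution, i.e., whether every connected component of the remaining underlying undirected graph contains exactly one sink.

```python
# Sketch: verify a candidate solution for DAG Partitioning.
# The DAG is given as a collection of arcs (u, v); 'deleted' is the
# set of arcs removed by the candidate solution.

def is_valid_partition(vertices, arcs, deleted):
    remaining = [a for a in arcs if a not in deleted]
    # Union-find over the underlying *undirected* graph.
    parent = {v: v for v in vertices}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    for u, v in remaining:
        parent[find(u)] = find(v)
    # A sink is a vertex with no outgoing remaining arcs.
    has_out = {u for u, _ in remaining}
    sinks_per_comp = {}
    for v in vertices:
        root = find(v)
        sinks_per_comp.setdefault(root, 0)
        if v not in has_out:
            sinks_per_comp[root] += 1
    return all(count == 1 for count in sinks_per_comp.values())
```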
Journal of Computer and System Sciences | 2016
Faisal N. Abu-Khzam; Cristina Bazgan; Morgan Chopin; Henning Fernau
Highlights: We introduce data reduction rules for obtaining approximation algorithms for maximization problems. These rules are combined with certain combinatorial insights often typical for kernelization algorithms. The resulting algorithms are of a combinatorial nature, yet better than previously published work in at least two special cases. We discuss several concrete problems that are natural maximization versions of well-studied domination-type graph problems.

Kernelization algorithms in the context of Parameterized Complexity are often based on a combination of data reduction rules and combinatorial insights. In this paper we present a similar strategy for obtaining polynomial-time approximation algorithms. Our method features the use of approximation-preserving reductions, akin to the notion of parameterized reductions. We exemplify this method to obtain the currently best approximation algorithms for Harmless Set, Differential and Multiple Nonblocker, all of which can be considered in the context of securing networks or information propagation.
computing and combinatorics conference | 2013
Cristina Bazgan; Morgan Chopin; André Nichterlein; Florian Sikora
In this paper, we consider the problem of maximizing the spread of influence through a social network. Here, we are given a graph G = (V,E), a positive integer k and a threshold value thr(v) attached to each vertex v ∈ V. The objective is then to find a subset of k vertices to “activate” such that the number of activated vertices at the end of a propagation process is maximum. A vertex v gets activated if at least thr(v) of its neighbors are. We show that this problem is strongly inapproximable in fpt-time with respect to (w.r.t.) parameter k even for very restrictive thresholds. For unanimity thresholds, we prove that the problem is inapproximable in polynomial time and the decision version is W[1]-hard w.r.t. parameter k. On the positive side, it becomes r(n)-approximable in fpt-time w.r.t. parameter k for any strictly increasing function r. Moreover, we give an fpt-time algorithm to solve the decision version for bounded degree graphs.
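A propagation sketch in Python may clarify the activation rule: starting from a seed set, a vertex becomes active once at least thr(v) of its neighbors are active. The graph representation (adjacency-list dict) and the function name are assumptions for illustration.

```python
# Sketch of the threshold propagation process behind influence maximization.
def propagate(graph, thr, seeds):
    """Return the set of vertices active once the process stabilizes.

    graph -- dict mapping each vertex to an iterable of neighbors
    thr   -- dict mapping each vertex v to its threshold thr(v)
    seeds -- initially activated vertices
    """
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for v in graph:
            if v not in active:
                # v activates once enough of its neighbors are active.
                if sum(1 for u in graph[v] if u in active) >= thr[v]:
                    active.add(v)
                    changed = True
    return active
```

The optimization problem then asks for a seed set of size k maximizing the size of propagate(graph, thr, seeds).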
mathematical foundations of computer science | 2012
Cristina Bazgan; Morgan Chopin
In this paper, we introduce the Robust Set problem: given a graph G = (V,E), a threshold function t : V → N and an integer k, find a subset of vertices V′ ⊆ V of size at least k such that every vertex v in G has fewer than t(v) neighbors in V′. This problem occurs in the context of the spread of undesirable agents through a network (viruses, ideas, fire, …). Informally speaking, the problem asks to find the largest subset of vertices with the property that if anything bad happens in it, then this has no consequences for the remaining graph. The threshold t(v) of a vertex v represents its reliability regarding its neighborhood, that is, how many neighbors can be infected before v itself becomes infected. We study in this paper the parameterized complexity of Robust Set and the approximation of the associated maximization problem. When the parameter is k, we show that this problem is W[2]-complete in general and W[1]-complete if all thresholds are bounded by a constant. Moreover, we prove that, if P ≠ NP, the maximization version is not n^{1−ε}-approximable for any ε > 0, even when all thresholds are at most two. When each threshold is equal to the degree of the vertex, we show that k-Robust Set is fixed-parameter tractable for parameter k and the maximization version is APX-complete. We give a polynomial-time algorithm for graphs of bounded treewidth and a PTAS for planar graphs. Finally, we show that the parametric dual problem (n−k)-Robust Set is fixed-parameter tractable for a large family of threshold functions.
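The defining property is easy to state in code. This short Python predicate (illustrative, not from the paper; thresholds given as a dict) tests whether a candidate vertex subset is robust, i.e., whether every vertex v of the graph has fewer than t(v) neighbors inside the subset.

```python
# Sketch: check the Robust Set property for a candidate subset S.
def is_robust(graph, t, S):
    """True iff every vertex v has fewer than t(v) neighbors in S.

    graph -- dict mapping each vertex to an iterable of neighbors
    t     -- dict mapping each vertex v to its threshold t(v)
    S     -- candidate vertex subset
    """
    return all(sum(1 for u in graph[v] if u in S) < t[v] for v in graph)
```

The maximization version then asks for the largest S satisfying this predicate.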
conference on computability in europe | 2014
Cristina Bazgan; Morgan Chopin; André Nichterlein; Florian Sikora
In this paper, we consider the Target Set Selection problem: given a graph and a threshold value thr(v) for each vertex v of the graph, find a minimum-size vertex subset to “activate” such that all the vertices of the graph are activated at the end of the propagation process. A vertex v is activated during the propagation process if at least thr(v) of its neighbors are activated. This problem models several practical issues such as faults in distributed networks or word-of-mouth recommendations in social networks. We show that for any functions f and ρ this problem cannot be approximated within a factor of ρ(k) in f(k) · n^{O(1)} time, unless FPT = W[P], even for restricted thresholds (namely constant and majority thresholds). We also study the cardinality-constrained maximization and minimization versions of the problem, for which we prove similar hardness results.
international symposium on algorithms and computation | 2014
Faisal N. Abu-Khzam; Cristina Bazgan; Morgan Chopin; Henning Fernau
Kernelization algorithms in the context of Parameterized Complexity are often based on a combination of reduction rules and combinatorial insights. In this paper we present a similar strategy for obtaining polynomial-time approximation algorithms. Our method features the use of approximation-preserving reductions, akin to the notion of parameterized reductions. We exemplify this method to obtain the currently best approximation algorithms for Harmless Set, Differential and Multiple Nonblocker, all of which can be considered in the context of securing networks or information propagation.
Theoretical Computer Science | 2017
Janka Chlebíková; Morgan Chopin
We consider the complexity of the firefighter problem where a budget of b ≥ 1 firefighters is available at each time step. This problem is known to be NP-complete even on trees of degree at most three when b = 1 [14] and on trees of bounded degree (b + 3) for any fixed b ≥ 2 [4]. In this paper we provide further insight into the complexity landscape of the problem by showing a complexity dichotomy result with respect to the parameters pathwidth and maximum degree of the input graph. More precisely, first, we prove that the problem is NP-complete even on trees of pathwidth at most three for any b ≥ 1. Then we show that the problem turns out to be fixed-parameter tractable with respect to the combined parameter “pathwidth” and “maximum degree” of the input graph. Finally, we show that the problem remains NP-complete on very dense graphs, namely co-bipartite graphs, but is fixed-parameter tractable with respect to the parameter “cluster vertex deletion”.
Discrete Applied Mathematics | 2017
René van Bevern; Robert Bredereck; Morgan Chopin; Sepp Hartung; Falk Hüffner; André Nichterlein; Ondřej Suchý
Finding the origin of short phrases propagating through the web has been formalized by Leskovec et al. (2009) as DAG Partitioning: given an arc-weighted directed acyclic graph on n vertices and m arcs, delete arcs with total weight at most k such that each resulting weakly connected component contains exactly one sink (a vertex without outgoing arcs). DAG Partitioning is NP-hard. We show an algorithm to solve DAG Partitioning in O(2^k · (n+m)) time, that is, in linear time for fixed k. We complement it with linear-time executable data reduction rules. Our experiments show that, in combination, they can optimally solve DAG Partitioning on simulated citation networks within five minutes for k ≤ 190 and m being 10^7 and larger. We use our obtained optimal solutions to evaluate the solution quality of Leskovec et al.'s heuristic. We show that Leskovec et al.'s heuristic works optimally on trees and generalize this result by showing that DAG Partitioning is solvable in 2^{O(t^2)} · n time if a width-t tree decomposition of the input graph is given. Thus, we improve an algorithm and answer an open question of Alamdari and Mehrabian (2012). We complement our algorithms by lower bounds on the running time of exact algorithms and on the effectiveness of data reduction.