Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Fabien Lauer is active.

Publication


Featured research published by Fabien Lauer.


IEEE Transactions on Automatic Control | 2014

A Difference of Convex Functions Algorithm for Switched Linear Regression

Tao Pham Dinh; Hoai Minh Le; Hoai An Le Thi; Fabien Lauer

This technical note deals with switched linear system identification and more particularly aims at solving switched linear regression problems in a large-scale setting with both numerous data and many parameters to learn. We consider the recent minimum-of-error framework with a quadratic loss function, in which an objective function based on a sum of minimum errors with respect to multiple submodels is to be minimized. The technical note proposes a new approach to the optimization of this nonsmooth and nonconvex objective function, which relies on Difference of Convex (DC) functions programming. In particular, we formulate a proper DC decomposition of the objective function, which allows us to derive a computationally efficient DC algorithm. Numerical experiments show that the method can efficiently and accurately learn switching models in large dimensions and from many data points.
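For readers unfamiliar with the minimum-of-errors formulation, the sketch below spells out the kind of objective described and one standard DC decomposition of a pointwise minimum of convex terms. The notation (s submodels with parameters θ_j, data (x_i, y_i), i = 1, …, N) is ours, and the decomposition shown is a generic one, not necessarily the exact choice made in the technical note.

```latex
% Min-of-errors objective for switched linear regression with s submodels
J(\theta_1,\dots,\theta_s) \;=\; \sum_{i=1}^{N} \min_{j \in \{1,\dots,s\}} \big( y_i - \theta_j^\top x_i \big)^2 ,
\qquad e_{ij} := \big( y_i - \theta_j^\top x_i \big)^2 .

% A generic DC decomposition J = G - H, using
% \min_j e_{ij} = \sum_j e_{ij} - \max_j \sum_{k \neq j} e_{ik} :
G(\theta) = \sum_{i=1}^{N} \sum_{j=1}^{s} e_{ij} ,
\qquad
H(\theta) = \sum_{i=1}^{N} \max_{j \in \{1,\dots,s\}} \sum_{k \neq j} e_{ik} .
```

Both G and H are convex in the parameters (H is a sum of pointwise maxima of convex functions), so a DC algorithm can alternate between linearizing H at the current iterate and minimizing the resulting convex majorizer of J, each such step being a tractable convex problem.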


Automatica | 2015

On the complexity of piecewise affine system identification

Fabien Lauer

The paper provides results regarding the computational complexity of hybrid system identification. More precisely, we focus on the estimation of piecewise affine (PWA) maps from input-output data and analyze the complexity of computing a global minimizer of the error. Previous work showed that a global solution could be obtained for continuous PWA maps with a worst-case complexity exponential in the number of data. In this paper, we show how global optimality can be reached for a slightly more general class of possibly discontinuous PWA maps with a complexity only polynomial in the number of data, however with an exponential complexity with respect to the data dimension. This result is obtained via an analysis of the intrinsic classification subproblem of associating the data points to the different modes. In addition, we prove that the problem is NP-hard, and thus that the exponential complexity in the dimension is a natural expectation for any exact algorithm.
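As a rough illustration of why the classification subproblem drives the complexity (a generic formulation in our notation, not the paper's exact statement), the estimation problem can be written as a joint optimization over submodel parameters and a mode assignment:

```latex
% Joint estimation problem behind PWA identification (generic notation):
\min_{\theta_1,\dots,\theta_s,\; q} \; \sum_{i=1}^{N} \ell\big( y_i - \theta_{q(x_i)}^\top x_i \big),
\qquad q : \mathbb{R}^d \to \{1,\dots,s\} \ \text{induced by a polyhedral partition of the regressor space.}
```

Once the assignment of the N points to the s modes is fixed, the inner problem reduces to s ordinary regressions. Roughly, the number of assignments induced by linear boundaries grows like N^d for a single hyperplane separating N points in dimension d, which is polynomial in N for fixed d but exponential in d, matching the complexity trade-off stated above.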


International Conference on Hybrid Systems: Computation and Control | 2013

Learning nonlinear hybrid systems: from sparse optimization to support vector regression

Van Luong Le; Fabien Lauer; Laurent Bako; Gérard Bloch

This paper deals with the identification of hybrid systems switching between nonlinear subsystems of unknown structure and focuses on the connections with a family of machine learning algorithms known as support vector machines. In particular, we consider a recent approach to nonlinear hybrid system identification based on a convex relaxation of a sparse optimization problem. In this approach, the submodels are iteratively estimated one by one by maximizing the sparsity of the corresponding error vector. We extend this approach in several ways. First, we relax the sparsity condition by introducing robust sparsity, which can be optimized through the minimization of a modified l1-norm or, equivalently, of the ε-insensitive loss function. Then, we show that, depending on the choice of regularizer, the method is equivalent to different forms of support vector regression. More precisely, the submodels can be estimated by iteratively solving a classical support vector regression problem, in which the sparsity of support vectors relates to the sparsity of the error vector in the considered hybrid system identification framework. This allows us to extend theoretical results as well as efficient optimization algorithms from the field of machine learning to the hybrid system framework.
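As a rough, non-authoritative illustration of the one-submodel-at-a-time idea described above, the sketch below estimates each submodel with an off-the-shelf ε-insensitive support vector regression and then peels off the points that fall inside the ε-tube. The RBF kernel, the constants, and the peeling rule are assumptions made for this example, not the paper's algorithm.

```python
# Illustrative sketch only (not the algorithm of the paper): estimate the submodels of a
# nonlinear hybrid system one at a time with epsilon-insensitive support vector regression,
# then peel off the points that fall inside the epsilon-tube.
import numpy as np
from sklearn.svm import SVR

def iterative_svr_submodels(X, y, n_modes, eps=0.05, C=10.0):
    X, y = np.asarray(X, dtype=float), np.asarray(y, dtype=float)
    remaining = np.arange(len(y))                  # indices not yet assigned to a submodel
    submodels, assignment = [], np.full(len(y), -1)
    for mode in range(n_modes):
        if remaining.size == 0:
            break
        svr = SVR(kernel="rbf", C=C, epsilon=eps)
        svr.fit(X[remaining], y[remaining])
        residuals = np.abs(svr.predict(X[remaining]) - y[remaining])
        explained = residuals <= eps               # points inside the epsilon-tube
        assignment[remaining[explained]] = mode    # assign them to the current mode
        submodels.append(svr)
        remaining = remaining[~explained]          # keep only unexplained points
    return submodels, assignment
```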


International Workshop on Machine Learning for Signal Processing | 2012

Learning smooth models of nonsmooth functions via convex optimization

Fabien Lauer; Van Luong Le; Gérard Bloch

This paper proposes a learning framework and a set of algorithms for nonsmooth regression, i.e., for learning piecewise smooth target functions with discontinuities in the function itself or the derivatives at unknown locations. In the proposed approach, the model belongs to a class of smooth functions. Though constrained to be globally smooth, the trained model can have very large derivatives at particular locations to approximate the nonsmoothness of the target function. This is obtained through the definition of new regularization terms which penalize the derivatives in a location-dependent manner and training algorithms in the form of convex optimization problems. Examples of application to hybrid dynamical system identification and image reconstruction are provided.
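The following toy sketch illustrates the flavour of a location-dependent derivative penalty on a 1-D grid: a single quadratic (hence convex) regularizer whose per-location weights are relaxed where a discontinuity is suspected. The discretization, the weights, and the closed-form solve are our assumptions for the example, not the regularizers or algorithms proposed in the paper.

```python
# Toy sketch of a location-dependent smoothness penalty (our assumptions, not the paper's
# regularizers): fit a 1-D signal f to noisy samples y while penalizing first differences
# with per-location weights, so that small weights allow very large local derivatives that
# mimic a discontinuity while the model stays globally smooth.
import numpy as np

def weighted_smooth_fit(y, weights):
    """Solve min_f ||f - y||^2 + sum_i weights[i] * (f[i+1] - f[i])^2, a convex problem."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)                 # (n-1) x n first-difference operator
    W = np.diag(weights)                           # location-dependent penalty weights
    return np.linalg.solve(np.eye(n) + D.T @ W @ D, y)

# Example: strong smoothing everywhere except around a suspected jump at index 50.
y = np.concatenate([np.zeros(50), np.ones(50)]) + 0.1 * np.random.randn(100)
w = np.full(99, 100.0)
w[48:52] = 1e-3                                    # let the model "bend" near the jump
f_hat = weighted_smooth_fit(y, w)
```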


Automatica | 2016

On the complexity of switching linear regression

Fabien Lauer

This technical note extends recent results on the computational complexity of globally minimizing the error of piecewise-affine models to the related problem of minimizing the error of switching linear regression models. In particular, we show that, on the one hand, the problem is NP-hard, but, on the other hand, it admits a polynomial-time algorithm with respect to the number of data points for any fixed data dimension and number of modes.


Journal of Global Optimization | 2015

Finding sparse solutions of systems of polynomial equations via group-sparsity optimization

Fabien Lauer; Henrik Ohlsson

The paper deals with the problem of finding sparse solutions to systems of polynomial equations possibly perturbed by noise. In particular, we show how these solutions can be recovered from group-sparse solutions of a derived system of linear equations. Then, two approaches are considered to find these group-sparse solutions. The first one is based on a convex relaxation resulting in a second-order cone programming formulation which can benefit from efficient reweighting techniques for sparsity enhancement. For this approach, sufficient conditions for the exact recovery of the sparsest solution to the polynomial system are derived in the noiseless setting, while stable recovery results are obtained for the noisy case. Though lacking a similar analysis, the second approach provides a more computationally efficient algorithm based on a greedy strategy adding the groups one-by-one. With respect to previous work, the proposed methods recover the sparsest solution in a very short computing time while remaining at least as accurate in terms of the probability of success. This probability is empirically analyzed to emphasize the relationship between the ability of the methods to solve the polynomial system and the sparsity of the solution.
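To make the lifting concrete (in our notation and under the assumption of a noise bound ε; the exact formulation in the paper may differ), stacking the monomials of x into a vector m turns each polynomial equation into a linear one, Am = b, and a sparse x corresponds to a group-sparse m when each group G_j collects all monomials involving the variable x_j. The convex relaxation mentioned above then takes the form of a second-order cone program:

```latex
% Convex relaxation over the monomial vector m (groups G_j gather the monomials
% involving the variable x_j; epsilon is a noise bound, 0 in the noiseless case):
\min_{m} \; \sum_{j=1}^{d} \big\| m_{G_j} \big\|_2
\quad \text{subject to} \quad \big\| A m - b \big\|_2 \le \epsilon .
```

A group that is entirely zero indicates that x_j does not appear in the recovered solution; in the noiseless case ε = 0 and the constraint reduces to Am = b.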


Conference on Decision and Control | 2014

Piecewise smooth system identification in reproducing kernel Hilbert space

Fabien Lauer; Gérard Bloch

The paper extends the recent approach of Ohlsson and Ljung for piecewise affine system identification to the nonlinear case while taking a clustering point of view. In this approach, the problem is cast as the minimization of a convex cost function implementing a trade-off between the fit to the data and a sparsity prior on the number of pieces. Here, we consider the nonlinear case of piecewise smooth system identification without prior knowledge on the type of nonlinearities involved. This is tackled by simultaneously learning a collection of local models from a reproducing kernel Hilbert space via the minimization of a convex functional, for which we prove a representer theorem that provides the explicit form of the solution. An example of application to piecewise smooth system identification shows that both the mode and the nonlinear local models can be accurately estimated.
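To give a sense of the kind of convex functional involved (one plausible instantiation in the sum-of-norms clustering spirit of Ohlsson and Ljung, written in our notation; the paper's exact functional and representer theorem are not reproduced here), one can attach a local model f_i from an RKHS H to each data point and trade data fit against a sparsity-inducing coupling between local models:

```latex
% One plausible sum-of-norms clustering functional over local models f_i in an RKHS H:
\min_{f_1,\dots,f_N \in \mathcal{H}} \; \sum_{i=1}^{N} \big( y_i - f_i(x_i) \big)^2
\;+\; \lambda \sum_{i < j} w_{ij} \, \big\| f_i - f_j \big\|_{\mathcal{H}} .
```

The unsquared norm of the differences encourages many local models to coincide exactly, so the data points are effectively clustered into a small number of pieces, and a representer theorem then expresses each f_i as a finite kernel expansion over the data.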


Pattern Recognition Letters | 2018

On the exact minimization of saturated loss functions for robust regression and subspace estimation

Fabien Lauer

This paper deals with robust regression and subspace estimation and more precisely with the problem of minimizing a saturated loss function. In particular, we focus on computational complexity issues and show that an exact algorithm with polynomial time complexity with respect to the number of data can be devised for robust regression and subspace estimation. This result is obtained by adopting a classification point of view and relating the problems to the search for a linear model that can approximate the maximal number of points with a given error. Approximate variants of the algorithms based on random sampling are also discussed and experiments show that they offer an accuracy gain over the traditional RANSAC for a similar algorithmic simplicity.
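The sketch below is a minimal random-sampling baseline in the spirit of the approximate variants mentioned above: a RANSAC-like loop whose candidate models are scored with a saturated quadratic loss. The sampling scheme, the saturation level eps, and the trial budget are assumptions for illustration, not the exact algorithms of the paper.

```python
# Minimal random-sampling baseline scored with a saturated quadratic loss (an
# illustrative RANSAC-like loop, not the paper's exact algorithms).
import numpy as np

def saturated_loss_regression(X, y, eps=0.1, n_trials=200, seed=None):
    rng = np.random.default_rng(seed)
    X, y = np.asarray(X, dtype=float), np.asarray(y, dtype=float)
    n, d = X.shape
    best_w, best_loss = None, np.inf
    for _ in range(n_trials):
        idx = rng.choice(n, size=d, replace=False)           # minimal subset of d points
        w, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)  # candidate linear model
        residuals = X @ w - y
        loss = np.minimum(residuals ** 2, eps ** 2).sum()    # saturated quadratic loss
        if loss < best_loss:
            best_w, best_loss = w, loss
    return best_w, best_loss
```

Minimizing the saturated loss is closely related to maximizing the number of points approximated within error eps, which is the classification point of view mentioned in the abstract.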


International Orthopaedics | 2017

The horizontal plane appearances of scoliosis: what information can be obtained from top-view images?

Tamás S. Illés; Máté Burkus; Szabolcs Somoskeőy; Fabien Lauer; F. Lavaste; Jean Dubousset

Purpose: A posterior-anterior vertebral vector is proposed to facilitate visualization and understanding of scoliosis. The aim of this study was to highlight the interest of using vertebral vectors, especially in the horizontal plane, in clinical practice. Methods: We used an EOS two-/three-dimensional (2D/3D) system and its sterEOS 3D software for 3D reconstruction of 139 normal and 814 scoliotic spines, of which 95 cases were also analyzed pre-operatively and post-operatively. Vertebral vectors were generated for each case. Vertebral vectors have starting points in the middle of the interpedicular segment, while they are parallel to the upper plate, ending in the middle of the segment joining the anterior end-plate points, thus defining the posterior-anterior axis of vertebrae. To illustrate what information could be obtained from vertebral vector-based top-view images, representative cases of a normal spine and a thoracic scoliosis are presented. Results: For a normal spine, vector projections in the transverse plane are aligned with the posterior-anterior anatomical axis. For a scoliotic spine, vector projections in the horizontal plane provide information on the lateral decompensation of the spine and the lateral displacement of vertebrae. In the horizontal plane view, vertebral rotation and projections of the sagittal curves can also be analyzed simultaneously. Conclusions: The use of the posterior-anterior vertebral vector facilitates the understanding of the 3D nature of scoliosis. The approach used is simple. These results are sufficient for a first visual analysis furnishing significant clinical information in all three anatomical planes. This visualization represents a reasonable compromise between mathematical purity and practical use.
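Purely as a geometric illustration of the construction described (not the sterEOS implementation; the landmark names, coordinate conventions, and the choice of the x-y plane as the horizontal plane are assumptions), the vertebral vector and its top-view projection can be computed from four 3-D landmarks as follows.

```python
# Toy geometric sketch only: build a posterior-anterior vertebral vector from the
# interpedicular midpoint to the midpoint of the two anterior end-plate points,
# then project it onto the horizontal plane.
import numpy as np

def vertebral_vector(pedicle_left, pedicle_right, anterior_point_1, anterior_point_2):
    start = (np.asarray(pedicle_left, float) + np.asarray(pedicle_right, float)) / 2.0
    end = (np.asarray(anterior_point_1, float) + np.asarray(anterior_point_2, float)) / 2.0
    return start, end - start

def top_view(start, vector):
    """Project the vector onto the horizontal (x-y) plane, assuming z is vertical."""
    v_xy = vector[:2]
    axial_rotation_deg = np.degrees(np.arctan2(v_xy[1], v_xy[0]))  # in-plane orientation
    lateral_position = start[:2]                                   # displacement of the start point
    return v_xy, axial_rotation_deg, lateral_position
```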


Pattern Recognition in Bioinformatics | 2012

Cascading discriminant and generative models for protein secondary structure prediction

Fabienne Thomarat; Fabien Lauer; Yann Guermeur

Most of the state-of-the-art methods for protein secondary structure prediction are complex combinations of discriminant models. They apply a local approach of the prediction which is known to induce a limit on the expected prediction accuracy. A priori, the use of generative models should make it possible to overcome this limitation. However, among the numerous hidden Markov models which have been dedicated to this task over more than two decades, none has come close to providing comparable performance. A major reason for this phenomenon is provided by the nature of the relevant information. Indeed, it is well known that irrespective of the model implemented, the prediction should benefit significantly from the availability of evolutionary information. Currently, this knowledge is embedded in position-specific scoring matrices which cannot be processed easily with hidden Markov models. With this observation at hand, the next significant advance should come from making the best of the two approaches, i.e., using a generative model on top of discriminant models. This article introduces the first hybrid architecture of this kind with state-of-the-art performance. The conjunction of the two levels of treatment makes it possible to optimize the recognition rate both at the residue level and at the segment level.
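As a toy illustration of the "generative model on top of discriminant models" idea (an assumption-laden sketch, not the hybrid architecture of the article), per-residue class probabilities produced by any discriminant predictor can be used as emission scores of a small three-state model over helix (H), strand (E), and coil (C), with Viterbi decoding enforcing segment-level consistency; the transition matrix below is made up for the example.

```python
# Toy illustration of a generative layer on top of discriminant outputs (our assumptions,
# not the article's architecture): per-residue class probabilities serve as emission
# scores of a three-state model, and Viterbi decoding enforces segment-level consistency.
import numpy as np

def viterbi(emission_probs, transition, initial):
    """emission_probs: (L, K) per-residue probabilities from the discriminant stage."""
    L, K = emission_probs.shape
    log_e = np.log(emission_probs + 1e-12)
    log_t = np.log(transition + 1e-12)
    delta = np.log(initial + 1e-12) + log_e[0]
    backptr = np.zeros((L, K), dtype=int)
    for t in range(1, L):
        scores = delta[:, None] + log_t            # scores[i, j]: best path ending in i, then moving to j
        backptr[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_e[t]
    path = np.empty(L, dtype=int)
    path[-1] = delta.argmax()
    for t in range(L - 2, -1, -1):                 # backtrack the most likely state sequence
        path[t] = backptr[t + 1, path[t + 1]]
    return path                                     # e.g. 0 = H, 1 = E, 2 = C

# Self-transitions close to 1 yield smoother, segment-like predictions than taking the
# per-residue argmax of the discriminant outputs alone.
transition = np.array([[0.90, 0.05, 0.05],
                       [0.05, 0.90, 0.05],
                       [0.05, 0.05, 0.90]])
initial = np.full(3, 1.0 / 3.0)
```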

Collaboration


Dive into Fabien Lauer's collaborations.

Top Co-Authors

F. Lavaste
Arts et Métiers ParisTech

Jean Dubousset
Arts et Métiers ParisTech

Tamás S. Illés
Université libre de Bruxelles

Henrik Ohlsson
University of California