Network


Latest external collaborations at the country level.

Hotspot


Research topics where Hoai Minh Le is active.

Publication


Featured research published by Hoai Minh Le.


Advances in Data Analysis and Classification | 2008

A DC programming approach for feature selection in support vector machines learning

Hoai An Le Thi; Hoai Minh Le; Van Vinh Nguyen; Tao Pham Dinh

Feature selection consists of choosing a subset of available features that captures the relevant properties of the data. In supervised pattern classification, a good choice of features is fundamental for building compact and accurate classifiers. In this paper, we develop an efficient feature selection method using the zero-norm l0 in the context of support vector machines (SVMs). The discontinuity of l0 at the origin makes the corresponding optimization problem difficult to solve. To overcome this drawback, we use a robust DC (difference of convex functions) programming approach, a general framework for non-convex continuous optimization. We consider an appropriate continuous approximation to l0 such that the resulting problem can be formulated as a DC program. Our DC algorithm (DCA) converges in finitely many iterations and requires solving one linear program per iteration. Computational experiments on standard datasets, including the challenging feature-selection problems of the NIPS 2003 feature selection challenge and gene selection for cancer classification, show that the proposed method is promising: while it suppresses more than 99% of the features, it still provides good classification. Moreover, the comparative results illustrate the superiority of the proposed approach over standard methods such as classical SVMs and feature selection concave.
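The DCA scheme this abstract relies on (decompose the objective as a difference of convex functions, linearize the concave part at the current iterate, solve the resulting convex subproblem) can be illustrated on a toy problem. The sketch below is a minimal illustration of the general principle under a made-up objective, not the paper's SVM linear-program formulation:

```python
# Minimal DCA sketch on the toy DC program
#   minimize f(x) = g(x) - h(x),  g(x) = x**4,  h(x) = 8*x**2 (both convex).
# Each iteration linearizes h at x_k and minimizes the convex surrogate
#   g(x) - h'(x_k)*x, which here has the closed-form solution 4*x**3 = y_k.

def dca_toy(x0, iters=100):
    x = x0
    for _ in range(iters):
        y = 16.0 * x                                   # y_k = h'(x_k)
        # argmin_x g(x) - y*x  <=>  4*x**3 = y  (real cube root)
        x = (abs(y) / 4.0) ** (1.0 / 3.0) * (1 if y >= 0 else -1)
    return x

print(dca_toy(1.0))  # iterates increase monotonically toward the critical point x = 2
```

Starting from x0 = 1 the iterates converge to x = 2; starting from x0 = -1 they converge to -2, illustrating DCA's dependence on the starting point.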


European Journal of Operational Research | 2015

DC approximation approaches for sparse optimization

H.A. Le Thi; T. Pham Dinh; Hoai Minh Le; Xuan Thanh Vo

Sparse optimization refers to an optimization problem involving the zero-norm in the objective or constraints. In this paper, nonconvex approximation approaches for sparse optimization are studied from a unifying point of view within the DC (Difference of Convex functions) programming framework. Considering a common DC approximation of the zero-norm that includes all standard sparsity-inducing penalty functions, we study the consistency between global minimizers (resp. local minimizers) of the approximate and original problems. We show that, in several cases, some global minimizers (resp. local minimizers) of the approximate problem are also global minimizers (resp. local minimizers) of the original problem. Using exact penalty techniques in DC programming, we prove stronger results for some particular approximations: the approximate problem, with suitable parameters, is equivalent to the original problem. The efficiency of several sparsity-inducing penalty functions is fully analyzed. Four DCA (DC Algorithm) schemes are developed that cover all standard algorithms in nonconvex sparse approximation approaches as special versions. They can be viewed as an \ell_{1}-perturbed algorithm / reweighted-\ell_{1} algorithm / reweighted-\ell_{1} algorithm. We offer a unifying nonconvex approximation approach, with solid theoretical tools as well as efficient algorithms based on DC programming and DCA, to tackle the zero-norm and sparse optimization. As an application, we implement our methods for the feature selection in SVM (Support Vector Machines) problem and perform empirical comparative numerical experiments on the proposed algorithms with various approximation functions.
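One standard zero-norm approximation covered by this kind of framework, the capped-\ell_1 function, admits an explicit DC decomposition that can be checked numerically. A small sketch (the function and the parameter theta are standard in the sparse-optimization literature; the code itself is illustrative only):

```python
import numpy as np

# The capped-l1 function phi(t) = min(1, |t|/theta) approximates the zero-norm
# (step function) and is a DC function: phi = g - h with both parts convex,
#   g(t) = |t| / theta
#   h(t) = max(0, |t| / theta - 1).
# theta is the approximation parameter (smaller theta = closer to the zero-norm).

def phi(t, theta=0.5):
    return np.minimum(1.0, np.abs(t) / theta)

def g(t, theta=0.5):
    return np.abs(t) / theta

def h(t, theta=0.5):
    return np.maximum(0.0, np.abs(t) / theta - 1.0)

# Pointwise check of the decomposition on a grid:
ts = np.linspace(-3, 3, 1001)
assert np.allclose(phi(ts), g(ts) - h(ts))
```

Applying DCA to such a decomposition linearizes h at each iterate, which is exactly what produces the reweighted-\ell_1-type subproblems mentioned in the abstract.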


Machine Learning | 2015

Feature selection in machine learning: an exact penalty approach using a Difference of Convex function Algorithm

Hoai An Le Thi; Hoai Minh Le; Tao Pham Dinh

We develop an exact penalty approach for feature selection in machine learning via the zero-norm.


Journal of Global Optimization | 2013

Binary classification via spherical separator by DC programming and DCA

Hoai An Le Thi; Hoai Minh Le; Tao Pham Dinh; Ngai Van Huynh



Advances in Data Analysis and Classification | 2007

Fuzzy clustering based on nonconvex optimisation approaches using difference of convex (DC) functions algorithms

Hoai An Le Thi; Hoai Minh Le; Tao Pham Dinh



Neurocomputing | 2015

Sparse semi-supervised support vector machines by DC programming and DCA

Hoai Minh Le; Hoai An Le Thi; Manh Cuong Nguyen



IEEE Transactions on Automatic Control | 2014

A Difference of Convex Functions Algorithm for Switched Linear Regression

Tao Pham Dinh; Hoai Minh Le; Hoai An Le Thi; Fabien Lauer



Neural Computation | 2013

Block clustering based on difference of convex functions (DC) programming and DC algorithms

Hoai Minh Le; Hoai An Le Thi; Tao Pham Dinh; Van Ngai Huynh



Asian Conference on Intelligent Information and Database Systems | 2013

Sparse signal recovery by difference of convex functions algorithms

Hoai An Le Thi; Bich Thuy Nguyen Thi; Hoai Minh Le

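The reweighted-\ell_1 idea that recurs across these sparse-optimization papers can be sketched as follows. This is an illustrative stand-in (a generic reweighted-\ell_1 scheme with an ISTA inner solver on synthetic data), not the exact DCA scheme of any paper above; all parameter values and the problem setup are made up:

```python
import numpy as np

# Reweighted-l1 sketch for sparse signal recovery:
#   min_x 0.5*||A x - b||^2 + lam * sum_i w_i * |x_i|
# where the weights w_i = 1/(|x_i| + eps) are refreshed between rounds,
# so small entries are penalized more heavily in the next round.

def soft(v, t):
    """Elementwise soft-thresholding: prox of the weighted l1 term."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def reweighted_l1(A, b, lam=0.01, eps=0.1, rounds=5, inner=300):
    m, n = A.shape
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz const of the gradient
    x = np.zeros(n)
    w = np.ones(n)                           # round 1 = plain l1 (lasso)
    for _ in range(rounds):
        for _ in range(inner):               # ISTA on the current weighted problem
            grad = A.T @ (A @ x - b)
            x = soft(x - step * grad, step * lam * w)
        w = 1.0 / (np.abs(x) + eps)          # reweight for the next round
    return x

# Toy recovery problem: a 4-sparse signal from 50 random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100)) / np.sqrt(50)
x_true = np.zeros(100)
x_true[[3, 17, 42, 77]] = [1.0, -1.0, 1.0, -1.0]
b = A @ x_true
x_hat = reweighted_l1(A, b)
```

In the DC-programming view, each reweighting step corresponds to linearizing the concave part of a nonconvex sparsity penalty, so the scheme is a special case of DCA applied to an approximate zero-norm problem.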


Journal of Global Optimization | 2010

A combined DCA-GA for constructing highly nonlinear balanced Boolean functions in cryptography

Hoai Minh Le; Hoai An Le Thi; Tao Pham Dinh; Pascal Bouvry

Collaboration


Hoai Minh Le's main collaborations and co-authors.

Top Co-Authors:

Bach Tran
University of Lorraine

Ngai Van Huynh
École Normale Supérieure

H.A. Le Thi
University of Lorraine