Neural Computing and Applications | 2019

Feature selection with MCP$^2$ regularization


Abstract


Feature selection, as a fundamental component of building robust models, plays an important role in many machine learning and data mining tasks. Recently, with the development of sparsity research, both theoretical and empirical studies have suggested that sparsity is one of the intrinsic properties of real-world data, and sparsity regularization has been successfully applied to feature selection models. In view of the remarkable performance of non-convex regularization, in this paper we propose a novel non-convex yet Lipschitz continuous sparsity regularization term, named MCP$^2$, and apply it to feature selection. To solve the resulting non-convex model, we also present a new algorithm within the framework of the ConCave–Convex Procedure (CCCP). Experimental results on benchmark datasets demonstrate the effectiveness of the proposed method.
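For context, the regularizer's name suggests it builds on the minimax concave penalty (MCP); the exact form of the MCP$^2$ term and its decomposition are given only in the full text, so the equations below show, as background, the standard MCP penalty and the generic ConCave–Convex Procedure update under assumed notation ($\lambda$, $\gamma$, $w$, $f_{\mathrm{vex}}$, $f_{\mathrm{cave}}$), not the paper's own formulation.

$$
P_{\lambda,\gamma}(t) \;=\;
\begin{cases}
\lambda |t| - \dfrac{t^2}{2\gamma}, & |t| \le \gamma\lambda,\\[4pt]
\dfrac{\gamma\lambda^2}{2}, & |t| > \gamma\lambda,
\end{cases}
\qquad \gamma > 1,\ \lambda > 0.
$$

Writing a regularized objective as the sum of a convex part $f_{\mathrm{vex}}$ and a concave part $f_{\mathrm{cave}}$, each CCCP iteration replaces the concave part by its linearization at the current iterate and solves the resulting convex subproblem:

$$
w^{(k+1)} \;=\; \operatorname*{arg\,min}_{w}\; f_{\mathrm{vex}}(w) \;+\; \nabla f_{\mathrm{cave}}\!\big(w^{(k)}\big)^{\top} w .
$$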

Volume 31
Pages 6699-6709
DOI 10.1007/s00521-018-3500-7
Language English
Journal Neural Computing and Applications
