Cognitive Computation | 2019

Joint Sparse Regularization for Dictionary Learning


Abstract


As a powerful data representation framework, dictionary learning has emerged in many domains, including machine learning, signal processing, and statistics. Most existing dictionary learning methods use the ℓ0 or ℓ1 norm as the regularizer to promote sparsity, which neglects the redundant information in the dictionary. In this paper, a class of joint sparse regularizers is introduced to dictionary learning, leading to a compact dictionary. Unlike previous works, which obtain each sparse representation independently, we consider all representations in the dictionary simultaneously. An efficient iterative solver based on the ConCave-Convex Procedure (CCCP) framework and the Lagrangian dual is developed to tackle the resulting model. Further, building on dictionary learning with joint sparse regularization, we consider a multi-layer structure, which can extract more abstract representations of the data. Numerical experiments are conducted on several publicly available datasets. The experimental results demonstrate the effectiveness of joint sparse regularization for dictionary learning.
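To make the idea of joint sparsity concrete, the following is a minimal NumPy sketch, not the paper's CCCP/Lagrangian-dual solver. It uses an ℓ2,1 (row-wise) penalty on the coefficient matrix as one common instance of a joint sparse regularizer: because whole rows of the code matrix are shrunk to zero together, the corresponding dictionary atoms are dropped for all samples at once, yielding a compact dictionary. The function names and the alternating proximal-gradient/least-squares scheme are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def prox_l21(A, t):
    # Proximal operator of t * ||A||_{2,1}: row-wise soft-thresholding.
    # Rows whose l2 norm falls below t are zeroed jointly, which removes
    # the matching dictionary atom for every sample simultaneously.
    norms = np.linalg.norm(A, axis=1, keepdims=True)
    scale = np.maximum(1.0 - t / np.maximum(norms, 1e-12), 0.0)
    return A * scale

def dict_learn_l21(X, n_atoms, lam=0.05, n_iter=50, seed=0):
    # Alternating minimization (illustrative, not the paper's algorithm):
    #   1) proximal-gradient step on the codes Z with the l2,1 penalty,
    #   2) least-squares dictionary update restricted to the atoms still
    #      in use, followed by column normalization.
    rng = np.random.default_rng(seed)
    d, n = X.shape
    D = rng.standard_normal((d, n_atoms))
    D /= np.linalg.norm(D, axis=0)
    Z = np.zeros((n_atoms, n))
    for _ in range(n_iter):
        # Step size from the Lipschitz constant of the quadratic term.
        step = 1.0 / (np.linalg.norm(D, 2) ** 2 + 1e-12)
        Z = prox_l21(Z - step * (D.T @ (D @ Z - X)), step * lam)
        used = np.linalg.norm(Z, axis=1) > 0
        if used.any():
            D[:, used] = X @ np.linalg.pinv(Z[used])
            D /= np.maximum(np.linalg.norm(D, axis=0), 1e-12)
    return D, Z
```

Counting the nonzero rows of `Z` after training shows how many atoms the joint penalty actually keeps; with a larger `lam`, more rows vanish and the effective dictionary shrinks.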

Pages 1-14
DOI 10.1007/s12559-019-09650-2
Language English
Journal Cognitive Computation
