IEEE Transactions on Systems, Man, and Cybernetics: Systems | 2021

A Fast Non-Negative Latent Factor Model Based on Generalized Momentum Method


Abstract


Non-negative latent factor (NLF) models can efficiently acquire useful knowledge from high-dimensional and sparse (HiDS) matrices filled with non-negative data. Single latent factor-dependent, non-negative and multiplicative update (SLF-NMU) is an efficient algorithm for building an NLF model on an HiDS matrix, yet it suffers from slow convergence. A momentum method is frequently adopted to accelerate a learning algorithm, but it is incompatible with algorithms that adopt gradients implicitly, such as SLF-NMU. To build a fast NLF (FNLF) model, we propose a generalized momentum method compatible with SLF-NMU. With it, we further propose a single latent factor-dependent, non-negative, multiplicative and momentum-incorporated update algorithm, thereby achieving an FNLF model. Empirical studies on six HiDS matrices from industrial applications indicate that an FNLF model outperforms an NLF model in terms of both convergence rate and accuracy in predicting missing data. Hence, compared with an NLF model, an FNLF model is more practical in industrial applications.
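To make the idea concrete, the following is a minimal, hedged sketch (not the authors' reference implementation) of a non-negative latent factor model trained on a set of known ratings with SLF-NMU-style multiplicative updates, augmented by a momentum-like velocity term. The rating triples, factor rank, momentum coefficient gamma, and the exact form of the momentum correction are illustrative assumptions rather than the paper's precise update rules.

```python
# Hedged sketch: NLF-style multiplicative updates with a momentum-like term.
# Shapes, data, and hyper-parameters are illustrative assumptions.
import numpy as np

def train_fnlf_sketch(triples, n_users, n_items, rank=4, epochs=100, gamma=0.9, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(0.1, 1.0, (n_users, rank))   # non-negative user factors
    Y = rng.uniform(0.1, 1.0, (n_items, rank))   # non-negative item factors
    VX = np.zeros_like(X)                        # momentum (velocity) buffers
    VY = np.zeros_like(Y)
    for _ in range(epochs):
        # accumulate numerators/denominators of the multiplicative update
        num_X = np.zeros_like(X); den_X = np.full_like(X, 1e-12)
        num_Y = np.zeros_like(Y); den_Y = np.full_like(Y, 1e-12)
        for u, i, r in triples:
            r_hat = X[u] @ Y[i]
            num_X[u] += Y[i] * r; den_X[u] += Y[i] * r_hat
            num_Y[i] += X[u] * r; den_Y[i] += X[u] * r_hat
        # multiplicative step keeps factors non-negative ...
        X_new = X * num_X / den_X
        Y_new = Y * num_Y / den_Y
        # ... followed by an assumed momentum-style correction that reuses the
        # latest step direction, clipped at zero to preserve non-negativity.
        VX = gamma * VX + (X_new - X)
        VY = gamma * VY + (Y_new - Y)
        X = np.maximum(X_new + gamma * VX, 1e-12)
        Y = np.maximum(Y_new + gamma * VY, 1e-12)
    return X, Y

if __name__ == "__main__":
    # tiny synthetic example: 3 users, 3 items, a few known non-negative entries
    data = [(0, 0, 5.0), (0, 1, 3.0), (1, 1, 4.0), (2, 2, 2.0)]
    X, Y = train_fnlf_sketch(data, n_users=3, n_items=3)
    print("predicted r(0,0):", X[0] @ Y[0])
```

The sketch only illustrates the general pattern: a multiplicative update computed from known entries, followed by a velocity term that reuses recent progress, which is the role the generalized momentum method plays in the FNLF model described above.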

Volume 51
Pages 610-620
DOI 10.1109/TSMC.2018.2875452
Language English
Journal IEEE Transactions on Systems, Man, and Cybernetics: Systems
