Neurocomputing | 2021

Fast training of deep LSTM networks with guaranteed stability for nonlinear system modeling


Abstract


Deep recurrent neural networks (RNNs), such as LSTM, have many advantages over feedforward networks for nonlinear system modeling. However, the most widely used training method, backpropagation through time (BPTT), is very slow. In this paper, by separating the LSTM cell into forward and recurrent models, we give a faster training method than BPTT. The deep LSTM is modified by combining a deep RNN with multilayer perceptrons (MLPs). Backpropagation-like training methods are proposed for training the deep RNN and the MLPs. The stability of these algorithms is demonstrated. Simulation results show that our fast training methods for LSTM outperform conventional approaches.
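The abstract's key idea is separating the LSTM cell into a forward model (driven by the current input) and a recurrent model (driven by the previous hidden state). The sketch below illustrates this split on a standard LSTM cell; the gate algebra is the conventional LSTM formulation, and the function and parameter names are illustrative assumptions, not the paper's actual algorithm for exploiting the split during training.

```python
# Hedged sketch: a standard LSTM cell with each gate's pre-activation
# split into a "forward" term (input x only) and a "recurrent" term
# (previous hidden state h only). The split is standard LSTM algebra;
# how the paper uses it to speed up training is not reproduced here.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_cell(x, h_prev, c_prev, params):
    """One LSTM step; params holds W* (input) and U* (recurrent) weights."""
    pre = {}
    for g in ("f", "i", "o", "c"):
        forward_part = params["W" + g] @ x          # forward model: depends on x only
        recurrent_part = params["U" + g] @ h_prev   # recurrent model: depends on h only
        pre[g] = forward_part + recurrent_part + params["b" + g]
    f = sigmoid(pre["f"])                       # forget gate
    i = sigmoid(pre["i"])                       # input gate
    o = sigmoid(pre["o"])                       # output gate
    c = f * c_prev + i * np.tanh(pre["c"])      # new cell state
    h = o * np.tanh(c)                          # new hidden state
    return h, c

# Tiny example: 3-dimensional input, 4-dimensional hidden state.
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
params = {}
for g in ("f", "i", "o", "c"):
    params["W" + g] = 0.1 * rng.standard_normal((n_hid, n_in))
    params["U" + g] = 0.1 * rng.standard_normal((n_hid, n_hid))
    params["b" + g] = np.zeros(n_hid)

h, c = lstm_cell(rng.standard_normal(n_in), np.zeros(n_hid), np.zeros(n_hid), params)
```

Because the forward and recurrent contributions are computed as separate terms, gradients with respect to the input-side weights W* can in principle be handled without unrolling the recurrence, which is the general motivation for avoiding full BPTT.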

Volume 422
Pages 85-94
DOI 10.1016/j.neucom.2020.09.030
Language English
Journal Neurocomputing
