An attempt of applying the Lagrange-type 1-step-ahead numerical differentiation method to optimize the SGD algorithm in deep learning


Abstract


The original stochastic gradient descent (SGD) update has the form of the forward Euler method, which suffers from some inherent defects. To improve on the original SGD algorithm, a parameter-update algorithm based on the Lagrange-type 1-step-ahead numerical differentiation method is therefore presented and validated. Intuitively, the new algorithm should fix some of the inherent flaws of the SGD algorithm. However, a series of experimental results shows that the Lagrange-type 1-step-ahead numerical differentiation method cannot be applied to reduce the computational error of SGD; moreover, it causes the model to diverge rather than converge. Finally, on the basis of comparative experiments, this divergence phenomenon is analyzed and explained.
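
The abstract identifies plain SGD with the forward Euler discretization of the gradient flow dθ/dt = −∇L(θ), i.e. θ_{k+1} = θ_k − η∇L(θ_k), but does not spell out the Lagrange-type 1-step-ahead rule itself. The sketch below is a minimal illustration under one plausible assumption: that the rule is the classical three-point (Lagrange interpolation) central-difference formula dθ_k/dt ≈ (θ_{k+1} − θ_{k−1}) / (2η), which, solved one step ahead along the gradient flow, yields the two-step update θ_{k+1} = θ_{k−1} − 2η∇L(θ_k). The toy quadratic problem, variable names, and step size are illustrative choices, not taken from the paper.

```python
import numpy as np

A = np.diag([1.0, 10.0])            # toy quadratic loss L(theta) = 0.5 * theta^T A theta
grad = lambda theta: A @ theta      # its exact gradient

eta = 0.05                                         # step size (assumed, for illustration)
theta_euler = np.array([1.0, 1.0])                 # forward-Euler / plain-SGD iterate
theta_prev = np.array([1.0, 1.0])                  # theta_{k-1} for the two-step rule
theta_curr = theta_prev - eta * grad(theta_prev)   # bootstrap theta_k with one Euler step

for _ in range(100):
    # Plain SGD, i.e. forward Euler on d(theta)/dt = -grad L(theta):
    #   theta_{k+1} = theta_k - eta * grad(theta_k)
    theta_euler = theta_euler - eta * grad(theta_euler)

    # ASSUMED Lagrange-type 1-step-ahead rule: three-point central difference
    # (theta_{k+1} - theta_{k-1}) / (2 * eta) = -grad L(theta_k), solved for theta_{k+1}:
    theta_next = theta_prev - 2.0 * eta * grad(theta_curr)
    theta_prev, theta_curr = theta_curr, theta_next

print("forward Euler (SGD):  ", np.linalg.norm(theta_euler))  # shrinks toward 0
print("Lagrange 1-step-ahead:", np.linalg.norm(theta_curr))   # grows without bound
```

On each scalar mode of this quadratic, the two-step recursion has characteristic equation r^2 + 2ηλ r − 1 = 0, whose roots have product −1, so one root always lies outside the unit circle. This well-known instability of the central-difference (leapfrog) scheme on dissipative gradient flow would be consistent with the non-convergence the authors report.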

Pages 485-490
DOI 10.1109/ICIST52614.2021.9440607
Language English
Journal 2021 11th International Conference on Information Science and Technology (ICIST)
