Neural Networks | 2019

Fully complex conjugate gradient-based neural networks using Wirtinger calculus framework: Deterministic convergence and its application

Abstract


The conjugate gradient method has proven to be an effective strategy for training neural networks owing to its low memory requirements and fast convergence. In this paper, we propose an efficient conjugate gradient method for training fully complex-valued network models based on the Wirtinger differential operator. Two techniques are adopted to enhance training performance. The first constructs a sufficient descent direction during training by designing a fine-tuned conjugate coefficient. The second determines an optimal learning rate at each iteration, rather than using a fixed constant, by employing a generalized Armijo search. In addition, we rigorously prove both weak and strong convergence results: the gradient norms of the objective function with respect to the weights approach zero as the iterations increase, and the weight sequence converges to an optimal point. To verify the effectiveness and rationality of the proposed method, four illustrative simulations have been performed on typical regression and classification problems.
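The abstract describes the training scheme only at a high level; the paper's specific conjugate coefficient and network architecture are not reproduced here. The following is a minimal, hypothetical NumPy sketch of the ingredients the abstract names: Wirtinger-calculus gradients for a real loss of complex weights (where the steepest descent direction is -∂E/∂w̄), a conjugate gradient update with a sufficient-descent safeguard, and a backtracking generalized Armijo search. A Fletcher-Reeves coefficient stands in for the paper's fine-tuned one, and a plain complex linear model stands in for the fully complex-valued network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy complex-valued regression data: y = X @ w_true + small noise.
N, dim = 200, 5
X = rng.standard_normal((N, dim)) + 1j * rng.standard_normal((N, dim))
w_true = rng.standard_normal(dim) + 1j * rng.standard_normal(dim)
y = X @ w_true + 0.01 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))

def loss(w):
    # Real-valued squared error E(w) = (1/2N) * sum |X w - y|^2.
    e = X @ w - y
    return np.real(np.vdot(e, e)) / (2 * N)

def wirtinger_grad(w):
    # Conjugate cogradient dE/d(conj(w)); -g is the steepest descent direction.
    e = X @ w - y
    return X.conj().T @ e / (2 * N)

def generalized_armijo(w, d, g, eta0=1.0, rho=0.5, sigma=1e-4, max_back=30):
    # Backtracking Armijo search: shrink eta until sufficient decrease holds.
    # For a real loss of complex weights, the slope along d is 2*Re(g^H d).
    slope = 2.0 * np.real(np.vdot(g, d))
    E0, eta = loss(w), eta0
    for _ in range(max_back):
        if loss(w + eta * d) <= E0 + sigma * eta * slope:
            break
        eta *= rho
    return eta

w = np.zeros(dim, dtype=complex)
g = wirtinger_grad(w)
d = -g                                   # first direction: steepest descent
for k in range(200):
    if np.linalg.norm(g) < 1e-10:        # weak-convergence stopping criterion
        break
    eta = generalized_armijo(w, d, g)
    w = w + eta * d
    g_new = wirtinger_grad(w)
    # Fletcher-Reeves coefficient as a stand-in for the paper's tuned beta.
    beta = np.real(np.vdot(g_new, g_new)) / np.real(np.vdot(g, g))
    d = -g_new + beta * d
    # Sufficient-descent safeguard: restart if d fails to point downhill.
    if 2.0 * np.real(np.vdot(g_new, d)) >= 0.0:
        d = -g_new
    g = g_new

print(f"final loss {loss(w):.3e}, ||w - w_true|| = {np.linalg.norm(w - w_true):.3e}")
```

On this toy problem the gradient norm is driven toward zero and w toward w_true, mirroring in miniature the weak and strong convergence behavior the paper proves for its full method.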

Volume 115
Pages 50-64
DOI 10.1016/j.neunet.2019.02.011
Language English
Journal Neural Networks: The Official Journal of the International Neural Network Society
