Discrete Dynamics in Nature and Society | 2019

General Six-Step Discrete-Time Zhang Neural Network for Time-Varying Tensor Absolute Value Equations

 
 

Abstract


This article presents a general six-step discrete-time Zhang neural network (ZNN) for solving time-varying tensor absolute value equations. Firstly, based on Taylor expansion theory, we derive a general Zhang et al. discretization (ZeaD) formula, i.e., a general Taylor-type one-step-ahead numerical differentiation rule for first-order derivative approximation, which contains two free parameters. Based on the bilinear transform and the Routh–Hurwitz stability criterion, the effective domain of the two free parameters that guarantees the convergence of the general ZeaD formula is analyzed. Secondly, based on the general ZeaD formula, we design a general six-step discrete-time ZNN (DTZNN) model for time-varying tensor absolute value equations (TVTAVEs), whose steady-state residual error is of higher order in the sampling gap than those of the models presented in the literature. Meanwhile, the feasible region of its step size, which determines its convergence, is also studied. Finally, experimental results corroborate that the general six-step DTZNN model is quite efficient for TVTAVE solving.
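To illustrate the ZNN design procedure the abstract refers to, the following minimal Python sketch applies a continuous ZNN model with a one-step (Euler-type) ZeaD discretization to a simplified time-varying *matrix* absolute value equation A(t)x(t) - |x(t)| = b(t). It is not the paper's six-step tensor model: the problem form, the test functions A(t) and b(t), and the parameters tau and gamma are illustrative assumptions chosen only to show how the Zhang error function, its Jacobian, and the discretized update fit together.

```python
import numpy as np

# Illustrative one-step (Euler-type) DTZNN for a time-varying matrix
# absolute value equation A(t) x - |x| = b(t).  A(t), b(t), tau, and gamma
# below are hypothetical choices, not taken from the paper.

def A(t):
    # Hypothetical coefficient matrix, diagonally dominant so that
    # A(t) - diag(sign(x)) stays invertible along the iteration.
    return np.array([[4.0 + np.sin(t), 0.5],
                     [0.5, 4.0 + np.cos(t)]])

def b(t):
    # Hypothetical time-varying right-hand side.
    return np.array([np.sin(2 * t), np.cos(2 * t)])

def residual(x, t):
    # Zhang error function E(x, t) = A(t) x - |x| - b(t).
    return A(t) @ x - np.abs(x) - b(t)

tau = 1e-3     # sampling gap
gamma = 10.0   # ZNN design parameter; effective step size h = tau * gamma
T = 5.0
steps = int(T / tau)

x = np.array([1.0, 1.0])  # initial state
for k in range(steps):
    t = k * tau
    # Jacobian of E with respect to x: A(t) - diag(sign(x)).
    J = A(t) - np.diag(np.sign(x))
    # Time derivatives of A and b approximated by forward differences.
    dA = (A(t + tau) - A(t)) / tau
    db = (b(t + tau) - b(t)) / tau
    # ZNN design formula: J(x,t) x_dot = -gamma E(x,t) - (dA/dt x - db/dt).
    x_dot = np.linalg.solve(J, -gamma * residual(x, t) - (dA @ x - db))
    # One-step (Euler) ZeaD discretization; the paper's model replaces this
    # with a six-step ZeaD formula to obtain a higher-order residual error.
    x = x + tau * x_dot

print("final residual norm:", np.linalg.norm(residual(x, T)))
```

In this sketch the residual norm decays as the iteration tracks the time-varying solution; a multi-step ZeaD formula such as the six-step rule studied in the article would replace the final Euler update to raise the order of the steady-state residual error with respect to the sampling gap tau.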

Volume 2019
Pages 1-12
DOI 10.1155/2019/4861912
Language English
Journal Discrete Dynamics in Nature and Society

Full Text