J. Optim. Theory Appl. | 2019

An Infeasible Stochastic Approximation and Projection Algorithm for Stochastic Variational Inequalities

Abstract
In this paper, we consider a stochastic variational inequality in which the mapping involved is the expectation of a given random function. Inspired by the work of He (Appl Math Optim 35:69–76, 1997) and the extragradient method proposed by Iusem et al. (SIAM J Optim 29:175–206, 2019), we propose an infeasible projection algorithm with a line search scheme, which can be viewed as a modification of the above-mentioned method of Iusem et al. In particular, in the correction step we replace the projection by the computation of a search direction and a stepsize; that is, our method needs only one projection per iteration, whereas the method of Iusem et al. requires two. Moreover, we use a dynamic sampling scheme with line search to cope with the absence of a Lipschitz constant, choosing the stepsize to be bounded away from zero and the direction to be a descent direction. In the stochastic approximation process, we iteratively reduce the variance of the stochastic error. Under appropriate assumptions, we establish convergence, a convergence rate, and oracle complexity results. In particular, compared with the method of Iusem et al., our method uses fewer projections and has the same iteration complexity, at the cost of a higher oracle complexity for a given tolerance in a finite-dimensional space. Finally, we report numerical experiments that demonstrate its efficiency.
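To make the general template concrete, the following is a minimal sketch of a stochastic approximation and projection iteration of the kind the abstract describes: a single projection per iteration, a stepsize bounded away from zero, and a dynamic sampling scheme whose growing mini-batch size reduces the variance of the stochastic error. The feasible set (the nonnegative orthant), the affine random mapping, and the constant stepsize here are illustrative assumptions for a strongly monotone problem; they are not the authors' actual algorithm, which additionally uses a line search and a correction step with a computed search direction.

```python
import numpy as np

def project(x):
    # Projection onto the nonnegative orthant (illustrative feasible set C).
    return np.maximum(x, 0.0)

def F_sample(x, rng, batch):
    # Stochastic oracle for F(x) = E[x - b + xi], with zero-mean noise xi.
    # Averaging a mini-batch of size `batch` shrinks the variance of the
    # estimate, mimicking the dynamic sampling idea.
    b = np.array([1.0, -1.0])          # illustrative problem data
    noise = rng.normal(0.0, 0.1, size=(batch, x.size)).mean(axis=0)
    return x - b + noise

def stochastic_projection_method(x0, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    x = project(np.asarray(x0, dtype=float))
    for k in range(iters):
        batch = k + 1                  # dynamic sampling: batch grows with k
        step = 0.5                     # constant stepsize, bounded away from 0
        g = F_sample(x, rng, batch)
        x = project(x - step * g)      # one projection per iteration
    return x
```

For this toy problem the solution of VI(F, C) is x* = (1, 0): the first coordinate of b is attainable in C, the second is clipped at the boundary. Running `stochastic_projection_method([5.0, 5.0])` returns an iterate close to that point, since the growing batches drive the stochastic error toward zero.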

Volume 183
Pages 1053-1076
DOI 10.1007/s10957-019-01578-9
Language English
Journal J. Optim. Theory Appl.
