Journal of Global Optimization | 2021

Resolving learning rates adaptively by locating stochastic non-negative associated gradient projection points using line searches


Abstract


Learning rates in stochastic neural network training are currently determined prior to training, using expensive manual or automated iterative tuning. Attempts to resolve learning rates adaptively using line searches have proven computationally demanding. Reducing this computational cost by applying mini-batch sub-sampling (MBSS) introduces challenges: depending on the MBSS approach, the significant variance in information between batches may present as discontinuities in the loss function. This study proposes a robust approach for adaptively resolving learning rates in dynamic MBSS loss functions. This is achieved by locating sign changes from negative to positive in directional derivatives along the search direction, which converge to a stochastic non-negative associated gradient projection point (SNN-GPP). Through a number of investigative studies, we demonstrate that gradient-only line searches (GOLS) resolve learning rates adaptively, improve convergence performance over minimization line searches, ignore certain local minima, and eliminate an otherwise expensive hyperparameter. We also show that poor search directions may benefit computationally from overstepping optima along a descent direction, and that this can be remedied by using improved search directions. Establishing GOLS as a reliable line search method in turn enables comparative investigations between static and dynamic MBSS.
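To illustrate the principle, the following Python fragment is a minimal sketch, not the authors' GOLS implementation: the names `grad_fn`, `dir_deriv`, and `gols_bisection`, as well as the tolerances and bracketing strategy, are hypothetical. It locates a step size at which the mini-batch directional derivative changes sign from negative to positive, bracketing and bisecting on derivative signs rather than on loss values. Under dynamic MBSS, `grad_fn` would re-sample a new mini-batch at every call, so the located sign change approximates an SNN-GPP rather than a deterministic minimizer.

```python
import numpy as np

def dir_deriv(grad_fn, x, d, alpha):
    # Directional derivative of the loss along d at step size alpha.
    # Under dynamic MBSS, grad_fn re-samples a new mini-batch per call.
    return float(np.dot(grad_fn(x + alpha * d), d))

def gols_bisection(grad_fn, x, d, alpha_init=1.0, tol=1e-6, max_iter=50):
    """Hypothetical sketch of a gradient-only line search: locate a
    sign change (negative -> positive) in the directional derivative.
    Assumes d is a descent direction, i.e. dir_deriv(..., 0) < 0."""
    lo, hi = 0.0, alpha_init
    # Grow the bracket until the directional derivative turns non-negative.
    while dir_deriv(grad_fn, x, d, hi) < 0.0:
        lo, hi = hi, 2.0 * hi
        if hi > 1e10:  # safeguard against unbounded descent
            return hi
    # Bisect on the derivative sign only; loss values are never compared,
    # so discontinuities in the loss cannot mislead the search.
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        if dir_deriv(grad_fn, x, d, mid) < 0.0:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Toy usage on a deterministic quadratic: f(x) = 0.5 * ||x||^2, grad = x.
if __name__ == "__main__":
    grad_fn = lambda x: x            # stands in for a mini-batch gradient
    x = np.array([3.0, -2.0])
    d = -grad_fn(x)                  # steepest-descent direction
    alpha = gols_bisection(grad_fn, x, d)
    print(alpha)                     # converges to ~1.0, the exact minimizer step
```

Because only derivative signs are queried, such a search tolerates the point-wise discontinuities that dynamic MBSS induces in the loss, which is the robustness property the abstract attributes to GOLS.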

Volume 79
Pages 111-152
DOI 10.1007/s10898-020-00921-z
Language English
Journal Journal of Global Optimization
