Mathematical Programming | 2021

Adaptive sampling line search for local stochastic optimization with integer variables


Abstract


We consider optimization problems with an objective function that is estimable using a Monte Carlo oracle, constraint functions that are known deterministically through a constraint-satisfaction oracle, and integer decision variables. Seeking an appropriately defined local minimum, we propose an iterative adaptive sampling algorithm that, during each iteration, performs a statistical local optimality test followed by a line search when the test detects a stochastic descent direction. We prove three main results. First, the true function values at the iterates generated by the algorithm form an almost-supermartingale process, and the iterates are absorbed with probability one into the set of local minima in finite time. Second, such absorption happens exponentially fast in iteration number and in oracle calls. This result is analogous to non-standard rate guarantees in stochastic continuous optimization contexts that involve sharp minima. Third, the oracle complexity of the proposed algorithm increases linearly in the dimensionality of the local neighborhood. As a solver, the proposed algorithm is effective on a variety of problems, primarily because it combines line searches that use common random numbers with statistical tests for local optimality. We illustrate this performance using three problem suites, with problem dimensions ranging from 25 to 200.
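
To make the iteration structure described above concrete, here is a minimal Python sketch written for this summary; it is not the authors' pseudocode, and the Monte Carlo oracle noisy_f, the constraint oracle feasible, the +/-1 coordinate neighborhood, the linearly growing sample-size schedule, and the normal-approximation significance test are all illustrative assumptions. Each iteration draws one set of common random numbers, estimates the objective at the current point and its feasible integer neighbors, tests for a statistically significant descent direction, and, if one is detected, runs a step-doubling line search along it.

import math
import random


def adaptive_sampling_line_search(x0, noisy_f, feasible, max_iter=100,
                                  n0=10, growth=5, z=1.96, seed=0):
    """Schematic sketch only; NOT the paper's algorithm specification.

    Each iteration draws a common-random-number (CRN) sample, estimates the
    objective at the current point and its +/-1 coordinate neighbors, runs a
    normal-approximation test for a statistically significant descent
    direction, and, if one is found, performs a step-doubling integer line
    search along it using the same CRN sample.
    """
    rng = random.Random(seed)
    x = tuple(x0)
    dim = len(x)
    # +/- unit coordinate directions defining the local integer neighborhood
    dirs = [tuple((1 if j == i else 0) for j in range(dim)) for i in range(dim)]
    dirs += [tuple(-dj for dj in d) for d in dirs]

    for k in range(max_iter):
        n = n0 + growth * k                       # adaptive (growing) sample size
        crn = [rng.random() for _ in range(n)]    # common random numbers for this iteration

        def mean_and_se(y):
            vals = [noisy_f(y, u) for u in crn]
            m = sum(vals) / n
            var = sum((v - m) ** 2 for v in vals) / max(n - 1, 1)
            return m, math.sqrt(var / n)

        fx, se_x = mean_and_se(x)

        # statistical local-optimality test over the feasible neighbors
        best_dir, best_val = None, fx
        for d in dirs:
            y = tuple(xi + di for xi, di in zip(x, d))
            if not feasible(y):
                continue
            fy, se_y = mean_and_se(y)
            significant = fx - fy > z * math.sqrt(se_x ** 2 + se_y ** 2)
            if significant and fy < best_val:
                best_dir, best_val = d, fy
        if best_dir is None:
            return x                              # test declares x a local minimum

        # integer line search along best_dir with step doubling, same CRN
        step = 1
        x_new = tuple(xi + di for xi, di in zip(x, best_dir))
        f_new = best_val
        while step < 2 ** 20:
            cand = tuple(xi + 2 * step * di for xi, di in zip(x, best_dir))
            if not feasible(cand):
                break
            f_cand, _ = mean_and_se(cand)
            if f_cand >= f_new:
                break
            step, x_new, f_new = 2 * step, cand, f_cand
        x = x_new
    return x


if __name__ == "__main__":
    # toy usage: minimize E[(x1 - 3)^2 + (x2 + 1)^2 + noise] over an integer box
    noisy = lambda y, u: (y[0] - 3) ** 2 + (y[1] + 1) ** 2 + (u - 0.5)
    box = lambda y: all(-10 <= yi <= 10 for yi in y)
    print(adaptive_sampling_line_search((0, 0), noisy, box))

In this sketch the common random numbers are reused across every comparison made within an iteration, so differences between candidate points are estimated with positively correlated noise; that is the usual motivation for pairing CRN with statistical comparisons, though the paper's actual sampling and testing rules may differ.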

DOI 10.1007/s10107-021-01667-6
Language English
Journal Mathematical Programming
