The MathNet Korea
Information Center for Mathematical Science

PAC

Adaptive Restart for Accelerated Gradient Schemes
Author Emmanuel Candes (Stanford University)
Homepage Url http://www-stat.stanford.edu/~candes/publications.html
Coauthors Brendan O’Donoghue
Abstract In this paper we demonstrate a simple heuristic adaptive restart technique that can dramatically improve the convergence rate of accelerated gradient schemes. The analysis of the technique relies on the observation that these schemes exhibit two modes of behavior depending on how much momentum is applied. In what we refer to as the 'high momentum' regime, the iterates generated by an accelerated gradient scheme exhibit periodic behavior, where the period is proportional to the square root of the local condition number of the objective function. This suggests a restart technique whereby we reset the momentum whenever we observe periodic behavior. We provide analysis to show that in many cases adaptive restarting allows us to recover the optimal rate of convergence with no prior knowledge of function parameters.
Abstract Url http://www-stat.stanford.edu/~candes/papers/adap_restart_paper.pdf
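A minimal sketch of the kind of heuristic the abstract describes: Nesterov-style accelerated gradient descent whose momentum is reset whenever the objective value increases (one of the restart conditions discussed in this line of work). The function name `accel_grad_restart`, the FISTA-style momentum sequence, and the fixed step size are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def accel_grad_restart(grad, f, x0, step, iters=500):
    """Accelerated gradient descent with a function-value adaptive restart.

    grad : callable returning the gradient of the objective at a point
    f    : callable returning the objective value (used only for the
           restart test; a hypothetical interface, assumed for this sketch)
    x0   : starting point
    step : fixed step size, e.g. 1/L for an L-smooth objective
    """
    x = np.asarray(x0, dtype=float)
    y = x
    theta = 1.0  # momentum parameter; theta = 1 means no momentum
    for _ in range(iters):
        x_new = y - step * grad(y)
        # FISTA-style update of the momentum sequence
        theta_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * theta**2))
        y = x_new + ((theta - 1.0) / theta_new) * (x_new - x)
        # adaptive restart: if the objective increased, reset the momentum,
        # discarding the accumulated extrapolation
        if f(x_new) > f(x):
            theta_new = 1.0
            y = x_new
        x, theta = x_new, theta_new
    return x
```

On an ill-conditioned quadratic, resetting the momentum this way suppresses the oscillations the abstract attributes to the 'high momentum' regime, without requiring the strong-convexity parameter in advance.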