Scheduled Restart Momentum for Accelerated Stochastic Gradient Descent

Bao Wang, Tan Nguyen, Tao Sun, Andrea L. Bertozzi, Richard G. Baraniuk, Stanley J. Osher

Research output: Contribution to journal › Article › peer-review

17 Scopus citations

Abstract

Stochastic gradient descent (SGD) with constant momentum and its variants, such as Adam, are the optimization methods of choice for training deep neural networks (DNNs). There is great interest in speeding up the convergence of these methods due to their high computational expense. Nesterov accelerated gradient with a time-varying momentum (NAG) improves the convergence rate of gradient descent for convex optimization using a specially designed momentum; however, it accumulates error when the stochastic gradient is used, slowing convergence at best and diverging at worst. In this paper, we propose scheduled restart SGD (SRSGD), a new NAG-style scheme for training DNNs. SRSGD replaces the constant momentum in SGD with the increasing momentum in NAG but stabilizes the iterations by resetting the momentum to zero according to a schedule. Using a variety of models and benchmarks for image classification, we demonstrate that, in training DNNs, SRSGD significantly improves convergence and generalization; for instance, in training ResNet-200 for ImageNet classification, SRSGD achieves an error rate of 20.93% versus the benchmark of 22.13%. These improvements become more significant as the network grows deeper. Furthermore, on both CIFAR and ImageNet, SRSGD reaches similar or even better error rates with significantly fewer training epochs than the SGD baseline. Our implementation of SRSGD is available at https://github.com/minhtannguyen/SRSGD.
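
For illustration, the update described in the abstract can be sketched in a few lines of Python. This is a minimal sketch, not the authors' implementation (which is linked above): it assumes the standard NAG momentum schedule mu_k = k / (k + 3) and a fixed restart interval, and the function name, toy objective, and hyperparameter values are illustrative only.

import numpy as np

def srsgd_sketch(grad, x0, lr=0.1, restart_every=40, n_iters=200):
    """NAG-style updates with the momentum counter reset on a fixed schedule."""
    x = np.asarray(x0, dtype=float)
    v = x.copy()                # auxiliary (momentum) sequence
    k = 0                       # counter controlling the momentum size
    for _ in range(n_iters):
        mu = k / (k + 3.0)              # increasing NAG momentum; zero right after a restart
        v_next = x - lr * grad(x)       # (stochastic) gradient step
        x = v_next + mu * (v_next - v)  # momentum / look-ahead step
        v = v_next
        k += 1
        if k >= restart_every:          # scheduled restart: momentum back to zero
            k = 0
    return x

# Toy usage: minimize f(x) = 0.5 * ||x||^2 given a noisy gradient oracle.
rng = np.random.default_rng(0)
noisy_grad = lambda x: x + 0.01 * rng.standard_normal(x.shape)
print(srsgd_sketch(noisy_grad, x0=np.ones(5)))

Resetting the counter k to zero returns the momentum to zero, which is the mechanism the abstract credits with preventing the error accumulation of NAG under stochastic gradients.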

Original language: English (US)
Pages (from-to): 738-761
Number of pages: 24
Journal: SIAM Journal on Imaging Sciences
Volume: 15
Issue number: 2
DOIs
State: Published - 2022

Keywords

  • Nesterov accelerated gradient
  • deep learning
  • restart
  • stochastic optimization

ASJC Scopus subject areas

  • General Mathematics
  • Applied Mathematics
