Fast alternating direction optimization methods

Tom Goldstein, Brendan O'Donoghue, Simon Setzer, Richard Baraniuk

Research output: Contribution to journal › Article › peer-review

516 Scopus citations

Abstract

Alternating direction methods are a common tool for general mathematical programming and optimization. These methods have become particularly important in the field of variational image processing, which frequently requires the minimization of nondifferentiable objectives. This paper considers accelerated (i.e., fast) variants of two common alternating direction methods: the alternating direction method of multipliers (ADMM) and the alternating minimization algorithm (AMA). The proposed acceleration is of the form first proposed by Nesterov for gradient descent methods. In the case that the objective function is strongly convex, global convergence bounds are provided for both classical and accelerated variants of the methods. Numerical examples are presented to demonstrate the superior performance of the fast methods for a wide variety of problems.
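To illustrate the flavor of the acceleration described in the abstract, the following is a minimal sketch (not the paper's exact algorithm) of ADMM with a Nesterov-style momentum step, applied to a lasso problem min ½‖Ax − b‖² + λ‖x‖₁ split as f(x) + g(z) with the constraint x = z. The problem choice, variable names, and parameter values are assumptions for illustration only; the paper itself also covers restart rules and the AMA variant not shown here.

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise shrinkage: the proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fast_admm_lasso(A, b, lam, rho=1.0, iters=500):
    """Sketch of Nesterov-accelerated ADMM for the lasso.

    Split min 0.5*||Ax - b||^2 + lam*||x||_1 as f(x) + g(z), x = z,
    and apply momentum to the z and dual (u) sequences.
    """
    m, n = A.shape
    # Factor (A^T A + rho I) once; it is reused in every x-update.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    z = np.zeros(n)
    u = np.zeros(n)
    z_hat, u_hat = z.copy(), u.copy()
    alpha = 1.0
    for _ in range(iters):
        # x-update: solve (A^T A + rho I) x = A^T b + rho (z_hat - u_hat).
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z_hat - u_hat)))
        # z-update: proximal step on the l1 term.
        z_new = soft_threshold(x + u_hat, lam / rho)
        # Dual (scaled multiplier) update.
        u_new = u_hat + x - z_new
        # Nesterov-style momentum: extrapolate z and u using the
        # classical alpha sequence alpha_{k+1} = (1 + sqrt(1 + 4 alpha_k^2)) / 2.
        alpha_new = (1.0 + np.sqrt(1.0 + 4.0 * alpha**2)) / 2.0
        w = (alpha - 1.0) / alpha_new
        z_hat = z_new + w * (z_new - z)
        u_hat = u_new + w * (u_new - u)
        z, u, alpha = z_new, u_new, alpha_new
    return z
```

For λ = 0 the proximal step is the identity, so the iterates should approach the ordinary least-squares solution; with larger λ the returned vector becomes sparse. In practice, the paper's restart strategy matters for weakly convex problems, where unguarded momentum can oscillate.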

Original language: English (US)
Pages (from-to): 1588-1623
Number of pages: 36
Journal: SIAM Journal on Imaging Sciences
Volume: 7
Issue number: 3
DOIs
State: Published - Aug 5 2014

Keywords

  • Accelerated
  • ADMM
  • Bregman
  • Method of multipliers
  • Nesterov
  • Optimization
  • Splitting

ASJC Scopus subject areas

  • Applied Mathematics
  • Mathematics (all)
