Welcome to the homepage of the

Mathematical Optimization for Data Science Group

Department of Mathematics and Computer Science, Saarland University, Germany

Adaptive Fista for Non-convex Optimization

P. Ochs and T. Pock

Abstract:
In this paper we propose an adaptively extrapolated proximal gradient method, which is based on the accelerated proximal gradient method (also known as FISTA); however, we locally optimize the extrapolation parameter by carrying out an exact (or inexact) line search. It turns out that, in some situations, the proposed algorithm is equivalent to a class of SR1 (identity minus rank 1) proximal quasi-Newton methods. Convergence is proved in a general non-convex setting; hence, as a byproduct, we also obtain new convergence guarantees for proximal quasi-Newton methods. The efficiency of the new method is demonstrated in numerical experiments on a sparsity-regularized non-linear inverse problem.
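To illustrate the idea in the abstract, the following is a minimal sketch (not the paper's exact method) of an extrapolated proximal gradient iteration for a sparsity-regularized least-squares problem, where the extrapolation parameter is chosen per iteration by an inexact line search over a candidate grid rather than by the classical FISTA rule. All names, the quadratic data term, and the grid search are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def adaptive_extrapolated_prox_grad(A, b, lam, n_iter=300):
    """Sketch: proximal gradient for min 0.5*||A x - b||^2 + lam*||x||_1,
    with the extrapolation parameter beta picked each iteration by an
    inexact (grid) line search on the objective value -- a stand-in for
    the exact/inexact line search described in the paper."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the smooth part's gradient
    obj = lambda z: 0.5 * np.sum((A @ z - b) ** 2) + lam * np.sum(np.abs(z))
    x_prev = np.zeros(A.shape[1])
    x = x_prev.copy()
    betas = np.linspace(0.0, 1.0, 11)        # hypothetical candidate extrapolation parameters
    for _ in range(n_iter):
        best_val, best_x = np.inf, None
        for beta in betas:
            y = x + beta * (x - x_prev)                    # extrapolated point
            grad = A.T @ (A @ y - b)                       # gradient of the smooth part at y
            cand = soft_threshold(y - grad / L, lam / L)   # forward-backward (prox-gradient) step
            val = obj(cand)
            if val < best_val:
                best_val, best_x = val, cand
        x_prev, x = x, best_x
    return x
```

In FISTA the extrapolation parameter follows a fixed schedule such as (t_k - 1)/t_{k+1}; the grid search above mimics, in a crude way, optimizing that parameter locally, which is the adaptivity the abstract refers to.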
Latest update: 30.06.2019
Citation:
P. Ochs, T. Pock:
Adaptive Fista for Non-convex Optimization.
SIAM Journal on Optimization, 29(4):2482-2503, 2019.
Bibtex:
@article{OP19,
  title        = {Adaptive Fista for Non-convex Optimization},
  author       = {P. Ochs and T. Pock},
  year         = {2019},
  journal      = {SIAM Journal on Optimization},
  number       = {4},
  volume       = {29},
  pages        = {2482--2503}
}


MOP Group ©2017-2024
The author is not responsible for the content of external pages.