Welcome to the homepage of the

Mathematical Optimization for Data Science Group

Department of Mathematics and Computer Science, Saarland University, Germany

Inertial Quasi-Newton Methods for Monotone Inclusion: Efficient Resolvent Calculus and Primal-Dual Methods

S. Wang, J. Fadili and P. Ochs

Abstract:
We introduce an inertial quasi-Newton Forward-Backward Splitting Algorithm to solve a class of monotone inclusion problems. While the inertial step is computationally cheap, the bottleneck is, in general, the evaluation of the resolvent operator. A change of metric makes its computation hard even for operators that are simple in the standard metric. To fully exploit the advantage of adapting the metric, we develop a new, efficient resolvent calculus for a low-rank perturbation of the standard metric, which covers quasi-Newton metrics exactly. Moreover, we prove convergence of our algorithms, including linear convergence rates when one of the two operators involved is strongly monotone. Beyond the general monotone inclusion setup, we instantiate a novel inertial quasi-Newton Primal-Dual Hybrid Gradient Method for solving saddle point problems. The favourable performance of our inertial quasi-Newton PDHG method is demonstrated in several numerical experiments in image processing.
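To illustrate the kind of scheme the abstract refers to, the sketch below shows a generic inertial forward-backward iteration in the standard metric, applied to a toy LASSO problem where the resolvent (proximal operator) of the l1 norm is cheap soft-thresholding. This is not the authors' algorithm (no quasi-Newton metric or resolvent calculus is involved); the problem data and the extrapolation parameter `beta` are hypothetical choices for illustration only.

```python
import numpy as np

def soft_threshold(v, t):
    """Resolvent (prox) of t * ||.||_1 in the standard Euclidean metric."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_fbs(A, b, lam, n_iter=500, beta=0.3):
    """Inertial forward-backward splitting for min_x 0.5||Ax-b||^2 + lam||x||_1
    with a constant extrapolation parameter beta (hypothetical choice)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2       # 1 / Lipschitz constant of grad f
    x_prev = x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        y = x + beta * (x - x_prev)              # inertial (extrapolation) step
        grad = A.T @ (A @ y - b)                 # forward (gradient) step on f
        x_prev, x = x, soft_threshold(y - step * grad, step * lam)  # backward step
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true                                   # noiseless data, sparse ground truth
x = inertial_fbs(A, b, lam=0.01)
```

For this well-conditioned toy instance the iterates approach the sparse ground truth up to the small bias introduced by the l1 penalty; the paper's contribution concerns the harder case where the backward step must be taken in a variable (quasi-Newton) metric.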
Latest update: 28.09.2022
Citation:
S. Wang, J. Fadili, P. Ochs:
Inertial Quasi-Newton Methods for Monotone Inclusion: Efficient Resolvent Calculus and Primal-Dual Methods. [pdf]
Technical Report, ArXiv e-prints, arXiv:2209.14019, 2022.
Bibtex:
@techreport{WFO22,
  title       = {Inertial Quasi-Newton Methods for Monotone Inclusion: Efficient Resolvent Calculus and Primal-Dual Methods},
  author      = {S. Wang and J. Fadili and P. Ochs},
  institution = {ArXiv e-prints},
  number      = {arXiv:2209.14019},
  year        = {2022},
}


MOP Group
©2017-2024
The author is not responsible for the content of external pages.