Mathematical Optimization for Data Science Group

Department of Mathematics and Computer Science, Saarland University, Germany

A Quasi-Newton Primal-Dual Algorithm with Line Search

S. Wang, J. Fadili and P. Ochs

Abstract:
Quasi-Newton methods are a class of algorithms at the interface between first- and second-order methods. They aim to make as much progress per iteration as second-order methods while retaining the computational complexity of first-order methods. The approximation of second-order information by first-order derivatives can be expressed as adopting a variable metric, which for (limited memory) quasi-Newton methods is of the type “identity ± low rank”. This paper continues the effort to make these powerful methods available for non-smooth problems, as they occur, for example, in large-scale machine learning applications, by exploiting this special structure. We develop a line search variant of a recently introduced quasi-Newton primal-dual algorithm, which adds significant flexibility, admits larger steps per iteration, and circumvents the complicated precalculation of a certain operator norm. We prove convergence, including convergence rates, for the proposed method, and demonstrate that it outperforms related algorithms in a large-scale image deblurring application.
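
The following is a minimal, illustrative sketch, not the authors' implementation: the abstract points out that (limited memory) quasi-Newton metrics have the special structure “identity ± low rank”. For a rank-one modification M = I + γ·u·uᵀ, the Sherman–Morrison formula lets one apply M⁻¹ to a vector in O(n) rather than O(n³), which is the kind of structure exploitation that keeps such variable-metric proximal and primal-dual steps cheap. The function name and the NumPy-based setup below are hypothetical choices for illustration only.

```python
# Illustrative sketch (hypothetical, not the paper's code): applying the inverse of an
# "identity + rank-one" quasi-Newton metric M = I + gamma * u u^T via Sherman-Morrison.
import numpy as np

def apply_inverse_identity_plus_rank_one(u, gamma, v):
    """Return (I + gamma * u u^T)^{-1} v in O(n) using the Sherman-Morrison formula."""
    # (I + gamma u u^T)^{-1} = I - (gamma / (1 + gamma * u^T u)) * u u^T
    coeff = gamma / (1.0 + gamma * u.dot(u))
    return v - coeff * u * u.dot(v)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 1000
    u = rng.standard_normal(n)
    v = rng.standard_normal(n)
    gamma = 0.5  # any gamma > -1 / ||u||^2 keeps the metric positive definite

    fast = apply_inverse_identity_plus_rank_one(u, gamma, v)
    # Dense O(n^3) reference solve, only for checking the fast formula.
    dense = np.linalg.solve(np.eye(n) + gamma * np.outer(u, u), v)
    print(np.max(np.abs(fast - dense)))  # agrees up to floating-point error
```

The same idea extends to “identity ± rank-r” metrics via the Sherman–Morrison–Woodbury identity, so each variable-metric step stays at first-order cost, as the abstract describes.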
Citation:
S. Wang, J. Fadili, P. Ochs:
A Quasi-Newton Primal-Dual Algorithm with Line Search.
International Conference on Scale Space and Variational Methods in Computer Vision (SSVM). Lecture Notes in Computer Science, Springer, 2023.
Bibtex:
@inproceedings{WFO23,
  title        = {A Quasi-Newton Primal-Dual Algorithm with Line Search},
  author       = {S. Wang and J. Fadili and P. Ochs},
  year         = {2023},
  booktitle    = {International Conference on Scale Space and Variational Methods in Computer Vision (SSVM)},
  series       = {Lecture Notes in Computer Science},
  publisher    = {Springer},
}

