Welcome to the homepage of the

Mathematical Optimization for Data Science Group

Department of Mathematics and Computer Science, Saarland University, Germany

Global non-asymptotic super-linear convergence rates of regularized proximal quasi-Newton methods on non-smooth composite problems

S. Wang, J. Fadili and P. Ochs

Abstract:
In this paper, we propose two regularized proximal quasi-Newton methods with a symmetric rank-1 update of the metric (SR1 quasi-Newton) for solving non-smooth convex additive composite problems. Both algorithms avoid line search and trust-region strategies. For each of them, we prove a super-linear convergence rate that is independent of the initialization of the algorithm. The cubic regularized method achieves a rate of order (C/sqrt(N))^(N/2), where N is the number of iterations and C is some constant, and the gradient regularized method achieves a rate of order (C/N^(1/4))^(N/2). To the best of our knowledge, these are the first global non-asymptotic super-linear convergence rates for regularized quasi-Newton methods and regularized proximal quasi-Newton methods. The theoretical properties are also demonstrated in two applications from machine learning.
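The gradient regularized variant admits a particularly simple description: before each step, the quasi-Newton metric is shifted by a multiple of the square root of the current gradient norm. Below is a minimal sketch of this idea in the smooth case (g = 0), assuming a standard SR1 update with the usual skipping safeguard. It illustrates the general technique only; it is not the algorithm analyzed in the paper, and the function names, test problem, and constant M are hypothetical choices for demonstration.

```python
# Minimal sketch, assuming g = 0 (smooth case): a gradient-regularized SR1
# quasi-Newton iteration. Illustrative only, not the paper's algorithm.
import numpy as np

def sr1_update(B, s, y, tol=1e-8):
    """Symmetric rank-1 update of the metric B from step s and gradient
    difference y; the update is skipped when the denominator is too small."""
    r = y - B @ s
    denom = r @ s
    if abs(denom) > tol * np.linalg.norm(r) * np.linalg.norm(s):
        B = B + np.outer(r, r) / denom
    return B

def gradient_regularized_sr1(grad, x0, M=1.0, max_iter=100, tol=1e-10):
    """Iterate x+ = x - (B + sqrt(M * ||grad f(x)||) I)^{-1} grad f(x),
    updating B with SR1 after every step."""
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    B = np.eye(n)                                   # initial metric
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        lam = np.sqrt(M * np.linalg.norm(g))        # gradient regularization
        d = np.linalg.solve(B + lam * np.eye(n), -g)
        x_new = x + d
        g_new = grad(x_new)
        B = sr1_update(B, x_new - x, g_new - g)
        x, g = x_new, g_new
    return x

# Toy usage: minimize the strongly convex quadratic 0.5 x^T A x - b^T x.
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 2.0, 3.0])
x_star = gradient_regularized_sr1(lambda x: A @ x - b, np.zeros(3))
print(np.allclose(A @ x_star, b, atol=1e-6))        # True: A x* = b
```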
Latest update: 15.10.2024
Citation:
S. Wang, J. Fadili, P. Ochs:
Global non-asymptotic super-linear convergence rates of regularized proximal quasi-Newton methods on non-smooth composite problems.
Technical Report, ArXiv e-prints, arXiv:2410.11676, 2024.
Bibtex:
@techreport{WFO24,
  title        = {Global non-asymptotic super-linear convergence rates of regularized proximal quasi-Newton methods on non-smooth composite problems},
  author       = {S. Wang and J. Fadili and P. Ochs},
  year         = {2024},
  institution  = {ArXiv e-prints},
  number       = {arXiv:2410.11676},
}

