Abstract:
In this paper, we study convergence rates of the cubic regularized proximal quasi-Newton method (Cubic SR1 PQN) for solving non-smooth additive composite problems that satisfy the so-called Kurdyka-Łojasiewicz (KŁ) property with respect to some desingularization function, rather than strong convexity. After finitely many iterations, Cubic SR1 PQN exhibits a non-asymptotic explicit super-linear convergence rate. For functions that satisfy the Łojasiewicz inequality, the rate becomes global and non-asymptotic. This work presents, for the first time, non-asymptotic explicit convergence rates of regularized (proximal) SR1 quasi-Newton methods applied to non-convex non-smooth problems with the KŁ property; the rates are novel even in the smooth non-convex case. Notably, we achieve this without employing line search or trust region strategies, without assuming the Dennis-Moré condition, without any assumptions on the quasi-Newton metrics, and without assuming strong convexity. Furthermore, for convex problems, we focus on a more tractable gradient regularized quasi-Newton method (Grad SR1 PQN), which achieves results similar to those obtained with cubic regularization. We also demonstrate, for the first time, a non-asymptotic super-linear convergence rate of Grad SR1 PQN for solving convex problems with the help of the Łojasiewicz inequality instead of strong convexity.
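For context, the additive composite problem class and the KŁ property mentioned above can be sketched as follows; this is the standard formulation from the KŁ literature, not taken verbatim from the paper, so the paper's exact assumptions may differ:

```latex
% Additive composite problem (standard form; assumed setup):
%   f smooth, possibly non-convex; g proper, lower semicontinuous,
%   possibly non-smooth.
\min_{x \in \mathbb{R}^n} \; F(x) \coloneqq f(x) + g(x)

% Kurdyka-Łojasiewicz (KŁ) property at a critical point x^*:
% there exists a desingularization function \varphi
% (concave, \varphi(0) = 0, \varphi' > 0) such that, near x^*,
\varphi'\bigl(F(x) - F(x^*)\bigr)\,
  \operatorname{dist}\bigl(0, \partial F(x)\bigr) \;\ge\; 1.

% The Łojasiewicz inequality is the special case
% \varphi(s) = c\, s^{1-\theta} with exponent \theta \in [0,1):
\operatorname{dist}\bigl(0, \partial F(x)\bigr)
  \;\ge\; c'\,\bigl(F(x) - F(x^*)\bigr)^{\theta}.
```

This property replaces strong convexity as the regularity assumption driving the convergence rates.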
Bibtex: @techreport{WFO25b,
title = {Convergence rates of regularized quasi-Newton methods without strong convexity},
author = {S. Wang and J. Fadili and P. Ochs},
year = {2025},
institution = {ArXiv e-prints},
number = {arXiv:2506.00521},
}