Abstract:
In this paper, we propose two regularized proximal quasi-Newton methods with a symmetric rank-1 (SR1) update of the metric to solve non-smooth convex additive composite problems. Both algorithms avoid line search and trust-region strategies. For each of them, we prove a super-linear convergence rate that is independent of the initialization of the algorithm. The cubic regularized method achieves a rate of order (C/sqrt(N))^(N/2), where N is the number of iterations and C is some constant, and the gradient regularized method achieves a rate of order (C/N^(1/4))^(N/2). To the best of our knowledge, these are the first global non-asymptotic super-linear convergence rates for regularized quasi-Newton methods and regularized proximal quasi-Newton methods. The theoretical properties are also demonstrated in two applications from machine learning.
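For readability, here is a short LaTeX sketch of the problem class and the two rates quoted above; the notation (F, f, g, x_N) and the use of the objective gap as the measured quantity are illustrative assumptions of the standard additive composite setting, not details fixed by the abstract itself.

\[
  \min_{x \in \mathbb{R}^n} \; F(x) := f(x) + g(x),
  \qquad f \ \text{smooth}, \quad g \ \text{non-smooth and convex},
\]
and, after $N$ iterations,
\[
  F(x_N) - \min F \;\lesssim\; \Big(\tfrac{C}{\sqrt{N}}\Big)^{N/2}
  \ \ \text{(cubic regularization)},
  \qquad
  F(x_N) - \min F \;\lesssim\; \Big(\tfrac{C}{N^{1/4}}\Big)^{N/2}
  \ \ \text{(gradient regularization)}.
\]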
Bibtex: @techreport{WFO24,
  title = {Global non-asymptotic super-linear convergence rates of regularized proximal quasi-Newton methods on non-smooth composite problems},
  author = {S. Wang and J. Fadili and P. Ochs},
  year = {2024},
  institution = {ArXiv e-prints},
  number = {arXiv:2410.11676},
}