Welcome to the homepage of the

Mathematical Optimization for Data Science Group

Department of Mathematics and Computer Science, Saarland University, Germany

Accelerated Gradient Dynamics on Riemannian Manifolds: Faster Rate and Trajectory Convergence

T. Natu, C. Castera, J. Fadili and P. Ochs

Abstract:
To minimize a differentiable geodesically convex function, we study a second-order dynamical system on Riemannian manifolds with an asymptotically vanishing damping term of the form α/t. For positive values of α, we derive convergence rates for the objective values and convergence of the trajectory, emphasizing the crucial role of the curvature of the manifold in distinguishing the modes of convergence. There is a clear correspondence to the results known in the Euclidean case. When α is larger than a certain constant that depends on the curvature of the manifold, we improve the convergence rate of the objective values compared to the previously known rate and prove that the trajectory of the dynamical system converges to an element of the set of minimizers. For α smaller than this curvature-dependent constant, the best known sub-optimal rates for the objective values and the trajectory are transferred to the Riemannian setting. We present computational experiments that corroborate our theoretical results.
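The dynamics studied in the paper reduce, on a flat (zero-curvature) manifold, to the classical damped ODE x''(t) + (α/t) x'(t) + ∇f(x(t)) = 0. As a minimal illustrative sketch (not the authors' implementation), the Euclidean special case can be simulated with an explicit Euler discretization; the objective, step size, horizon, and starting time t0 = 1 below are illustrative choices, not taken from the paper.

```python
import numpy as np

def accelerated_gradient_ode(grad, x0, alpha=3.0, dt=1e-3, t0=1.0, T=20.0):
    """Explicit Euler discretization of the damped ODE
        x''(t) + (alpha/t) x'(t) + grad f(x(t)) = 0,
    the Euclidean (zero-curvature) special case of the Riemannian dynamics.
    Integration starts at t0 > 0 so the damping term alpha/t is finite."""
    t = t0
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)          # initial velocity x'(t0) = 0
    traj = [x.copy()]
    while t < T:
        a = -(alpha / t) * v - grad(x)   # acceleration prescribed by the ODE
        v = v + dt * a
        x = x + dt * v
        t += dt
        traj.append(x.copy())
    return np.array(traj)

# Example: quadratic objective f(x) = 0.5 * ||x||^2 with gradient x.
traj = accelerated_gradient_ode(lambda x: x, x0=[2.0, -1.0], alpha=3.0)
```

For this strongly convex test function the trajectory oscillates toward the minimizer 0 with decaying amplitude, consistent with the vanishing-damping behavior the abstract describes.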
Latest update: 11.12.2023
Citation:
T. Natu, C. Castera, J. Fadili, P. Ochs:
Accelerated Gradient Dynamics on Riemannian Manifolds: Faster Rate and Trajectory Convergence. [pdf]
Technical Report, ArXiv e-prints, arXiv:2312.06366, 2023.
Bibtex:
@techreport{NCFO23,
  title        = {Accelerated Gradient Dynamics on Riemannian Manifolds: Faster Rate and Trajectory Convergence},
  author       = {T. Natu and C. Castera and J. Fadili and P. Ochs},
  year         = {2023},
  note         = {ArXiv e-prints, arXiv:2312.06366},
}


MOP Group
©2017-2024
The author is not responsible for the content of external pages.