Mathematical Optimization for Data Science Group

Department of Mathematics and Computer Science, Saarland University, Germany

Automatic Differentiation of Some First-Order Methods in Parametric Optimization

S. Mehmood and P. Ochs

Abstract:
We aim to compute the derivative of the solution to a parametric optimization problem with respect to the involved parameters. For a class of functions broader than the strongly convex ones, this can be achieved by automatic differentiation of iterative minimization algorithms. If the iterative algorithm converges pointwise, then we prove that the derivative sequence also converges pointwise to the derivative of the minimizer with respect to the parameters. Moreover, we provide convergence rates for both sequences. In particular, we prove that the accelerated convergence rate of the Heavy-ball method compared to Gradient Descent also accelerates the derivative computation. An experiment with L2-regularized logistic regression validates the theoretical results.
[pdf] [BibTeX] [arXiv]
Latest update: 13.10.2019
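
To make the setting concrete, the following is a minimal sketch in JAX (not the authors' implementation) of differentiating the approximate minimizer of an L2-regularized logistic regression problem with respect to the regularization weight, by unrolling Gradient Descent and Heavy-ball iterations through automatic differentiation. The toy data, step sizes, and function names are illustrative assumptions.

import jax
import jax.numpy as jnp

# Toy data for binary logistic regression (illustrative assumption).
X = jax.random.normal(jax.random.PRNGKey(0), (50, 3))
y = (jax.random.uniform(jax.random.PRNGKey(1), (50,)) > 0.5).astype(jnp.float32)

def loss(w, lam):
    # L2-regularized logistic regression; lam plays the role of the
    # parameter with respect to which the minimizer is differentiated.
    logits = X @ w
    data_term = jnp.mean(jnp.logaddexp(0.0, logits) - y * logits)
    return data_term + 0.5 * lam * jnp.sum(w ** 2)

grad_w = jax.grad(loss)  # gradient in w, for a fixed parameter lam

def gd_solution(lam, steps=300, alpha=0.1):
    # Unrolled Gradient Descent: AD traces through every iteration,
    # so differentiating the output yields the derivative sequence
    # evaluated after finitely many steps.
    w = jnp.zeros(X.shape[1])
    for _ in range(steps):
        w = w - alpha * grad_w(w, lam)
    return w

def heavy_ball_solution(lam, steps=300, alpha=0.1, beta=0.5):
    # Heavy-ball: the same idea with a momentum term; the paper's claim
    # is that its faster convergence carries over to the derivative iterates.
    w = w_prev = jnp.zeros(X.shape[1])
    for _ in range(steps):
        w, w_prev = w - alpha * grad_w(w, lam) + beta * (w - w_prev), w
    return w

# Derivative of the (approximate) minimizer w.r.t. the regularization weight.
print(jax.jacobian(gd_solution)(0.1))
print(jax.jacobian(heavy_ball_solution)(0.1))

For long unrolls one would typically wrap the loop in jax.lax.scan and jit-compile it; the plain Python loop keeps the sketch minimal.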
Citation:
S. Mehmood, P. Ochs:
Automatic Differentiation of Some First-Order Methods in Parametric Optimization. [pdf]
International Conference on Artificial Intelligence and Statistics, 2020.
Bibtex:
@inproceedings{MO20,
  title        = {Automatic Differentiation of Some First-Order Methods in Parametric Optimization},
  author       = {S. Mehmood and P. Ochs},
  year         = {2020},
  booktitle    = {International Conference on Artificial Intelligence and Statistics},
}

