Welcome to the homepage of the Mathematical Optimization for Data Science Group

Department of Mathematics and Computer Science, Saarland University, Germany


Prof. Dr. Peter Ochs


Peter Ochs

Cindy Ernst (Secretary)

Camille Castera

Xiaoxi Jia

Sheheryar Mehmood

Shida Wang

Tejas Natu

Michael Sucker

Rodrigo Maulen

News:
04.2024: Our research project TRAGO will be funded by the German Research Foundation.

There are several open positions!


04.2024: The research topics Learning to Optimize and Dynamical System View on Optimization were added to our research page.
04.2024: We have a new preprint Learning-to-Optimize with PAC-Bayesian Guarantees: Theoretical Considerations and Practical Implementation on arXiv.
03.2024: We have a new preprint Stochastic Inertial Dynamics Via Time Scaling and Averaging on arXiv.
01.2024: Our paper Continuous Newton-like Methods featuring Inertia and Variable Mass is now published in the SIAM Journal on Optimization (SIOPT).
12.2023: We have a new preprint Accelerated Gradient Dynamics on Riemannian Manifolds: Faster Rate and Trajectory Convergence on arXiv.
11.2023: We have a new preprint Near-optimal Closed-loop Method via Lyapunov Damping for Convex Optimization on arXiv.
11.2023: Xiaoxi Jia has joined our group as a PostDoc.
03.2023: New group webpage is online.
03.2023: One paper was accepted at SSVM 2023.
02.2023: Our paper PAC-Bayesian Learning of Optimization Algorithms has been accepted at AISTATS 2023.
01.2023: We have a new preprint Continuous Newton-like Methods featuring Inertia and Variable Mass on arXiv.
News (before 03.2023, at the University of Tübingen):
10.2022: We have a new preprint PAC-Bayesian Learning of Optimization Algorithms on arXiv.
09.2022: We have a new preprint Inertial Quasi-Newton Methods for Monotone Inclusion: Efficient Resolvent Calculus and Primal-Dual Methods on arXiv.
09.2022: We have a new preprint Fixed-Point Automatic Differentiation of Forward-Backward Splitting Algorithms for Partly Smooth Functions on arXiv.
04.2022: Michael Sucker is going to join our group as a PhD Student.
03.2022: Tejas Natu is going to join our group as a PhD Student.
02.2022: Camille Castera is going to join our group as a PostDoc.
12.2021: Our paper Global Convergence of Model Function Based Bregman Proximal Minimization Algorithms has been accepted for publication in the Journal of Global Optimization.
11.2021: Peter Ochs was invited to give a presentation at the One World Optimization Seminar on 22.11.2021.
16.11.2021: Congratulations to Mahesh Chandra Mukkamala on his graduation!
08.2021: We have a new preprint An Abstract Convergence Framework with Application to Inertial Inexact Forward-Backward Methods on Optimization Online.
06.2021: Jan-Hendrik Lange leaves the group and joins Amazon EU.
04.2021: Oskar Adolfson has joined our group.
02.2021: Shida Wang has joined our group.
Our paper Differentiating the Value Function by using Convex Duality has been accepted at AISTATS 2021.
12.2020: We have a new preprint Global Convergence of Model Function Based Bregman Proximal Minimization Algorithms on arXiv.
09.2020: Jan-Hendrik Lange has joined our group.
09.2020: The Mathematical Optimization Group moved to the University of Tübingen.
Our paper "Convex-Concave Backtracking for Inertial Bregman Proximal Gradient Algorithms in Non-Convex Optimization" by Mahesh Chandra Mukkamala, Peter Ochs, Thomas Pock and Shoham Sabach is now published in SIAM Journal on Mathematics of Data Science.


Alumni:

Jan-Hendrik Lange

(09.2020 - 05.2021)
went to Amazon EU

Mahesh Chandra Mukkamala

(06.2018 - 05.2021)
Graduation 16.11.2021

Oskar Adolfson

(05.2021 - 04.2022)

MOP Group ©2017-2024
The author is not responsible for the content of external pages.