Tutorials:
by arrangement.
Core Lecture for Mathematics and Computer Science
Language: English
Prerequisites: Basics of Mathematics
(e.g. Linear Algebra 1-2, Analysis 1-3, Mathematics 1-3 for Computer Science)
Optimization methods or algorithms seek a state of a system that is optimal with respect to an objective function. Depending on the properties of the objective function, different strategies may be used to find such an optimal state (or point). Fundamental knowledge of the classes of functions that can be optimized, the properties of the available optimization strategies, and the properties of the optimal points is crucial for appropriately modeling practical real-world problems as optimization problems. An exact model of the real world that cannot be solved is as useless as an overly simplistic model that can be solved easily.
This lecture introduces the basic algorithms, concepts, and analysis tools for several fundamental classes of continuous optimization problems. The lecture covers the basics of generic descent methods, gradient descent, Newton's method, quasi-Newton methods, the Gauss-Newton method, the conjugate gradient method, linear programming, and non-linear programming, as well as optimality conditions for unconstrained and constrained optimization problems. These may be considered the classical topics of continuous optimization. Some of these methods will be implemented and explored on practical problems in the tutorials.
After taking this course, students will have an overview of classical optimization methods and analysis tools for continuous optimization problems, which allows them to model and solve practical problems. Moreover, in the tutorials, students will gain experience implementing and numerically solving practical problems.
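To give a flavor of the methods treated in the lecture, here is a minimal gradient descent sketch in Python. The objective function, step size, and tolerance are illustrative choices of ours, not taken from the course material:

```python
# Gradient descent on the convex quadratic f(x, y) = (x - 1)^2 + 4 * (y + 2)^2,
# whose unique minimizer is (1, -2). A fixed step size is used for simplicity;
# the lecture also covers line search strategies and faster methods.

def grad(x, y):
    """Gradient of f at (x, y)."""
    return (2.0 * (x - 1.0), 8.0 * (y + 2.0))

def gradient_descent(x, y, step=0.1, tol=1e-8, max_iter=10000):
    """Iterate x_{k+1} = x_k - step * grad f(x_k) until the gradient is small."""
    for _ in range(max_iter):
        gx, gy = grad(x, y)
        if gx * gx + gy * gy < tol * tol:  # stop when ||grad f|| < tol
            break
        x, y = x - step * gx, y - step * gy
    return x, y

x_opt, y_opt = gradient_descent(0.0, 0.0)
print(x_opt, y_opt)  # converges toward the minimizer (1, -2)
```

Note that the fixed step size only works because it is small enough for this particular quadratic; choosing step lengths robustly is exactly the kind of question the descent-method part of the lecture addresses.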
Oral or written exam, depending on the number of students.
Qualification requirements:
Submit solutions for the weekly exercise sheets.
Work in groups of 2 or 3 people.
Present your solutions.
Obtain at least 60% of the total points.
Documentation and Schedule:
Participants of this lecture may download the course material, including detailed lecture notes (script) and exercise sheets, directly from Moodle. Please note that the provided documentation is not meant to replace attendance at the lectures and tutorials, that it may contain errors, and that it does not exclusively define the contents of the exam.
The table of contents of the lecture is as follows:
Introduction
Mathematical Optimization
Applications
Performance of Numerical Methods
Existence of a Solution
The Class of Convex Optimization Problems
Unconstrained Optimization
Optimality Conditions
Descent Methods
Gradient Descent Method
Conjugate Gradient Method
Newton’s Method
Quasi-Newton Methods
Gauss-Newton Method
Computing Derivatives
Constrained Optimization
Motivation
Optimality Conditions for Constrained Problems
Method of Feasible Directions
Linear Programming
Quadratic Programming
Sequential Quadratic Programming (SQP)
Penalty and Barrier Methods
Literature
The lecture is based on the following literature, which is available via the library:
J. Nocedal and S. J. Wright: Numerical Optimization. Springer, 2006.
F. Jarre and J. Stoer: Optimierung. Springer, 2004.
D. Bertsekas: Nonlinear Programming. Athena Scientific, 1999.
Y. Nesterov: Introductory Lectures on Convex Optimization: A Basic Course. Kluwer Academic Publishers, 2004.
R. T. Rockafellar and R. J.-B. Wets: Variational Analysis. Springer-Verlag Berlin Heidelberg, 1998.