# C6.2 Continuous Optimisation - Material for the year 2019-2020

### Prerequisites

Basic linear algebra (such as eigenvalues and eigenvectors of real matrices), multivariate real analysis (such as norms, inner products, bases, multivariate linear and quadratic functions) and multivariable calculus (such as Taylor expansions, multivariate differentiation, gradients).

16 lectures

### Assessment type:

- Written Examination

Optimal decision-making and engineering design problems in which the objective and constraints are nonlinear functions of potentially (very) many variables must be solved on an everyday basis in both the commercial and academic worlds. A closely related subject is the solution of nonlinear systems of equations, often posed as least-squares or data-fitting problems, which arise in almost every setting where observations or measurements are used to model a continuous process or phenomenon, such as in weather forecasting. The mathematical analysis of such optimization problems, and of classical and modern methods for their solution, is fundamental both for understanding existing software and for developing new techniques for the practical optimization problems at hand.

### Part 1: Unconstrained Optimization

Optimality conditions, the steepest descent method, Newton and quasi-Newton methods, general line search methods, trust region methods, least-squares problems and methods.
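As a minimal illustration of the line search methods listed above, the following sketch implements steepest descent with a backtracking (Armijo) line search. The test function and all parameter values (`rho`, `c`, tolerances) are illustrative assumptions, not taken from the course notes:

```python
import numpy as np

def backtracking_line_search(f, grad_fk, xk, pk, alpha=1.0, rho=0.5, c=1e-4):
    """Shrink the step length until the Armijo sufficient-decrease condition holds."""
    while f(xk + alpha * pk) > f(xk) + c * alpha * (grad_fk @ pk):
        alpha *= rho
    return alpha

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=10000):
    """Minimise f by stepping along -grad(f) with a backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break                       # gradient small enough: stop
        p = -g                          # steepest-descent direction
        alpha = backtracking_line_search(f, g, x, p)
        x = x + alpha * p
    return x

# Illustrative example: a simple convex quadratic, f(x) = x1^2 + 10*x2^2,
# whose unique minimiser is the origin.
f = lambda x: x[0]**2 + 10 * x[1]**2
grad = lambda x: np.array([2 * x[0], 20 * x[1]])
x_star = steepest_descent(f, grad, [3.0, -1.0])
```

Replacing `p = -g` with a (quasi-)Newton direction, while keeping the same line search, gives the general line search framework covered in the lectures.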

### Part 2: Constrained Optimization

Optimality (KKT) conditions; penalty and augmented Lagrangian methods for equality-constrained optimization; interior-point/barrier methods for inequality-constrained optimization; SQP methods.
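The quadratic penalty approach for equality-constrained problems can be sketched as follows: each outer iteration approximately minimises the penalty function for a fixed penalty parameter, which is then increased. The example problem, starting point and penalty schedule are illustrative assumptions, and the inner minimisations here use SciPy's BFGS solver rather than any specific method from the course:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative problem: min f(x) subject to c(x) = 0, with
#   f(x) = x1 + x2,  c(x) = x1^2 + x2^2 - 2,
# whose constrained minimiser is x* = (-1, -1).
f = lambda x: x[0] + x[1]
c = lambda x: x[0]**2 + x[1]**2 - 2.0

def quadratic_penalty(x, mu):
    # Q(x; mu) = f(x) + (mu / 2) * c(x)^2
    return f(x) + 0.5 * mu * c(x)**2

x = np.array([-1.5, -1.5])  # illustrative starting point near the solution
mu = 1.0
for _ in range(6):
    # Approximately minimise the penalty function for the current mu,
    # warm-starting from the previous iterate.
    res = minimize(quadratic_penalty, x, args=(mu,), method='BFGS')
    x = res.x
    mu *= 10.0              # tighten the penalty for the next outer iteration
```

The iterates are infeasible for every finite `mu` and only approach the constraint surface as `mu` grows, which is precisely the ill-conditioning that motivates the augmented Lagrangian methods covered in the lectures.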

Lecture notes will be made available for downloading from the course webpage.

A useful textbook is J. Nocedal and S. J. Wright, *Numerical Optimization* (Springer, 1999; 2nd edition 2006).