Basic numerical methods for solving scalar nonlinear equations (Newton's method and the secant method), local convergence, order of convergence. More advanced methods (Muller's method, inverse quadratic interpolation, Brent's method).
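As an illustration, the two basic scalar iterations can be sketched as follows (a minimal Python sketch; the function names, tolerances, and the test equation are illustrative, not part of the course materials):

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method: x_{k+1} = x_k - f(x_k)/f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x = x - fx / fprime(x)
    return x

def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """Secant method: replaces f' with a finite-difference slope
    through the last two iterates."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if abs(f1) < tol:
            break
        x0, x1 = x1, x1 - f1 * (x1 - x0) / (f1 - f0)
    return x1

# Solve x^2 - 2 = 0 (root sqrt(2))
root_newton = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
root_secant = secant(lambda x: x * x - 2, 1.0, 2.0)
```

Newton's method converges quadratically near a simple root; the secant method reaches order ~1.618 while needing no derivative.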
Solution of systems of nonlinear equations, Newton's method, quasi-Newton methods. Global convergence, continuation methods.
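Newton's method for a system F(x) = 0 solves a linear system with the Jacobian at each step. A minimal sketch (the example system and starting point are illustrative):

```python
import numpy as np

def newton_system(F, J, x0, tol=1e-12, max_iter=50):
    """Newton's method for F(x) = 0:
    solve J(x_k) s = -F(x_k), then set x_{k+1} = x_k + s."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        s = np.linalg.solve(J(x), -Fx)
        x = x + s
    return x

# Example system: x^2 + y^2 = 1, x = y; solution (1/sqrt(2), 1/sqrt(2))
F = lambda v: np.array([v[0]**2 + v[1]**2 - 1.0, v[0] - v[1]])
J = lambda v: np.array([[2 * v[0], 2 * v[1]], [1.0, -1.0]])
sol = newton_system(F, J, [1.0, 0.5])
```

Quasi-Newton methods (e.g. Broyden's method) replace J(x_k) by a cheaply updated approximation, avoiding a Jacobian evaluation and factorization per step.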
Theory of unconstrained optimization (necessary and sufficient conditions, role of convexity).
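For reference, the standard first- and second-order conditions for a smooth f: R^n -> R can be stated as follows (a summary sketch of the classical results):

```latex
% Necessary conditions at a local minimizer x^* (f twice continuously differentiable):
\nabla f(x^*) = 0, \qquad \nabla^2 f(x^*) \succeq 0.
% Sufficient conditions:
\nabla f(x^*) = 0 \ \text{and}\ \nabla^2 f(x^*) \succ 0
\;\Longrightarrow\; x^* \text{ is a strict local minimizer.}
% Role of convexity: if f is convex, every stationary point is a global minimizer.
```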
Line search: the search for a minimum along a given descent direction (the Goldstein, Armijo, and Wolfe conditions). Basic descent methods (the steepest-descent method and Newton's method), conjugate-direction methods (the nonlinear conjugate gradient method),
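A backtracking line search enforcing the Armijo (sufficient decrease) condition, combined with steepest descent, can be sketched as follows (a minimal illustration; the parameter values and test function are not prescribed by the course):

```python
import numpy as np

def armijo_backtracking(f, grad, x, p, alpha0=1.0, c=1e-4, rho=0.5):
    """Shrink alpha until the Armijo condition
    f(x + a p) <= f(x) + c a grad(x)^T p holds."""
    alpha, fx, slope = alpha0, f(x), c * (grad(x) @ p)
    while f(x + alpha * p) > fx + alpha * slope:
        alpha *= rho
    return alpha

def steepest_descent(f, grad, x0, tol=1e-8, max_iter=500):
    """Steepest descent: step along -grad(x) with Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x + armijo_backtracking(f, grad, x, -g) * (-g)
    return x

# Minimize the convex quadratic f(x, y) = x^2 + 5 y^2 (minimum at the origin)
f = lambda v: v[0]**2 + 5 * v[1]**2
grad = lambda v: np.array([2 * v[0], 10 * v[1]])
xmin = steepest_descent(f, grad, [3.0, 1.0])
```

The Wolfe conditions add a curvature requirement on top of Armijo; the Goldstein conditions instead bound the step from below by a second sufficient-decrease-type inequality.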
Quasi-Newton methods (rank-one update, DFP, BFGS, the Broyden family),
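The BFGS update maintains an approximation H of the inverse Hessian from gradient differences. A compact sketch with a simple Armijo backtracking line search (a textbook implementation would use a Wolfe line search; the test function and tolerances are illustrative):

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=500):
    """BFGS quasi-Newton method with the inverse-Hessian update
    H+ = (I - rho s y^T) H (I - rho y s^T) + rho s s^T,  rho = 1/(y^T s)."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                       # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                      # quasi-Newton direction
        alpha, fx, slope = 1.0, f(x), 1e-4 * (g @ p)
        while f(x + alpha * p) > fx + alpha * slope:   # Armijo backtracking
            alpha *= 0.5
        x_new = x + alpha * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                  # curvature condition; skip update otherwise
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Minimize the Rosenbrock function; minimizer at (1, 1)
f = lambda v: (1 - v[0])**2 + 100 * (v[1] - v[0]**2)**2
grad = lambda v: np.array([-2 * (1 - v[0]) - 400 * v[0] * (v[1] - v[0]**2),
                           200 * (v[1] - v[0]**2)])
xstar = bfgs(f, grad, [-1.2, 1.0])
```

DFP uses the dual update (roles of s and y swapped), and the Broyden family interpolates between the two; SR1 is the symmetric rank-one update.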
Trust-region methods, dogleg.
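The dogleg method approximately solves the trust-region subproblem by combining the Cauchy point with the full Newton step. A sketch of the step computation (assuming B positive definite; the example data are illustrative):

```python
import numpy as np

def dogleg_step(g, B, delta):
    """Dogleg step for min_p  g^T p + 0.5 p^T B p,  ||p|| <= delta."""
    p_newton = np.linalg.solve(B, -g)
    if np.linalg.norm(p_newton) <= delta:
        return p_newton                        # full Newton step fits inside the region
    p_cauchy = -(g @ g) / (g @ B @ g) * g      # minimizer of the model along -g
    if np.linalg.norm(p_cauchy) >= delta:
        return -delta / np.linalg.norm(g) * g  # truncated steepest-descent step
    # Dogleg path: find tau in (0, 1) with ||p_c + tau (p_n - p_c)|| = delta
    d = p_newton - p_cauchy
    a = d @ d
    b = 2 * (p_cauchy @ d)
    c = p_cauchy @ p_cauchy - delta**2
    tau = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    return p_cauchy + tau * d

# Example: g = (1, 0), B = diag(1, 2), trust-region radius 0.5.
# The Newton step (-1, 0) and the Cauchy point (-1, 0) both lie outside,
# so the step is the truncated steepest-descent step (-0.5, 0).
g = np.array([1.0, 0.0])
B = np.diag([1.0, 2.0])
p = dogleg_step(g, B, 0.5)
```

In a full trust-region method the radius delta is then adjusted from the ratio of actual to predicted reduction.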
Least-squares problems (the Gauss-Newton and Levenberg-Marquardt methods).
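Gauss-Newton minimizes 0.5 ||r(x)||^2 by linearizing the residual r at each iterate. A minimal sketch on a noise-free curve-fitting example (the model, data, and starting point are illustrative):

```python
import numpy as np

def gauss_newton(r, Jr, x0, tol=1e-10, max_iter=50):
    """Gauss-Newton for min 0.5 ||r(x)||^2: solve the normal equations
    (J^T J) s = -J^T r at each iterate. Levenberg-Marquardt would add
    a damping term lambda * I to J^T J."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        J, rx = Jr(x), r(x)
        s = np.linalg.solve(J.T @ J, -J.T @ rx)
        x = x + s
        if np.linalg.norm(s) < tol:
            break
    return x

# Fit y = a * exp(b * t) to data generated with a = 2, b = -1 (no noise)
t = np.linspace(0.0, 2.0, 10)
y = 2.0 * np.exp(-1.0 * t)
r = lambda p: p[0] * np.exp(p[1] * t) - y                  # residual vector
Jr = lambda p: np.column_stack([np.exp(p[1] * t),          # d r / d a
                                p[0] * t * np.exp(p[1] * t)])  # d r / d b
params = gauss_newton(r, Jr, [2.0, -0.8])
```

For zero-residual problems Gauss-Newton converges quadratically near the solution; the Levenberg-Marquardt damping restores robustness when J^T J is ill-conditioned or the starting point is far away.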
The course deals with theoretical and practical questions of the numerical solution of nonlinear equations and the minimization of functionals. The first part is dedicated to the solution of nonlinear equations and their systems; we focus mainly on Newton's method, its variants, and its modifications.
The second part deals with the minimization of functionals, focusing on descent methods (e.g. the nonlinear conjugate gradient method and quasi-Newton methods) and on trust-region methods.