Project Description
Learning objectives
Course content
Introduction to optimization: the modeling approach; Classification of optimization problems; Mathematical programming problems: conditions for the existence of a solution (chapter 1 of the textbook, except paragraphs 1.5.3, 1.5.5 and 1.5.6); Unconstrained optimization: optimality conditions and solution algorithms: global convergence conditions, convergence of methods with unconstrained one-dimensional searches, line search. Gradient method. Newton method. Decomposition methods: introduction, sequential and parallel algorithms. Gauss-Seidel method, Gauss-Southwell method, block descent method, brief introduction to decomposition with overlapping blocks. Jacobi method. Unconstrained optimization: training of neural networks. Proof of convergence of the perceptron algorithm. Constrained optimization: optimality conditions and solution algorithms. Analytical optimality conditions: Fritz John conditions, constraint qualifications (linear independence, gradient, equality and active constraints only). Wolfe duality: general case and quadratic case. Linear and non-linear SVM. Optimality conditions and SVMlight.
6-CFU program:
Introduction to optimization: mathematical modeling approach. Classification of mathematical programming problems. Existence of solutions.
Unconstrained optimization: optimality conditions, solution algorithms: global convergence conditions. Line search. Gradient method, Newton method. Decomposition methods: Gauss-Seidel method, Gauss-Southwell method, block descent method, Jacobi method.
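As an illustration of the gradient method with a line search, the following sketch minimizes a convex quadratic using steepest descent with Armijo backtracking. The function, starting point, and parameter values are illustrative choices, not part of the course material:

```python
import numpy as np

def gradient_method(f, grad, x0, alpha0=1.0, c=1e-4, tol=1e-6, max_iter=1000):
    """Gradient method with Armijo backtracking line search (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:     # stationarity test
            break
        d = -g                          # steepest-descent direction
        alpha = alpha0
        # Backtrack until the Armijo sufficient-decrease condition holds
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= 0.5
        x = x + alpha * d
    return x

# Example: minimize the convex quadratic f(x) = x1^2 + 10*x2^2 (minimum at the origin)
f = lambda x: x[0]**2 + 10 * x[1]**2
grad = lambda x: np.array([2 * x[0], 20 * x[1]])
x_star = gradient_method(f, grad, x0=[3.0, -2.0])
```

The Armijo condition guarantees sufficient decrease at every step, which is one of the global convergence conditions covered in the course.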
An application: the perceptron, multilayer neural networks, RBF networks. Training of neural networks via unconstrained optimization.
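The perceptron training rule, whose finite-termination proof for linearly separable data is part of the program, can be sketched as follows (the toy dataset and epoch limit are illustrative assumptions):

```python
import numpy as np

def perceptron(X, y, max_epochs=100):
    """Perceptron training: on each misclassified point apply the
    update w <- w + y_i * x_i, b <- b + y_i. For linearly separable
    data this terminates after finitely many updates (illustrative sketch)."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * (w.dot(xi) + b) <= 0:   # misclassified (or on the boundary)
                w += yi * xi
                b += yi
                mistakes += 1
        if mistakes == 0:                   # all points correctly classified
            break
    return w, b

# Toy linearly separable data, labels in {-1, +1}
X = np.array([[2.0, 1.0], [1.0, 2.0], [-1.0, -1.0], [-2.0, -1.5]])
y = np.array([1, 1, -1, -1])
w, b = perceptron(X, y)
```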
Constrained optimization: optimality conditions and solution algorithms.
Wolfe duality: special case of quadratic optimization. Linear and non-linear Support Vector Machines (SVM). Optimality conditions and training of SVMs: SVMlight.
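To connect Wolfe duality with SVM training, the sketch below solves the dual of a linear soft-margin SVM by projected gradient ascent. For simplicity the bias term is omitted, so the dual has only box constraints and no equality constraint; the dataset, step size, and iteration count are illustrative assumptions (SVMlight itself uses a more sophisticated decomposition strategy):

```python
import numpy as np

def svm_dual_projected_gradient(X, y, C=1.0, lr=0.01, iters=2000):
    """Train a linear SVM (bias omitted) by projected gradient ascent on the
    dual:  max  sum(a) - 1/2 a^T Q a,  subject to 0 <= a_i <= C,
    where Q_ij = y_i y_j x_i . x_j  (illustrative sketch)."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    Yx = y[:, None] * X
    Q = Yx @ Yx.T                            # dual Hessian
    a = np.zeros(len(y))
    for _ in range(iters):
        grad = 1.0 - Q @ a                   # gradient of the dual objective
        a = np.clip(a + lr * grad, 0.0, C)   # ascent step + projection onto the box
    w = (a * y) @ X                          # primal weights: w = sum_i a_i y_i x_i
    return w, a

# Toy linearly separable data, labels in {-1, +1}
X = np.array([[2.0, 2.0], [1.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, a = svm_dual_projected_gradient(X, y)
```

The recovery of the primal weights from the dual variables follows from the optimality conditions of the quadratic case of Wolfe duality discussed in the course.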