Optimization for Data Science: Lecture 1 Slides
MAL7070
Syllabus
• Unconstrained Optimization: convex sets and functions, optimality conditions (first order, second order), line search methods, least squares, steepest descent, Newton's method, quasi-Newton methods, conjugate gradient methods.
• Constrained Optimization: barrier method, penalty method, interior point methods, KKT conditions and Lagrangian duality, simplex method, Frank–Wolfe method, applications to dynamic programming and optimal control.
Time Table
• Optimization for Data Science is a 2-credit course.
• Time table slot: YQ
1. Saturday: 11:30 AM – 1:00 PM
2. Sunday: 11:30 AM – 1:00 PM
Evaluation Scheme
• Major Examination – 50%
• Quizzes – 15% (Fractal 1)
• Assignment – 10% (Fractal 1)
• Quizzes and Assignments – 25% (Fractal 2)
Books
• Amir Beck, Introduction to Nonlinear Optimization: Theory, Algorithms, and Applications with MATLAB, MOS-SIAM Series on Optimization, 2014.
• M. S. Bazaraa, H. D. Sherali, and C. M. Shetty, Nonlinear Programming: Theory and Algorithms, Third Edition, Wiley, 2006.
• N. S. Kambo, Mathematical Programming Techniques, Second Edition, Affiliated East-West Press, 2005.
Introduction to Optimization
• Consider the general problem: minimize f(z) subject to z ∈ X, where f: R^n → R is the objective function and X, a subset of R^n, is the set of points satisfying the constraints.
• The problem is solved by finding a value of z that satisfies the constraints and minimizes the objective function (a small worked instance is sketched below).
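As a concrete illustration, the sketch below solves a small instance of this generic problem with scipy.optimize.minimize. The objective f(z) = (z1 − 1)^2 + (z2 − 2)^2 and the box X = [0, 1] × [0, 3] are assumptions chosen purely for illustration, not examples from the course.

# A minimal sketch of min f(z) subject to z in X, for an illustrative
# quadratic objective and a box-shaped feasible set X.
import numpy as np
from scipy.optimize import minimize

def f(z):
    # Hypothetical objective, chosen only for illustration.
    return (z[0] - 1.0) ** 2 + (z[1] - 2.0) ** 2

# X = [0, 1] x [0, 3]: each coordinate of z must stay in its interval.
bounds = [(0.0, 1.0), (0.0, 3.0)]

z0 = np.array([0.5, 0.5])                # feasible starting point
result = minimize(f, z0, bounds=bounds)  # SciPy uses L-BFGS-B for bounds

print("optimal z:", result.x)            # approximately [1.0, 2.0]
print("optimal value:", result.fun)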
• Constrained optimization
• Unconstrained optimization
• Linear Programming Problem.
• Nonlinear Programming Problem (an illustrative linear example follows below).
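To make the classification concrete: a Linear Programming Problem has a linear objective and linear constraints. The hypothetical example below (data invented for illustration) solves min −z1 − 2·z2 subject to z1 + z2 ≤ 4 and z ≥ 0 with scipy.optimize.linprog; replacing the objective or a constraint with a nonlinear function would make it a Nonlinear Programming Problem instead.

# A small illustrative linear programming problem:
#   minimize   -z1 - 2*z2
#   subject to  z1 + z2 <= 4,  z1 >= 0,  z2 >= 0
from scipy.optimize import linprog

c = [-1.0, -2.0]       # linear objective coefficients
A_ub = [[1.0, 1.0]]    # left-hand side of the inequality constraint
b_ub = [4.0]           # right-hand side

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, res.fun)  # optimum at z = [0, 4] with objective value -8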
• A vector satisfying all the constraints of a given problem is called a feasible solution of the problem.
• The collection of all feasible solutions is called the feasible region.
• The minimization problem is to find a feasible solution k such that f(k) ≤ f(z) for all feasible solutions z. Such a solution is called an optimal solution (see the toy sketch below).
• The problems min f(z) and max (−f(z)) have the same optimal solutions, with min f(z) = −max (−f(z)).
• Likewise, max f(z) = −min (−f(z)), so any maximization problem can be converted into a minimization problem (and vice versa) by negating the objective; the sketch below demonstrates this.
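This identity means no separate maximization routine is needed: to maximize g, minimize −g and flip the sign of the optimal value. The sketch below demonstrates this on an assumed one-dimensional example using scipy.optimize.minimize_scalar.

# Maximize g(z) = -(z - 1)**2 + 5 by minimizing -g(z):
# max g(z) = -min(-g(z)), and both problems share the same optimal point.
from scipy.optimize import minimize_scalar

def g(z):
    # Hypothetical concave objective, chosen only for illustration.
    return -(z - 1.0) ** 2 + 5.0

res = minimize_scalar(lambda z: -g(z))  # minimize the negated objective
z_star, max_value = res.x, -res.fun     # undo the sign flip on the value

print("maximizer z*:", z_star)          # approximately 1.0
print("maximum g(z*):", max_value)      # approximately 5.0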