Optimization for Data Science

MAL7070
Syllabus
• Unconstrained Optimization: convex sets and functions, optimality conditions (first order and second order), line search methods, least squares, steepest descent, Newton's method, quasi-Newton methods, conjugate gradient methods.
• Constrained Optimization: barrier method, penalty method, interior point methods, KKT conditions and Lagrangian duality, the simplex method, the Frank-Wolfe method, applications to dynamic programming and optimal control.
Time Table
• Optimization for Data Science is a 2-credit course.
• Time table slot: YQ
1. Saturday – 11:30 AM – 1:00 PM
2. Sunday – 11:30 AM – 1:00 PM
Evaluation Scheme
• Major Examination – 50%
• Quizzes – 15% (Fractal 1)
• Assignment – 10% (Fractal 1)
• Quizzes and Assignments – 25% (Fractal 2)
Books
• Amir Beck, Introduction to Nonlinear Optimization: Theory, Algorithms, and Applications with MATLAB, MOS-SIAM Series on Optimization, SIAM, 2014.
• M. S. Bazaraa, H. D. Sherali, and C. M. Shetty, Nonlinear Programming: Theory and Algorithms, Third Edition, Wiley, 2006.
• N. S. Kambo, Mathematical Programming Techniques, Second Edition, Affiliated East-West Press, 2005.
Introduction to Optimization

• Optimization is the act of obtaining the best result under the given circumstances.
• Minimize the effort (cost) or maximize the output (profit).
• Cost or profit is always a function of certain variables. Our aim is to maximize or minimize this function.
• There is no single method that solves all optimization problems efficiently. Hence, a number of optimization methods have been developed for different types of optimization problems.
Example
A company makes two types of circuits. It needs capacitors (C), resistors (R) and transistors (T) to make these circuits. More information is given in the following table:

                 Circuit 1   Circuit 2   Stock
  C                  4           0         16
  R                  2           2         12
  T                  2           4         20
  Profit (per unit)  2           3

How can this company maximize its profit?
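
This question can be written as a linear program: choose x1 units of circuit 1 and x2 units of circuit 2 to maximize 2*x1 + 3*x2 subject to the stock limits 4*x1 <= 16, 2*x1 + 2*x2 <= 12 and 2*x1 + 4*x2 <= 20 with x1, x2 >= 0. As an illustrative sketch (not part of the original slides, and assuming SciPy is available), it can be handed to an off-the-shelf solver such as SciPy's linprog; the profit is negated because the solver minimizes by default.

# Sketch only: production planning for the circuit example via scipy.optimize.linprog
from scipy.optimize import linprog

c = [-2, -3]          # linprog minimizes, so negate the profit 2*x1 + 3*x2
A_ub = [[4, 0],       # capacitors:  4*x1 + 0*x2 <= 16
        [2, 2],       # resistors:   2*x1 + 2*x2 <= 12
        [2, 4]]       # transistors: 2*x1 + 4*x2 <= 20
b_ub = [16, 12, 20]

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")
print(res.x, -res.fun)   # optimal plan x1 = 2, x2 = 4 with profit 16

The optimum x1 = 2, x2 = 4 uses all resistor and transistor stock, which is typical of linear programs: the solution sits at a vertex of the feasible region.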


• Consider the following general problem:

    Minimize f(z)                        (objective function)
    subject to
      g_i(z) ≤ 0,  i = 1, 2, ..., m      (inequality constraints)
      l_j(z) = 0,  j = 1, 2, ..., r      (equality constraints)
      z ∈ X.

• X is a subset of R^n.
• The problem is to find values of z that satisfy all the constraints and minimize the objective function.
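
To make the general form concrete, the sketch below (illustrative only; the functions f, g and l are made-up toy examples, not from the slides) maps one inequality and one equality constraint onto SciPy's minimize with the SLSQP method. Note that SciPy's "ineq" convention requires fun(z) >= 0, so a constraint g(z) <= 0 is passed as -g.

# Sketch only: toy instance of the general constrained problem above
import numpy as np
from scipy.optimize import minimize

f = lambda z: z[0]**2 + z[1]**2   # objective f(z)
g = lambda z: 1 - z[0] - z[1]     # inequality constraint g(z) <= 0, i.e. z1 + z2 >= 1
l = lambda z: z[0] - z[1]         # equality constraint l(z) = 0

constraints = [
    {"type": "ineq", "fun": lambda z: -g(z)},  # "ineq" means fun(z) >= 0, so pass -g
    {"type": "eq",   "fun": l},
]

res = minimize(f, x0=np.zeros(2), method="SLSQP", constraints=constraints)
print(res.x, res.fun)   # expected optimum near z = (0.5, 0.5) with f = 0.5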

Broadly, optimization problems are classified as:
• Constrained optimization
• Unconstrained optimization
• Linear programming problems
• Nonlinear programming problems
• A vector satisfying all the constraints of a given problem is called a feasible solution of the problem.
• The collection of all feasible solutions is called the feasible region.
• The minimization problem is to find a feasible solution k such that f(k) ≤ f(z) for all feasible solutions z. Such a solution is called an optimal solution.
• min f(z) = –max(–f(z)): minimizing f(z) and maximizing –f(z) give the same optimal point.
• max f(z) = –min(–f(z)): maximizing f(z) and minimizing –f(z) give the same optimal point.
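
A quick numerical check of this equivalence (illustrative only, with an arbitrary choice of f evaluated on a grid):

# Sketch only: min f(z) = -max(-f(z)), and the optimal point is the same
import numpy as np

z = np.linspace(-10.0, 10.0, 10001)      # grid of candidate points
f = (z - 3.0)**2 + 1.0                   # arbitrary example function, minimum 1 at z = 3

print(f.min(), -(-f).max())              # 1.0 and 1.0: the optimal values agree
print(z[f.argmin()], z[(-f).argmax()])   # both ≈ 3.0: same optimal point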
