
CS 726: Nonlinear Optimization

Lecture 01 : Introductory Lecture

Michael C. Ferris

Computer Sciences Department


University of Wisconsin-Madison

January 25, 2021



Orientation
Course website: https://canvas.wisc.edu/courses/230937
The recommended text is Numerical Optimization by Nocedal and Wright.
There will be exercises in Python.
The final exam is on May 7th, 12:25pm - 2:25pm. It is required. If you cannot make the exam, drop the course.
Other details are in the Course Syllabus on Canvas.
  - No midterm exam: a project instead.
  - See the Canvas website for more details.
There will be approximately 10 homeworks. Do not expect much feedback beyond the grade.
Prof. Ferris will not respond to emails unless they concern personal matters unrelated to the coursework. For coursework questions, please ask on Piazza via Canvas (or at: https://piazza.com/wisc/spring2021/sp21compsci726001/home).
Lectures and Office Hours

Via Zoom (you can link via Canvas)


Class meeting ID: 821 4396 7938 (with passcode: 726)
Office hours meeting ID: 824 9070 7615 (with passcode: 726)
Piazza: you need to create an account from the Piazza link on Canvas (passcode: 726)



Grading

(As detailed on Canvas)


30% Final Exam
15% Project
50% Homework
5% Class participation, including questions, breakout discussions, Kahoot quizzes, and Piazza



Optimization

There is an objective (function) which we are seeking to maximize or minimize, described by:

    f : S ⊆ R^n → R ∪ {−∞, +∞}

The objective is a function f(x) of the variables (or unknowns) x ∈ R^n.
The variables are subject to a system of equations Ax = b, where A and b are the data or parameters which describe the problem.
The constraints of the program consist of Ax = b and the non-negativity constraints x ≥ 0.



This all entails a program of the form

    minimize    f(x)
    subject to  Ax = b
                x ≥ 0

The constraints can be rewritten as

    h_i(x) = 0  ∀ i ∈ E,  where h_i(x) = A_i x − b_i
    g_i(x) ≤ 0  ∀ i ∈ I,  where g_i(x) = −x_i

In this form, the feasible set is described by

    Ω = {x | h_i(x) = 0 ∀ i ∈ E, g_i(x) ≤ 0 ∀ i ∈ I}
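A minimal NumPy sketch of this setup, using hypothetical data A and b, that encodes the constraint functions h_i and g_i and checks membership in Ω:

```python
import numpy as np

# Hypothetical problem data: two equality constraints in three variables.
A = np.array([[1.0, 1.0, 1.0],
              [1.0, 2.0, 0.0]])
b = np.array([1.0, 0.5])

def h(x):
    """Equality constraints h_i(x) = A_i x - b_i, one per row of A."""
    return A @ x - b

def g(x):
    """Inequality constraints g_i(x) = -x_i, i.e. x >= 0."""
    return -x

def is_feasible(x, tol=1e-8):
    """Check membership in Omega = {x | h(x) = 0, g(x) <= 0}."""
    return bool(np.all(np.abs(h(x)) <= tol) and np.all(g(x) <= tol))

print(is_feasible(np.array([0.5, 0.0, 0.5])))   # True: satisfies Ax = b and x >= 0
print(is_feasible(np.array([1.0, 1.0, -1.0])))  # False: violates both constraint types
```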



A maximization function and its corresponding minimization function.
Any maximization problem can be rewritten as a minimization problem by

    max_{x∈Ω} f(x) = − min_{x∈Ω} −f(x)
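A small SciPy illustration of this identity, with a hypothetical one-dimensional f:

```python
from scipy.optimize import minimize_scalar

# Hypothetical objective to maximize: f(x) = -(x - 2)^2 + 3, maximized at x = 2.
f = lambda x: -(x - 2.0) ** 2 + 3.0

# Maximize f by minimizing -f: the maximizer is unchanged,
# and max f = -(min of -f).
res = minimize_scalar(lambda x: -f(x))
print(res.x)     # ~2.0, the maximizer of f
print(-res.fun)  # ~3.0, the maximum value of f
```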



Continuous Vs. Discrete

In a discrete problem, only the points would be feasible. In a continuous problem, the whole shaded region is feasible.



If Ω has a finite number of points then we have a discrete problem. Covered in CS 728.
  - Key point is that in discrete optimization the feasible set consists of isolated points.
Continuous problems are often easier to solve (because of the power of calculus).
  - Sometimes methods from continuous optimization are useful in solving discrete problems.



Constrained Vs. Unconstrained

Unconstrained problems have Ω = R^n, which is the whole space.
What about Ω ≠ R^n?
  - We will consider Ω = C, where C is a closed, convex polyhedral set.
  - We will not be covering nonsmooth functions. These are covered in CS 727 (especially convex functions).
  - More advanced sets are not covered in this course. They are covered in CS 730.
Constrained problems can be treated in various ways. More detail is covered in CS 730, including nonlinear, nonconvex and convex cones, for example. In this course we will just cover simple convex sets and apply penalty methods to more complex ones.



For example, to enforce the non-negativity constraint x ≥ 0 we can penalize violations by adding α ‖(−x)+‖, where (x)+ = max(x, 0) componentwise. This gives the final objective

    min_{x ∈ R^n} P_α(x) = f(x) + α ‖(−x)+‖
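A rough sketch of this penalty approach, with a hypothetical smooth f whose unconstrained minimizer violates x ≥ 0; a derivative-free method is used here because the exact penalty term is nonsmooth:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical objective: f(x) = (x0 - 1)^2 + (x1 + 2)^2,
# whose unconstrained minimizer (1, -2) violates x >= 0.
def f(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def P(x, alpha):
    # P_alpha(x) = f(x) + alpha * ||(-x)_+||, penalizing negative components.
    return f(x) + alpha * np.linalg.norm(np.maximum(-x, 0.0))

# As alpha grows, the minimizer of P_alpha is pushed toward the feasible set x >= 0
# (here it approaches the constrained solution (1, 0)).
x0 = np.zeros(2)
for alpha in (1.0, 10.0, 100.0):
    res = minimize(P, x0, args=(alpha,), method="Nelder-Mead")
    print(alpha, res.x)
```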



Linear vs. Nonlinear

If f, g_i, h_i are affine then we have a linear program. Covered in CS 525 (but there is a basic summary in the Math Background document).

    min   f(x)
    s.t.  g_i(x) ≤ 0
          h_i(x) = 0

  - Affine functions are linear functions with an added offset: if Ax is a linear function, then Ax + b is an affine function.
Linear problems tend to come from the decision sciences, whereas nonlinear problems often arise from physical systems.
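As a concrete sketch of the linear case, here is a small LP in the earlier standard form (minimize c^T x subject to Ax = b, x ≥ 0) solved with SciPy; the data c, A, b are hypothetical:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical LP data:  minimize c^T x  subject to  Ax = b,  x >= 0.
c = np.array([1.0, 0.0, 0.0])
A = np.array([[1.0, 1.0, 1.0],
              [1.0, 2.0, 0.0]])
b = np.array([1.0, 0.5])

res = linprog(c, A_eq=A, b_eq=b, bounds=[(0, None)] * 3)
print(res.x, res.fun)  # optimal vertex of the feasible polyhedron
```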



Global vs Local
We define the notions of local and global minimizers, and local and global solutions (see the Background Lecture Details document).

A local minimum is a minimum only within its own neighborhood; a global minimum is the lowest value over all of Ω.
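A quick illustration with a hypothetical one-dimensional f: a local method returns whichever minimizer its starting point leads to.

```python
from scipy.optimize import minimize

# Hypothetical objective with one local and one global minimizer:
# f(x) = x^4 - 3x^2 + x has a local minimum near x ≈ 1.15
# and a lower, global minimum near x ≈ -1.30.
f = lambda x: x**4 - 3 * x**2 + x

local_run = minimize(f, x0=2.0)    # started on the right: finds the local minimizer
global_run = minimize(f, x0=-2.0)  # started on the left: finds the global minimizer
print(local_run.x, local_run.fun)    # ≈ [1.15], ≈ -1.07
print(global_run.x, global_run.fun)  # ≈ [-1.30], ≈ -3.51
```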



Stochastic vs Deterministic

A problem is stochastic if the data is not known beforehand. It may come from some known distribution, or from one assumed via statistical measurements. Covered in CS 719. Some material on stochastic algorithms will be covered here.
For example, the entries of the data matrix A might only be known through a distribution:

    A_ij ∼ N(Ā_ij, 1)
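A tiny NumPy sketch of sampling such uncertain data, using a hypothetical nominal matrix Ā:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical nominal data A_bar; each realized entry is drawn as A_ij ~ N(A_bar_ij, 1).
A_bar = np.array([[1.0, 1.0, 1.0],
                  [1.0, 2.0, 0.0]])
A_sample = rng.normal(loc=A_bar, scale=1.0)

print(A_sample)  # one random realization of the uncertain constraint matrix
```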



Algorithms

Iterative algorithms: generate a sequence of points which hopefully converges to the solution.
  - Typically we assume the function f ∈ C^1 or C^2.
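For instance, a minimal gradient-descent sketch for a hypothetical smooth (C^1) quadratic, showing how such a sequence of iterates is generated (the data Q and c are made up for illustration):

```python
import numpy as np

# Hypothetical quadratic objective f(x) = 1/2 x^T Q x - c^T x, with gradient Qx - c.
Q = np.array([[3.0, 1.0],
              [1.0, 2.0]])
c = np.array([1.0, 1.0])
grad = lambda x: Q @ x - c

x = np.zeros(2)    # starting point
step = 0.25        # fixed step length, small enough for this Q
for k in range(200):
    g = grad(x)
    if np.linalg.norm(g) < 1e-8:   # stop once the gradient is (nearly) zero
        break
    x = x - step * g               # next point in the sequence of iterates

print(k, x)  # converges to the solution of Qx = c, roughly (0.2, 0.4)
```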

