
Mastermath

Continuous Optimization - M1 - 6EC


COf22

Course summary

Continuous Optimization 2022


Prerequisites
The student should have a solid bachelor-level knowledge of linear algebra and multivariate analysis. The student should also have knowledge of linear optimization and convex analysis, to the level of being able to follow the text and do the exercises from the following:

Linear Programming, A Concise Introduction, Thomas S. Ferguson:
Available at https://www.math.ucla.edu/~tom/LP.pdf
Chapters 1 and 2, along with the accompanying exercises.

Convex Optimization, Stephen Boyd and Lieven Vandenberghe:
Available at http://stanford.edu/~boyd/cvxbook/
Sections: 2.1, 2.2 and 3.1.
Exercises (from the book): 2.1, 2.2, 2.12, 3.1, 3.3, 3.5 and 3.7

Aim of the course


Continuous optimization is the branch of optimization in which we optimize a (differentiable) function over continuous (as opposed to discrete) variables. The variables may be constrained by (differentiable) equality and inequality constraints as well as by convex cone constraints. Optimization problems of this kind occur naturally and commonly in science and engineering, and also arise as relaxations of discrete optimization problems. Differentiability of the functions defining the problem allows the use of multivariable calculus and linear algebra techniques to study the problems and to design and analyze efficient algorithms.

This course aims to provide a concise introduction to the basics of unconstrained, constrained, and conic continuous optimization.
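
For reference, such a problem can be written in the following general form (the notation f, g_i, h_j, K below is illustrative rather than taken from the course page; it matches the standard form used in [BV] Chapter 4):

\[
\begin{aligned}
\min_{x \in \mathbb{R}^n} \quad & f(x) \\
\text{subject to} \quad & g_i(x) \le 0, \quad i = 1, \dots, m, \\
& h_j(x) = 0, \quad j = 1, \dots, p, \\
& x \in K,
\end{aligned}
\]

where f, g_i, h_j : \mathbb{R}^n \to \mathbb{R} are (differentiable) functions and K \subseteq \mathbb{R}^n is a convex cone; dropping all constraints recovers the unconstrained case.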

Learning goals
The student will be able to:

Prove results on (convex) optimization problems.
Solve the KKT conditions for basic constrained optimization problems.
Formulate the Lagrange dual, and understand/prove basic results on these problems (a brief sketch is given after this list).
Give both sufficient and necessary optimality conditions for constrained continuous optimization problems.
Use a range of techniques to solve both unconstrained and constrained continuous optimization problems, and prove results on these techniques.
Formulate and recognize conic optimization problems, and construct their dual problems.
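
As orientation for the duality and KKT goals above, here is a minimal sketch in the notation of the general form given earlier (the precise statements, including the required constraint qualifications, are part of the course material and follow [BV] Chapter 5):

\[
L(x, \lambda, \nu) = f(x) + \sum_{i=1}^{m} \lambda_i g_i(x) + \sum_{j=1}^{p} \nu_j h_j(x),
\qquad
q(\lambda, \nu) = \inf_{x} L(x, \lambda, \nu),
\]

where q is the Lagrange dual function and the Lagrange dual problem is \max_{\lambda \ge 0,\, \nu} q(\lambda, \nu). For a differentiable problem, the KKT conditions at (x^*, \lambda^*, \nu^*) read

\[
\begin{aligned}
& g_i(x^*) \le 0, \quad h_j(x^*) = 0 && \text{(primal feasibility)}, \\
& \lambda_i^* \ge 0 && \text{(dual feasibility)}, \\
& \lambda_i^* g_i(x^*) = 0 && \text{(complementary slackness)}, \\
& \nabla f(x^*) + \sum_{i} \lambda_i^* \nabla g_i(x^*) + \sum_{j} \nu_j^* \nabla h_j(x^*) = 0 && \text{(stationarity)}.
\end{aligned}
\]

These conditions are sufficient for optimality when the problem is convex, and necessary under suitable constraint qualifications.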

Student Forum (Forum)


Announcements (Forum)
7,8 Continuous Optimization fall 2021 (File)

Schedule

Zoom Link:
https://cwi-nl.zoom.us/j/82530291370?pwd=aXp3OVdSNEJtbUxlSkg4RGZDcVdTdz09

Meeting ID: 825 3029 1370


Passcode: 060140

Rules about Homework/Exam

The course will be assessed by a final exam.

Exam Rules:
You are responsible for reading and understanding the full list of rules in "Exam Rules.pdf" attached below. Some important rules are:

Cheat Sheet: You may bring a cheat sheet consisting of 2 A4 pages, written front and back (you may write whatever you want on the sheet). No other materials are allowed.

Answers: Unless otherwise specified, you may use any fact from the slides or homework exercises in your answers.

Exam Signup: To ensure sufficient copies of the exam and sufficient exam paper, you are encouraged to use the signup forms for the main and retake exams. If exams or paper run short, priority will be given to those who signed up.

Main Exam Signup: https://forms.gle/1yVLV4sVeQNdZxQDA

Retake Exam Signup: https://forms.gle/EMwjCE4dPBZTFsMe8

Exam Rules (File)

Lecture Notes/Literature

Lecture slides will be provided online during the course.

Video recordings of the lectures will be made available here:


https://vimeo.com/showcase/9813388
password: g7H7

The course will also rely on the following source material:

Boyd & Vandenberghe CVX Book:

[BV] https://web.stanford.edu/~boyd/cvxbook/bv_cvxbook.pdf

Nemirovski, Ben-Tal Course on Convex Optimization:

[NB] Lectures on Modern Convex Optimization:


https://www2.isye.gatech.edu/~nemirovs/LMCOBookSIAM.pdf

[NB+] Lectures Notes on Optimization:


https://www2.isye.gatech.edu/~nemirovs/OPTIIILN2022.pdf

12/09. Lecture 1: Introduction

Reading: [BV] Chapter 2.1-2.3

19/09. No class.

26/09. Lecture 2: Convex Sets and Convex Functions

Reading: [BV] Chapter 2.5, 3.1-3.3, [NB+] Chapter 1.2.6, 1.2.8, 2

04/10. Lecture 3: Linear Classification and the Frank-Wolfe Algorithm

Reading: [BV] Chapter 8.6

11/10. Lecture 4: Lagrangian Duality

Reading: [BV] Chapter 5.2

24/10. Lecture 5: Interpretations of Lagrange Duality and the KKT Conditions

Reading: [BV] Chapter 5.3, 5.4, 5.5, 5.6

31/10. Lecture 6: Optimization with Generalized Inequalities

Reading: [BV] Chapter 2.6, 5.9

7/11. Lecture 7: Applications of Convex and Conic Programming

Reading: [BV] Chapter 4.6.3, 8.2, 8.4, 8.7, 8.8

14/11. Lecture 8: Topics in Semidefinite Programming

Reading: [BV] Appendix A.5, B.1.

21/11. Lecture 9: Steepest Descent Methods

Reading: [BV] Chapter 9.1, 9.2, 9.3, 9.4

28/11. Lecture 10: Stochastic Gradient Descent & Neural Network Training

Reading: Raghu Meka's lecture notes (https://raghumeka.github.io/CS289ML/gdnotes.pdf)

5/12. Lecture 11: Projected Gradient Descent & Convergence of Newton's Method

Reading: [BV] Chapter 9.5

12/12. Lecture 12: Self-concordance and Interior Point Methods

Reading: [BV] Chapter 9.6, 11.1, 11.2, 11.3, 11.4, 11.5

19/12. Lecture 12 continued + Exam Review

Reading: same as previous lecture.

Lecture 1 Slides (with notes) (File)


Lecture 1 Exercises (File)
Lecture 1 Exercises Solutions (File)
Lecture 2 Slides (with notes) (File)
Lecture 2 Exercises (File)
Lecture 2 Exercises Solutions (File)
Lecture 3 Slides (with notes) (File)
Lecture 3 Exercises (File)
Lecture 3 Exercises Solutions (File)
Lecture 4 Slides (File)
Lecture 4 Exercises (File)
Lecture 4 Exercises Solutions (File)
Lecture 5 Slides (with notes) (File)
Lecture 5 Exercises (File)
Lecture 5 Exercises Solutions (File)
Lecture 6 Slides (with notes) (File)
Lecture 6 Exercises (File)
Lecture 6 Exercises Solutions (File)
Lecture 7 Slides (with notes) (File)
Lecture 7 Exercises (File)
Lecture 7 Exercises Solutions (File)
Lecture 8 Slides (with notes) (File)
Lecture 8 Exercises (File)
Lecture 8 Exercises Solutions (File)
Lecture 9 Slides (with notes) (File)
Lecture 9 Exercises (File)
Lecture 9 Exercises Solutions (File)
Lecture 10 Slides (with notes) (File)
Lecture 10 Exercises (File)
Lecture 10 Exercises Solutions (File)
Lecture 11 Slides (with notes) (File)
Lecture 11 Exercises (File)
Lecture 11 Exercises Solutions (File)
Lecture 12 Slides (with notes) (File)
Lecture 12 Exercises (File)
Lecture 12 Exercises Solutions (File)
Exam Review (with notes) (File)
Practice Exam (File)
Practice Exam Solutions (File)

