Optimization Lecture Notes
Section 1: Optimization in One Variable
Optimization in one variable involves finding the maximum or minimum value of a function f(x)
by analyzing its critical points and behavior across intervals. This is particularly useful in problems where a single quantity, such as cost, area, or profit, must be maximized or minimized.
---
1. **Key Definitions**:
- **Critical Point**: A point x=c is a critical point if f'(c)=0 or f'(c) does not exist. Critical points are candidates for local maxima and minima.
Example: For f(x) = x^3 - 3x^2 + 2, f'(x) = 3x^2 - 6x = 3x(x - 2). Setting f'(x) = 0 gives critical points x=0 and x=2 (verified in the sketch after this list).
- **Local Minimum**: A point x=c is a local minimum if f(c) <= f(x) for all x near c.
- **Local Maximum**: Similarly, x=c is a local maximum if f(c) >= f(x) for all x near c.
Example: For f(x) = x^2, x=0 is a local minimum because f(0) = 0 <= x^2 for all x.
- **Global Extrema**: These are the lowest and highest values of f(x) across its entire domain.
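
The critical-point computation above is easy to check mechanically. Below is a minimal sketch, assuming Python with the sympy library is available; it recovers the critical points of the running example.

```python
import sympy as sp

x = sp.symbols('x')
f = x**3 - 3*x**2 + 2

# Critical points are the roots of f'(x) = 3x^2 - 6x = 3x(x - 2)
critical_points = sp.solve(sp.diff(f, x), x)
print(critical_points)  # [0, 2]
```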
---
2. **Second Derivative Test**:
Note: Simply finding critical points doesn't guarantee they're maxima or minima. We need a test to classify them.
- If f''(c) > 0, then x=c is a local minimum; if f''(c) < 0, then x=c is a local maximum; if f''(c) = 0, the test is inconclusive.
**Example**: For f(x) = x^3 - 3x^2 + 2, f''(x) = 6x - 6. Since f''(0) = -6 < 0, x=0 is a local maximum; since f''(2) = 6 > 0, x=2 is a local minimum.
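
The test can be scripted directly on the same example. A minimal sketch, again assuming sympy; the loop evaluates f'' at each critical point and applies the rules above.

```python
import sympy as sp

x = sp.symbols('x')
f = x**3 - 3*x**2 + 2
f2 = sp.diff(f, x, 2)  # f''(x) = 6x - 6

for c in sp.solve(sp.diff(f, x), x):
    curvature = f2.subs(x, c)
    if curvature > 0:
        label = "local minimum"
    elif curvature < 0:
        label = "local maximum"
    else:
        label = "inconclusive"  # the test says nothing when f''(c) = 0
    print(f"x = {c}: f'' = {curvature} -> {label}")
# x = 0: f'' = -6 -> local maximum
# x = 2: f'' = 6 -> local minimum
```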
Section 2: Optimization in Multiple Variables
Optimization in multiple variables is similar but involves partial derivatives and additional tools like the
Hessian matrix.
---
1. **Key Definitions**:
- A **critical point** (x, y) is where all partial derivatives are zero: ∂f/∂x = 0 and ∂f/∂y = 0.
- **Hessian Matrix**: the matrix of second partial derivatives,
  H = [[∂^2f/∂x^2, ∂^2f/∂x∂y], [∂^2f/∂y∂x, ∂^2f/∂y^2]]
  (built symbolically in the sketch after this list).
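
sympy can assemble this matrix directly. A small sketch; the function f(x, y) = x^3 + y^3 - 3xy is an illustrative choice, not one from these notes.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**3 + y**3 - 3*x*y  # illustrative function, assumed for the demo

# sp.hessian builds the matrix of second partials in the given variable order
H = sp.hessian(f, (x, y))
print(H)  # Matrix([[6*x, -3], [-3, 6*y]])
```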
---
2. **Second Partial Derivative Test**: At a critical point, compute the discriminant
- D = (∂^2f/∂x^2)(∂^2f/∂y^2) - (∂^2f/∂x∂y)^2.
Classification:
- If D > 0 and ∂^2f/∂x^2 > 0, the critical point is a local minimum.
- If D > 0 and ∂^2f/∂x^2 < 0, the critical point is a local maximum.
- If D < 0, the critical point is a saddle point.
- If D = 0, the test is inconclusive.
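
Putting the discriminant and the classification rules together: the sketch below (assuming sympy, and reusing the illustrative f(x, y) = x^3 + y^3 - 3xy from the previous sketch) finds every real critical point and classifies it.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**3 + y**3 - 3*x*y  # illustrative function, assumed for the demo

# Solve the gradient equations simultaneously; keep only real solutions,
# since sp.solve also returns complex roots for polynomial systems
gradient = [sp.diff(f, x), sp.diff(f, y)]
points = [p for p in sp.solve(gradient, (x, y), dict=True)
          if all(v.is_real for v in p.values())]

H = sp.hessian(f, (x, y))
for p in points:
    Hp = H.subs(p)
    D = Hp[0, 0] * Hp[1, 1] - Hp[0, 1]**2  # the discriminant defined above
    if D > 0 and Hp[0, 0] > 0:
        label = "local minimum"
    elif D > 0 and Hp[0, 0] < 0:
        label = "local maximum"
    elif D < 0:
        label = "saddle point"
    else:
        label = "inconclusive"
    print(p, "D =", D, "->", label)
# {x: 0, y: 0} D = -9 -> saddle point
# {x: 1, y: 1} D = 27 -> local minimum
```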
---
3. **Example**:
- Consider f(x, y) = (x - 2)^2 + (y - 1)^2.
- Critical point: solving ∂f/∂x = 2(x - 2) = 0 and ∂f/∂y = 2(y - 1) = 0 gives (x, y) = (2, 1).
- Hessian matrix: H = [[2, 0], [0, 2]].
- Discriminant: D = (2)(2) - 0^2 = 4 > 0, and ∂^2f/∂x^2 = 2 > 0, so (2, 1) is a local minimum.
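
As a numerical cross-check on this example, a general-purpose optimizer should converge to the same point. A minimal sketch, assuming scipy is installed:

```python
import numpy as np
from scipy.optimize import minimize

# The example's function: f(x, y) = (x - 2)^2 + (y - 1)^2
def f(v):
    x, y = v
    return (x - 2)**2 + (y - 1)**2

result = minimize(f, x0=np.array([0.0, 0.0]))
print(result.x)  # approximately [2. 1.], the local (and global) minimum
```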