Optimization

Optimization techniques aim to find the minimum or maximum value of a function. There are two main approaches: bracketing methods, such as golden section search, that iteratively narrow a search interval, and open methods, such as Newton's method, that iterate from a single starting point. Golden section search uses the golden ratio to select intermediate points and narrows the interval until a tolerance is reached. Quadratic interpolation fits a quadratic through three points to approximate the next optimum. Newton's method uses derivatives to converge quickly but may not converge for all functions. A hybrid approach brackets first, then applies Newton's method near the optimum for faster convergence.


OPTIMIZATION
Optimization is similar to root finding: both involve making guesses and searching for a point on a function. An optimum is a point where f'(x) = 0; the sign of f''(x) indicates whether it is a maximum or a minimum. A function can have only one global minimum and one global maximum, but it can have several local minimum or maximum points.
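As a concrete check (using the function from the worked examples later in these slides, with values computed here for illustration): f(x) = 2x - 1.75x² + 1.1x³ - 0.25x⁴ has f'(x) = 2 - 3.5x + 3.3x² - x³, which vanishes at x ≈ 2.0793; since f''(2.0793) ≈ -2.75 < 0, that point is a maximum.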

Some optimization examples: Minimization of the cost of a manufactured part. Maximization of efficiency of a cooling unit.

1D UNCONSTRAINED OPTIMIZATION
There are two approaches: bracketing methods (golden section search, quadratic interpolation) and open methods (Newton's method).

BRACKETING METHODS
Consider finding the maximum of a function f(x) in [a,b], and assume the function has only one maximum in [a,b]. We will iteratively narrow the interval, but we need two intermediate points x1 and x2 (with x2 < x1). If f(x1) > f(x2), the maximum is in [x2,b]; otherwise it is in [a,x1].

GOLDEN SECTION SEARCH


In golden section search these two points are selected as

x1 = a + d, x2 = b - d, where d = R(b - a)

R = (√5 - 1)/2 ≈ 0.618034 is called the golden ratio. It is the positive root of r² + r - 1 = 0.

If f(x1) > f(x2), continue with the interval [x2,b]; otherwise continue with [a,x1]. This works to locate a maximum; to locate a minimum, do the opposite. Calculate two new intermediate points in the narrowed interval and iterate in the same way. At each iteration, the interval length drops by a factor R (di+1 = R·di). Stop when |x1 - x2| drops below a specified tolerance.

GOLDEN RATIO
What is the importance of the golden ratio, R = 0.618034? Because R satisfies R² = 1 - R (0.618034² = 0.381966), one of the two old intermediate points always coincides with one of the new ones after the interval is narrowed. So each iteration requires only one new function evaluation.

PSEUDOCODE FOR GOLDEN SECTION SEARCH


This code is written to find the maximum; modify it to find the minimum. What happens if (a) the interval [a,b] contains more than one minimum or maximum, or (b) the maximum is at a and the minimum is at b, or vice versa? A sketch of the method is shown below.
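A minimal Python sketch of golden section search for a maximum, following the update rules described above (the function and parameter names, tolerance, and iteration cap are illustrative choices, not from the slides):

import math

def golden_section_max(f, a, b, tol=1e-4, max_iter=100):
    """Golden section search for the maximum of f on [a, b]."""
    R = (math.sqrt(5) - 1) / 2          # golden ratio, ~0.618034
    d = R * (b - a)
    x1, x2 = a + d, b - d               # intermediate points, x2 < x1
    f1, f2 = f(x1), f(x2)
    for _ in range(max_iter):
        if abs(x1 - x2) < tol:          # stop when below the tolerance
            break
        if f1 > f2:                     # maximum lies in [x2, b]
            a = x2
            x2, f2 = x1, f1             # old x1 is reused as the new x2
            x1 = a + R * (b - a)
            f1 = f(x1)                  # only one new evaluation per iteration
        else:                           # maximum lies in [a, x1]
            b = x1
            x1, f1 = x2, f2             # old x2 is reused as the new x1
            x2 = b - R * (b - a)
            f2 = f(x2)
    return x1 if f1 > f2 else x2

For the example below, golden_section_max(lambda x: 2*x - 1.75*x**2 + 1.1*x**3 - 0.25*x**4, -2, 4) should converge near x ≈ 2.079, consistent with the Newton's method example later in these slides.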

EXAMPLE
Find the maximum of f(x) = 2x - 1.75x² + 1.1x³ - 0.25x⁴ using a = -2, b = 4.

QUADRATIC INTERPOLATION

Quadratic interpolation is based on the fact that a quadratic (2nd-order) polynomial often provides a good approximation of a function near an optimum point. Select three points (x0, x1, x2) that contain only one optimum between them. Only one quadratic passes through these three points; find this quadratic.

Equate its first derivative to zero to find its optimum value, x3:

x3 = [f(x0)(x1² - x2²) + f(x1)(x2² - x0²) + f(x2)(x0² - x1²)] / [2f(x0)(x1 - x2) + 2f(x1)(x2 - x0) + 2f(x2)(x0 - x1)]

Narrow the interval by discarding one of the points. Continue with the remaining three points and determine a new optimum point x3. Iterate like this and stop when the approximate error drops below the specified threshold.

EXAMPLE
Iter 1: x0 = -1.75, x1 = 2.0, x2 = 2.25, calculate x3 = 2.0681
Iter 2: x0 = 2.0, x1 = 2.0681, x2 = 2.25, calculate x3 = 2.0743
Iter 3: x0 = 2.0681, x1 = 2.0743, x2 = 2.25, calculate x3 = 2.0784
Iter 4: x0 = 2.0743, x1 = 2.0784, x2 = 2.25, calculate x3 = 2.079
Stop when ea = |(x3,new - x3,old)/x3,new| <= 0.001.
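A minimal Python sketch of this iteration, using the x3 formula above. The point-replacement rule follows the pattern in the worked example (discard the point on the far side of x1 from x3); the function name, tolerance, and iteration cap are illustrative, and a more careful implementation would also compare function values when choosing which point to discard:

def quadratic_interpolation_max(f, x0, x1, x2, tol=1e-3, max_iter=50):
    """Locate a maximum by repeated quadratic interpolation.
    Assumes x0 < x1 < x2 and a single maximum between x0 and x2."""
    x3_old = None
    x3 = x1
    for _ in range(max_iter):
        f0, f1, f2 = f(x0), f(x1), f(x2)
        num = f0*(x1**2 - x2**2) + f1*(x2**2 - x0**2) + f2*(x0**2 - x1**2)
        den = 2*f0*(x1 - x2) + 2*f1*(x2 - x0) + 2*f2*(x0 - x1)
        x3 = num / den                       # vertex of the fitted parabola
        if x3_old is not None and abs((x3 - x3_old) / x3) <= tol:
            break                            # approximate error below threshold
        x3_old = x3
        if x3 > x1:                          # discard the leftmost point
            x0, x1 = x1, x3
        else:                                # discard the rightmost point
            x1, x2 = x3, x1
    return x3

Starting from the triple in the example, quadratic_interpolation_max(f, -1.75, 2.0, 2.25) with f(x) = 2x - 1.75x² + 1.1x³ - 0.25x⁴ should converge near x ≈ 2.079.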

NEWTON'S METHOD
We used the Newton-Raphson (NR) method to find the root of f(x) = 0 as xi+1 = xi - f(xi)/f'(xi). Similarly, the optimum points of f(x) can be found by applying the NR method to f'(x) = 0 as xi+1 = xi - f'(xi)/f''(xi). This method requires only one starting point, but it requires the first and second derivatives of f(x). It converges fast, but convergence is not guaranteed. At the end, f''(x) should be checked to determine whether the optimum point is a maximum or a minimum. If the derivatives are not known, their approximations can be used. To avoid divergence, it is a good idea to use this method only when we are close enough to the optimum point. So we can use a hybrid approach: start with a bracketing method to safely narrow the interval, then continue with the NR method.

EXAMPLE
f(x) = 2x - 1.75x² + 1.1x³ - 0.25x⁴
f'(x) = 2 - 3.5x + 3.3x² - x³
f''(x) = -3.5 + 6.6x - 3x²
Starting point: x0 = 2.25
Iter 1: x1 = x0 - f'(x0)/f''(x0) = 2.1042, ea = |(x1 - x0)/x1| = 0.0693 > 0.001
Iter 2: x2 = x1 - f'(x1)/f''(x1) = 2.080, ea = |(x2 - x1)/x2| = 0.0116 > 0.001
Iter 3: x3 = x2 - f'(x2)/f''(x2) = 2.0793, ea = |(x3 - x2)/x3| = 0.0003665 < 0.001, stop
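A minimal Python sketch of this iteration, reproducing the example above (the function name, tolerance, and iteration cap are illustrative choices):

def newton_optimize(df, d2f, x0, tol=1e-3, max_iter=50):
    """Find a stationary point of f by applying Newton-Raphson to f'(x) = 0.
    df and d2f are the first and second derivatives of f."""
    x = x0
    for _ in range(max_iter):
        x_new = x - df(x) / d2f(x)
        ea = abs((x_new - x) / x_new)   # approximate relative error
        x = x_new
        if ea <= tol:
            break
    return x

# The worked example above: f(x) = 2x - 1.75x^2 + 1.1x^3 - 0.25x^4
df = lambda x: 2 - 3.5*x + 3.3*x**2 - x**3      # f'(x)
d2f = lambda x: -3.5 + 6.6*x - 3*x**2           # f''(x)
x_opt = newton_optimize(df, d2f, 2.25)          # converges to ~2.0793
# d2f(x_opt) < 0, so this stationary point is a maximum

For the hybrid approach described earlier, one could first call golden_section_max with a loose tolerance to narrow the interval, then hand the resulting point to newton_optimize as the starting guess.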
