


Journal of Mathematics and Statistics 4 (4): 217-221, 2008
ISSN 1549-3644
© 2008 Science Publications

Quadratic Interpolation Algorithm for Minimizing Tabulated Function


1Youness, E.A., 1S.Z. Hassan and 2Y.A. El-Rewaily
1Department of Mathematics, Faculty of Science, Tanta University, Tanta, Egypt
2Department of Mathematics, Faculty of Education for Girls, Faisal University, KSA

Abstract: Problem statement: The problem of finding the minimum of an objective function when only some of its values are known arises in many practical fields. Quadratic interpolation algorithms are well-known tools for this kind of problem; they are concerned with the polynomial space in which the objective function is approximated. Approach: In this study we approximated the objective function by a one-dimensional quadratic polynomial. This approach saves the time and effort needed to find the point at which the objective is minimized. Results: The quadratic polynomial built at each step of the proposed algorithm accelerates convergence to the best value of the objective function without taking all points of the interpolation set into account. Conclusion: Any n-dimensional problem of finding the minimal value of a function given only by some of its values can be converted into a one-dimensional problem that is easier to handle.

Key words: Quadratic interpolation, tabulated function, trust region, derivative free optimization

INTRODUCTION

Many optimization problems occur in practice: in most laboratories, for example, one obtains data for a certain phenomenon and wants to know the point at which that phenomenon is minimized. Similar problems arise in economics, the social sciences, engineering and many other fields.

Many researchers have constructed useful algorithms for dealing with this kind of problem. These algorithms are classified as follows: algorithms that use finite-difference approximations of the objective function's derivatives[2-4]; pattern search methods, which explore the variable space using a well-specified geometric pattern, typically a simplex[8]; and, finally, algorithms based on progressively building and updating a model of the objective function[9,10]. There is also a related class of "global modeling methods" that use designed interpolation models[1,6,7].

In the following discussion we present the analysis on which the proposed algorithm for finding the minimal point of a tabulated function is based.

RESULTS AND DISCUSSION

Let I denote the set of points x ∈ X ⊂ R^n at which the function f: X ⊂ R^n → R is given, as in Table 1.

Table 1: Initial interpolation set
x      x_0      x_1      ……   x_j
f(x)   f(x_0)   f(x_1)   ……   f(x_j)

Choose x^k ∈ I such that f(x^k) ≤ f(x) for each x ∈ I, and consider a real quadratic convex function φ(α) such that

φ(α) = f(x^k + αp),

where p ∈ B ⊂ R^n and B is the trust region with radius Δ, i.e.,

B = {x ∈ R^n : ‖x − x^k‖ ≤ Δ}.

The point p is chosen such that f(p) ≥ f(x^k) and f(p) < f(x) for each x ∈ I, x ≠ x^k.

The quadratic convex function φ(α) is constructed by interpolating points α1, α2, α3 ∈ R such that f is given at, at least, two of the α's. If f is known at each of α1, α2, α3, then φ(α) is easy to construct. If f is known at only two of α1, α2, α3, then we assume that the value of f at the third point is z, which is calculated from the differentiability of φ. If f is known at only one of the α's, then we can take one of the other two values (say φ(α2)) to be less than φ(α1) and φ(α3).

Corresponding Author: Youness, E.A., Department of Mathematics, Faculty of Science, Tanta University, Tanta, Egypt
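The interpolation step just described can be sketched numerically. This is a minimal illustration, not from the paper: a parabola through three samples (α_i, φ(α_i)) is built in Newton form, and its minimizer is read off the coefficients. All function names are mine.

```python
# Fit phi(a) = c0 + c1*a + c2*a^2 through three samples of phi and
# return the minimizer of the resulting parabola (assumes distinct
# abscissae and strict convexity, as the paper's phi is assumed convex).

def fit_quadratic(a, phi):
    """Coefficients (c0, c1, c2) of the parabola through three points."""
    (a1, a2, a3), (f1, f2, f3) = a, phi
    # Newton divided differences, expanded to monomial coefficients.
    d1 = (f2 - f1) / (a2 - a1)
    d2 = ((f3 - f2) / (a3 - a2) - d1) / (a3 - a1)
    c2 = d2
    c1 = d1 - d2 * (a1 + a2)
    c0 = f1 - a1 * (d1 - d2 * a2)
    return c0, c1, c2

def quadratic_minimizer(a, phi):
    """Minimizer -c1/(2*c2) of the interpolating parabola."""
    c0, c1, c2 = fit_quadratic(a, phi)
    assert c2 > 0, "interpolant is not strictly convex"
    return -c1 / (2 * c2)
```

For instance, the samples (0, 1), (1, 2), (−1, 2) used later in the paper's example give back φ(α) = α² + 1 with minimizer 0.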

Main results: The basic idea of this study is to construct a one-dimensional real-valued quadratic function φ(α) by interpolation. The function φ(α) is constructed such that φ(α) = f(x^k + αp), where x^k is the point of the set I at which f attains its least value and the point p is chosen as indicated above. The following results show that the direction p is a descent direction and how the algorithm can be built.

Proposition 1: Let ᾱ be a minimal solution of the problem min_{α∈R} φ(α) and let p ∈ I be chosen as indicated before. Then

f(x^{k+1}) ≤ f(x^k),

where x^{k+1} = x^k + ᾱp, i.e., p is a descent direction.

Proof: Since ᾱ is a minimal solution of the problem min_{α∈R} φ(α),

φ(ᾱ) ≤ φ(α), ∀α ∈ R.

Thus

f(x^k + ᾱp) ≤ f(x^k + αp), ∀α ∈ R.

Hence

f(x^k + ᾱp) ≤ f(x^k).

But f(x^k) ≤ f(x) for each x ∈ I; therefore the direction p is a decreasing direction of f.

Proposition 2: If x^{k+1} = x^k + ᾱp = x^k, then there is no other point x̂ = x^k + α̂p ∈ I such that

f(x^k + α̂p) ≤ f(x^k).

Proof: Since x^{k+1} = x^k, we have ᾱ = arg min_{α∈R} φ(α) with φ(ᾱ) = f(x^k + ᾱp^k). Also, on the direction p^k there is no x̂ such that f(x̂) < f(x^k + ᾱp^k), because if there were such a point, there would be α̂ such that x̂ = x^k + α̂p^k and

f(x^k + α̂p^k) < f(x^k + ᾱp^k), i.e., φ(α̂) < φ(ᾱ),

which is a contradiction.

Now, let p̂ ∈ I be another descent direction such that

f(p̂) > f(p^k) > f(x^k).

Assume x̂ = x^k + α̂p̂ is such that

f(x̂) = f(x^k + α̂p̂) < f(x^k + ᾱp^k).

By constructing a function Ψ(α) such that Ψ(α) = f(x^k + αp̂), we get

Ψ(α̂) = Ψ(0) + α̂ p̂∇f(x^k) + (α̂²/2) p̂∇²f(x^k)p̂ < f(x^k + ᾱp^k).

Since p̂ is a descent direction and Ψ(α) is convex,

f(x^k) = Ψ(0) < f(x^k + ᾱp^k),

which is a contradiction.

Theorem: The sequence generated by

x^{k+1} = x^k + ᾱp^k

is convergent, where ᾱ = arg min_{α∈R} φ(α) and p^k is in the trust region B_k.

Proof: The trust region B_k is defined as

B_k = {x ∈ R^n : ‖x − x^k‖ ≤ (1/k)Δ}.

This set is closed and bounded. Furthermore,

x^{k+1} = x^k + ᾱp^k ∈ B_k.


Also,

x^{k+2} = x^{k+1} + α̂p^{k+1} ∈ B_{k+1},

where B_{k+1} is also closed and B_{k+1} ∩ B_k ≠ ∅. By repeating this process we find that the sequence {x^k} is contained in B_k ⊃ ∩_{k=1}^m B_k. Since B_k is closed, it contains the limit point of {x^k}; thus {x^k} is convergent.

The algorithm: From the previous discussion we can state the following algorithm:

Initial step:
0.1: Set k = 1.
0.2: Choose x^1 ∈ I such that f(x^1) ≤ f(x) for each x ∈ I.
0.3: Let Δ be the radius of the trust region B_1, B_1 = {x ∈ R^n : ‖x − x^1‖ ≤ Δ}.
0.4: Choose p^1 ∈ B_1 such that p^1 ∈ I, f(p^1) ≥ f(x^1) and f(p^1) < f(x), ∀x ∈ I, x ≠ x^1.
0.5: Set φ(α) = f(x^1 + αp^1), α ∈ R.

The first step:
1.1: Choose α1, α2, α3 ∈ R such that φ(α1) = f(x^1 + α1p^1), φ(α2) = f(x^1 + α2p^1), φ(α3) = f(x^1 + α3p^1).
1.2: If x^1 + α1p^1, x^1 + α2p^1, x^1 + α3p^1 ∈ I, then interpolate the points α1, α2, α3 to obtain a quadratic function φ(α).
1.3: Determine ᾱ = arg min_{α∈R} φ(α) and go to step 2.
1.4: If one of the points x^1 + α_ip^1 ∉ I, i = 1, 2, 3, then set its value equal to z, i.e., φ(α_j) = f(x^1 + α_jp^1) = z.
1.5: Determine z so that φ′(0) = 0, substitute z in φ to obtain φ(α), and go to step 1.2.
1.6: If two of the points x^1 + α_ip^1 ∉ I (say x^1 + α_lp^1, x^1 + α_mp^1 ∉ I), then choose α_l, α_m such that φ(α_l) < φ(α_i), φ(α_l) < φ(α_m), φ(α_m) = φ(α_i), or φ(α_m) < φ(α_l) < φ(α_i), φ(α_m) = φ(α_i), and go to step 1.2.

The second step:
2.1: If there is no improvement in the value of f, then reverse the direction of p^1 to obtain φ(α) = f(x^1 − αp^1) and go to step 1. Otherwise go to step 3.
2.2: If x^1 − ᾱp^1 does not improve the value of f, then let φ(α) = f(x^1 + α(p^1 − x^1)), with (p^1 − x^1) ∈ B_1, and go to step 1. Otherwise go to step 3.
2.3: If (p^1 − x^1) ∉ B_1, then extend the radius of B_1 to contain p^1 − x^1 and go to step 1.

The third step:
3.1: Enter x^1_1 = x^1 + ᾱp^1 (or x^1 − ᾱp^1, or x^1 + ᾱ(p^1 − x^1)) in the set I and reduce the radius of the trust region to become B_2 = {x ∈ R^n : ‖x − x^1_1‖ ≤ (1/2)Δ}. Then go to step 0.
3.2: If x^2_1 = x^1_1, then stop; the minimal point is in the ball B(x^2_1) with radius Δ/2. Otherwise go to step 0.

Example: Determine the minimal point of the function f(x) given in Table 2.

Table 2: Initial set of values of f
x      (-2, 0)  (1, 1)  (0, 1)  (1, 0)  (0, 2)  (1, 2)
f(x)   4        2       1       1       4       5


Choose x^1 = (0, 1) and Δ = 2; then

B_1 = {(x, y) : ‖(x, y) − (0, 1)‖ ≤ 2}.

Choose p^1 = (1, 0) and let

φ(α) = f[(0, 1) + α(1, 0)] = f(α, 1).

Take α1 = 0, α2 = 1, α3 = −1; then φ(α1) = 1, φ(α2) = 2, and set φ(α3) = z. Using the Maple package we obtain

φ(α) = α² + 1 and ᾱ = 0 = arg min_{α∈R} φ(α).

Thus f(α, 1)|_{α=0} = f(0, 1) = 1, i.e., there is no improvement in the value of f, so we consider the opposite direction −p^1 = (−1, 0). Therefore

φ(α) = f(−α, 1),

which, by interpolating the points

α:     α1   α2   α3
φ(α):  1    2    z

and using the Maple package, gives the function

φ(α) = α² + 1.

This has ᾱ = 0 as its minimal point, again with no improvement in the value of f. Therefore we consider the direction p^1 − x^1, with p^1 − x^1 ∈ B_1. Since p^1 − x^1 = (1, −1) ∉ B_1, we extend the radius of B_1 to Δ = 3. In this case we obtain

φ(α) = f(α, 1 − α).

Since α1 = 0 gives (0, 1) ∈ I, we choose α2 = 1/2 and α3 = 1, with corresponding values of φ(α) equal to 0 and 1, respectively. By interpolating φ(α) we obtain

φ(α) = 4α² − 4α + 1,

which has ᾱ = 1/2 as its minimal point. We then enter the point x^2 = (1/2, 1/2) in the set I with corresponding value φ(1/2) = 0, i.e., the set I becomes

x:     (-2, 0)  (1, 1)  (0, 1)  (1/2, 1/2)  (1, 0)  (0, 2)  (1, 2)
f(x):  4        2       1       0           1       4       5

Reduce the trust region to become

B_2 = {(x, y) : ‖(x, y) − (1/2, 1/2)‖ ≤ (1/2)Δ = 3/2}

and choose p^2 ∈ B_2 with p^2 ∈ I: p^2 = (1, 0). Using Maple we get φ(α) = f(α + 1/2, 1/2), φ(α) = α², which has the minimal point ᾱ = 0.

There is again no improvement, so we consider the opposite direction −p^2 = (−1, 0), for which

φ(α) = f(1/2 − α, 1/2),

and for the points

α:     −1   0   1
φ(α):  1    0   1

we get φ(α) = α², with no improvement.

Hence we consider the direction p^2 − x^2 = (1/2, −1/2). Using Maple we get φ(α) = α² with z = 1, and since there is no improvement in the three considered directions, we stop; the minimal point is in the ball B((1/2, 1/2)) with radius 3/2.

CONCLUSION

The algorithm presented in this paper enables us to find the point at which the value of the objective function in an optimization problem is best when only some of its values are known at some points. The idea of the algorithm is to interpolate the points by a one-dimensional quadratic function from which the desired point can be obtained.
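The trust-region bookkeeping used by the convergence theorem and step 3.1 (radius Δ/k, shrinking at each outer iteration) can be sketched as follows. The boundary clipping is my own safeguard, not in the paper, which simply assumes each step stays inside B_k; all names are illustrative.

```python
import math

def in_ball(x, center, radius, tol=1e-12):
    """Membership test for B_k = {x : ||x - x^k|| <= radius} (Euclidean norm)."""
    return math.dist(x, center) <= radius + tol

def next_iterate(xk, pk, alpha, delta, k):
    """x^{k+1} = x^k + alpha*p^k, kept inside the shrinking region of radius delta/k."""
    radius = delta / k
    step = tuple(alpha * pi for pi in pk)
    xk1 = tuple(xi + si for xi, si in zip(xk, step))
    d = math.dist(xk1, xk)
    if d > radius:  # clip the step back to the boundary of B_k
        xk1 = tuple(xi + (radius / d) * si for xi, si in zip(xk, step))
    return xk1

# Example-sized check: Delta = 2, k = 1, step alpha = 1/2 along p = (1, 0).
x_next = next_iterate((0.0, 1.0), (1.0, 0.0), 0.5, 2.0, 1)   # stays inside B_1
```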

REFERENCES

1. Conn, A.R. and P.L. Toint, 1996. An algorithm using quadratic interpolation for unconstrained derivative free optimization. In: Nonlinear Optimization and Applications, Giannessi, F. and G. Di Pillo (Eds.). Plenum Publishing, pp: 27-47.
2. Dennis, J.E. and R.B. Schnabel, 1983. Numerical Methods for Unconstrained Optimization and Nonlinear Equations. Prentice-Hall, Englewood Cliffs, USA.
3. Gill, P.E., W. Murray and M.H. Wright, 1981. Practical Optimization. Academic Press, London and New York, pp: 402.
4. Gill, P.E., W. Murray, M.A. Saunders and M.H. Wright, 1983. Computing forward-difference intervals for numerical optimization. SIAM J. Sci. Stat. Comput., 4: 310-321. DOI: 10.1137/0904025.
5. Kolda, T.G., 2003. Optimization by direct search: New perspectives on some classical and modern methods. SIAM Rev., 45: 385-482. DOI: 10.1137/S003614450242889.
6. Mitchell, T.J., J. Sacks, W.J. Welch and H.P. Wynn, 1998. Design and analysis of computer experiments. Stat. Sci., 4: 409-435. http://www.cant.ua.ac.be/modelbenchmarks/sackswelchmitchelwynn.pdf.
7. Morris, M., C. Currin, T.J. Mitchell and D. Ylvisaker, 1991. Bayesian prediction of deterministic functions with application to the design and analysis of computer experiments. J. Am. Stat. Assoc., 86: 953-963. citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.8.1552.
8. Nelder, J.A. and R. Mead, 1965. A simplex method for function minimization. Comput. J., 7: 308-313. http://doi.acm.org/10.1145/203082.203090.
9. Powell, M.J.D., 1994. A direct search optimization method that models the objective and constraint functions by linear interpolation. In: Advances in Optimization and Numerical Analysis, Gomez, S. and J.P. Hennart (Eds.). Kluwer Academic Publishers, pp: 275.
10. Powell, M.J.D., 2001. On the Lagrange functions of quadratic models that are defined by interpolation. Optim. Methods Softw., 16: 289-309. DOI: 10.1016/S0096-3003(01)00073.
11. Torczon, V., 1997. On the convergence of pattern search algorithms. SIAM J. Optim., 7: 1-25. DOI: 10.1137/S1052623493250780.
