Analytical Multidimensional (Multivariable) Unconstrained Optimization

Learning Objectives

After studying this chapter, you should be able to:
1. Describe the method of solving problems of multivariable unconstrained optimization.
2. Classify the multivariable optimization problems.
3. Explain the necessary and sufficient conditions for solving multivariable unconstrained optimization problems.
4. Solve unconstrained multivariable functions.

In Chapter 5, we formulated the single variable optimization problems without constraints. Now let us extend those concepts to solve multivariable optimization problems without constraints. The optimization of such problems proceeds in more than one direction. For instance, if there are two variables in the objective function, we call such problems two-dimensional, as we have to search in two directions. Similarly, a three-variable problem can be called three-dimensional, and so on. However, the method of solving is similar in all such cases, and therefore all these problems can be named multidimensional or multivariable unconstrained optimization problems.

Recollect that we made use of complete differentiation to solve single variable unconstrained optimization problems. Here we use the concept of partial differentiation in solving multivariable unconstrained optimization problems using analytical methods.

6.1 Classification of Multivariable Optimization Problems

The multivariable optimization problems can be classified broadly into two categories: multivariable unconstrained optimization, i.e. without constraints, and multivariable constrained optimization, i.e. with constraints. Further, since we categorize the constraints into two types, namely equality and inequality type, we can further classify the multivariable constrained optimization problems into two types. Summarily, the classification of multivariable optimization problems is shown in Figure 6.1.
Figure 6.1 Classification of multivariable optimization problems: multivariable unconstrained optimization (i.e. without constraints) and multivariable constrained optimization (i.e. with constraints), the latter with either equality constraints or inequality constraints.

In this chapter, we discuss the analytical solution methods for multivariable unconstrained optimization.

6.2 Optimization Techniques to Solve Unconstrained Multivariable Functions

Let us now discuss the necessary and sufficient conditions for optimization of an unconstrained (without constraints) multivariable function, for which we make use of Taylor's series expansion of a multivariable function. Taylor's theorem/series expansion is given in Exhibit 6.1.

Exhibit 6.1 Taylor's Theorem/Series Expansion

Taylor's formula with remainder is

f(x) = f(a) + (x − a)f′(a) + ((x − a)²/2!)f″(a) + … + ((x − a)ⁿ/n!)f⁽ⁿ⁾(a) + Rₙ(x)    (i)

If the remainder Rₙ(x) → 0 as n → ∞, then we obtain

f(x) = f(a) + (x − a)f′(a) + ((x − a)²/2!)f″(a) + …

which is called the Taylor series. When a = 0, we obtain the Maclaurin series. It is assumed that f(x) has continuous derivatives up to (n + 1)th order and that the point at which f⁽ⁿ⁺¹⁾ is evaluated lies in the interval (a, x). Hence, to establish that lim |Rₙ(x)| = 0 as n → ∞, it is sufficient to show that

lim (x − a)ⁿ/n! = 0 as n → ∞

for any fixed numbers x and a.

6.3 Necessary Condition (Theorem 6.1)

If f(x) has an extreme point (maximum or minimum) at x = x* and if the first partial derivatives of f(x) exist at x*, then

∂f(x*)/∂x₁ = ∂f(x*)/∂x₂ = … = ∂f(x*)/∂xₙ = 0

6.4 Sufficient Condition (Theorem 6.2)

A sufficient condition for a stationary point x* to be an extreme point is that the matrix of second partial derivatives (Hessian matrix) of f(x) evaluated at x* is:
(i) Positive definite when x* is a relative minimum point.
(ii) Negative definite when x* is a relative maximum point.

6.5 Working Rule (Algorithm) for Unconstrained Multivariable Optimization

From Theorems 6.1 and 6.2, we can obtain the following procedure to solve the problems of multivariable functions without constraints.

Step 1: Check the function to ensure that it belongs to the category of multivariable unconstrained optimization problems, i.e. it should contain more than one variable, say x₁, x₂, …, xₙ, and no constraints.

Step 2: Find the first partial derivatives of the function w.r.t. x₁, x₂, …, xₙ.

Step 3: Equate all the first partial derivatives (obtained from Step 2) to zero to find the values x₁*, x₂*, …, xₙ*.

Step 4: Find all the second partial derivatives of the function.

Step 5: Prepare the Hessian matrix. For example, if we have two variables x₁, x₂ in f(x), then the Hessian matrix is

J = | ∂²f/∂x₁²    ∂²f/∂x₁∂x₂ |
    | ∂²f/∂x₂∂x₁  ∂²f/∂x₂²   |

Step 6: Find the values of the determinants of the square sub-matrices J₁, J₂, …. In our example,

J₁ = | ∂²f/∂x₁² |   and   J₂ = | ∂²f/∂x₁²    ∂²f/∂x₁∂x₂ |
                               | ∂²f/∂x₂∂x₁  ∂²f/∂x₂²   |

Step 7: Evaluate whether x₁*, x₂* are relative maxima or minima depending on the positive or negative nature of J₁, J₂ and so on. Thus, the sign of J₁ decides whether J is positive or negative, while the sign of J₂ decides the definiteness or indefiniteness of J (Table 6.1). More clearly:
(i) If J₁ > 0, J₂ > 0, then J is positive definite and x* is a relative minimum.
(ii) If J₁ > 0, J₂ < 0, then J is indefinite and x* is a saddle point.
(iii) If J₁ < 0, J₂ > 0, then J is negative definite and x* is a relative maximum.
(iv) If J₁ < 0, J₂ < 0, then J is indefinite and x* is a saddle point.

Table 6.1 presents the sign notation of J.
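Steps 1 to 7 above can be sketched symbolically with the `sympy` library (assumed available); the two-variable function f used below is a hypothetical example chosen for illustration, not one from the text.

```python
# Sketch of the working rule (Steps 1-7) for a hypothetical
# two-variable unconstrained function, using sympy.
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = x1**2 + x2**2 - 4*x1                          # Step 1: two variables, no constraints

grad = [sp.diff(f, v) for v in (x1, x2)]          # Step 2: first partial derivatives
stationary = sp.solve(grad, [x1, x2], dict=True)  # Step 3: solve grad = 0

H = sp.hessian(f, (x1, x2))                       # Steps 4-5: Hessian matrix J
for pt in stationary:
    J1 = H[0, 0].subs(pt)                         # Step 6: determinants of sub-matrices
    J2 = H.subs(pt).det()
    if J1 > 0 and J2 > 0:                         # Step 7: classify via signs of J1, J2
        kind = 'relative minimum'
    elif J1 < 0 and J2 > 0:
        kind = 'relative maximum'
    else:
        kind = 'saddle point'
    print(pt, kind)
```

For this sample function, the gradient equations 2x₁ − 4 = 0 and 2x₂ = 0 give the single stationary point (2, 0), where J₁ = 2 > 0 and J₂ = 4 > 0, so it is classified as a relative minimum.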
Table 6.1 Sign Notation of J

                        J₁ positive (J₁ > 0)            J₁ negative (J₁ < 0)
J₂ positive (J₂ > 0)    J is positive definite and      J is negative definite and
                        x* is a relative minimum        x* is a relative maximum
J₂ negative (J₂ < 0)    J is indefinite and x* is       J is indefinite and x* is
                        a saddle point                  a saddle point

Note: If in a function of two variables f(x₁, x₂) the Hessian matrix is neither positive nor negative definite, then the point (x₁*, x₂*) at which ∂f/∂x₁ = ∂f/∂x₂ = 0 is called the saddle point, which means it may be a maximum or minimum with respect to one variable when the other is fixed.

Illustration 6.2 Determine the extreme points of the following function and evaluate the function at those points:

f(x) = x₁³ + x₂³ + 2x₁² + 4x₂² + 6

Solution The necessary conditions for the existence of an extreme point are

∂f/∂x₁ = 0 and ∂f/∂x₂ = 0

∂f/∂x₁ = 3x₁² + 4x₁ = x₁(3x₁ + 4) = 0    (i)
∂f/∂x₂ = 3x₂² + 8x₂ = x₂(3x₂ + 8) = 0    (ii)

From Eq. (i), we get x₁ = 0 or −4/3, and from Eq. (ii), we get x₂ = 0 or −8/3. Hence, the above equations are satisfied by the following points:

(0, 0), (0, −8/3), (−4/3, 0) and (−4/3, −8/3)

Now, let us find the nature of these extreme points, for which we use the sufficiency conditions, i.e. the second order partial derivatives of f:

∂²f/∂x₁² = 6x₁ + 4
∂²f/∂x₂² = 6x₂ + 8
∂²f/∂x₁∂x₂ = 0

The Hessian matrix of f is given by

J = | 6x₁ + 4    0       |
    | 0          6x₂ + 8 |

Hence, J₁ = |6x₁ + 4| and J₂ = (6x₁ + 4)(6x₂ + 8).

The values of J₁ and J₂ and the nature of the extreme points are given in Table 6.2.

Table 6.2 Sign Notation of J in Illustration 6.2

Point x*        Value of J₁   Value of J₂   Nature of J         Nature of x*       f(x*)
(0, 0)          +4            +32           Positive definite   Relative minimum   6
(0, −8/3)       +4            −32           Indefinite          Saddle point       418/27
(−4/3, 0)       −4            −32           Indefinite          Saddle point       194/27
(−4/3, −8/3)    −4            +32           Negative definite   Relative maximum   50/3

∂²f/∂x₁² = 2, ∂²f/∂x₂² = −2 and ∂²f/∂x₁∂x₂ = 0. Therefore, at x₁ = 0, x₂ = 0, the Hessian matrix is

J = | 2    0  |
    | 0   −2  |
We have J₁ = |2| = 2 and J₂ = det J = −4. Since J₁ is positive (+2) and J₂ is negative (−4), the nature of J is indefinite, and the point x₁* = 0, x₂* = 0 is a saddle point; the value of the function at this saddle point is f(0, 0) = 0.

Note: At the saddle point, if one of the values of x₁ or x₂ is fixed, the other may show extremity.

Illustration 6.5 Two frictionless rigid bodies R₁ and R₂ are connected by three linear elastic springs of constants k₁, k₂ and k₃ as shown in Figure 6.3. The springs are at their natural positions when the applied force P is zero. Find the displacements x₁ and x₂ under the force of 26 units by using the principle of minimum potential energy, assuming the spring constants k₁, k₂ and k₃ as 2, 3 and 4 respectively. Examine whether the displacements correspond to the minimum potential energy.
J₂ = 42 + 16 = 58 > 0

Since the values of J are positive, the matrix J is positive definite and hence the values x₁* = 4 and x₂* = 7 correspond to the minimum potential energy.

Illustration 6.6 The profit on a contract of road laying is estimated as 26x₁ + 20x₂ + 4x₁x₂ − 3x₁² − 4x₂², where x₁ and x₂ denote the labour cost and the material cost respectively. Find the values of labour cost and material cost that can maximize the profit on the contract.

Solution Given, profit Z(x₁, x₂) = 26x₁ + 20x₂ + 4x₁x₂ − 3x₁² − 4x₂².

Applying the necessary conditions, i.e. ∂Z/∂x₁ = 0 and ∂Z/∂x₂ = 0:

∂Z/∂x₁ = 26 + 4x₂ − 6x₁ = 0    (i)
∂Z/∂x₂ = 20 + 4x₁ − 8x₂ = 0    (ii)

On solving Eqs. (i) and (ii), we get x₁* = 9, x₂* = 7.

Now, let us apply the sufficiency condition for a maximum of the profit Z at (x₁*, x₂*). This is verified by testing the negative definiteness of the Hessian matrix. The Hessian matrix of Z at (x₁*, x₂*) is calculated from

∂²Z/∂x₁² = −6, ∂²Z/∂x₂² = −8 and ∂²Z/∂x₁∂x₂ = 4

Hessian matrix J = | −6    4 |
                   |  4   −8 |

The determinants of the square sub-matrices of J are

J₁ = |−6| = −6 < 0
J₂ = 48 − 16 = 32 > 0

Since J₁ is negative (i.e. −6) and J₂ is positive (i.e. +32), the nature of J is negative definite. Hence, the values of x₁ and x₂ give a relative maximum of Z at x₁* and x₂* respectively.

Hence, maximum profit

Z_max = 26(9) + 20(7) + 4(9)(7) − 3(9)² − 4(7)²
      = 234 + 140 + 252 − 243 − 196 = 187 units

Summary

Multivariable optimization problems without constraints are explained in this chapter. The optimization problems of multivariable functions without constraints make use of Taylor's series expansion of multivariable functions and partial differentiation. The basic idea behind the solution methods of these problems is to reduce them as nearly as possible to single variable problems, i.e. by partial differentiation, which is the differentiation with respect to one variable at a time keeping the other variables fixed.
However, the mathematical concepts of matrices, calculus and other simple algebraic principles are prerequisites to these problems. We will take up the multivariable constrained problems in the next chapter.

Key Concepts

Multidimensional optimization: The optimization problems pursued in more than one direction, i.e. having more than one variable.

Saddle point: If there is a function of two variables f(x₁, x₂) whose Hessian matrix is neither positive definite nor negative definite, then the point (x₁*, x₂*) at which ∂f/∂x₁ = ∂f/∂x₂ = 0 is called the saddle point.

Hessian matrix: J = [∂²f/∂xᵢ∂xⱼ] is the matrix of the second partial derivatives.

Positive definite: If J₁ > 0, J₂ > 0, then J is positive definite and x* is a relative minimum.

Positive indefinite: If J₁ > 0, J₂ < 0, then J is indefinite and x* is a saddle point.

Negative definite: If J₁ < 0, J₂ > 0, then J is negative definite and x* is a relative maximum.

Negative indefinite: If J₁ < 0, J₂ < 0, then J is indefinite and x* is a saddle point.
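As an end-of-chapter check, the stationary points worked out by hand in Illustrations 6.2 and 6.6 can be reproduced with the `sympy` library (assumed available):

```python
# Verifying the stationary points and classifications of
# Illustrations 6.2 and 6.6 symbolically with sympy.
import sympy as sp

x1, x2 = sp.symbols('x1 x2')

# Illustration 6.2: f = x1^3 + x2^3 + 2x1^2 + 4x2^2 + 6
f = x1**3 + x2**3 + 2*x1**2 + 4*x2**2 + 6
points = sp.solve([sp.diff(f, x1), sp.diff(f, x2)], [x1, x2], dict=True)
Hf = sp.hessian(f, (x1, x2))
for p in points:
    # J1 and J2 classify each point exactly as in Table 6.2
    print(p, Hf[0, 0].subs(p), Hf.subs(p).det(), f.subs(p))

# Illustration 6.6: profit Z = 26x1 + 20x2 + 4x1x2 - 3x1^2 - 4x2^2
Z = 26*x1 + 20*x2 + 4*x1*x2 - 3*x1**2 - 4*x2**2
pt = sp.solve([sp.diff(Z, x1), sp.diff(Z, x2)], [x1, x2])
HZ = sp.hessian(Z, (x1, x2))
# J1 = -6 < 0 and J2 = 32 > 0: negative definite, relative maximum, Z = 187
print(pt, HZ[0, 0], HZ.det(), Z.subs(pt))
```

The first loop recovers the four points of Table 6.2 with their J₁, J₂ and function values; the second part confirms x₁* = 9, x₂* = 7 and the maximum profit of 187 units.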