Mathematical Optimization for Students
Armin Darmawan
01 Introduction
Introduction
Basic concept
Maximizing or minimizing some function relative to some set, often
representing a range of choices available in a certain situation. The
function allows comparison of the different choices for determining
which might be “best.”
Efficient vs Effective
(Figure: quadrant chart comparing effective and efficient options.)
Basic concept
Objective function: Max or Min (profit, cost, time, resources)
Decision variables: x1, x2, …
Constraints: none, or with constraints (>=, <=, =, >, <, …)
Basic concept
Example:
• Case 1: A student must make a long trip from home to campus every day. There are several ways to get to campus.
Problem: Which way is the most effective?
Example:
• Case 2: PT XYZ produces 10 types of products using the same production facility. The products are made in turns. The facility operates 8 hours a day, 6 days a week. On the 1st of every month, the facility is cleaned for maintenance. The production cost of each product type differs, as does its selling price. All products use nearly the same raw materials.
• Problem of case 2: How many units of each product should be produced to obtain the maximum profit?
Basic concept
Benefits of optimization
• Cost reduction: optimizing production processes, inventory management, and resource utilization to reduce costs.
• Improved efficiency: maximizing production output while minimizing waste of time, energy, or materials.
• Improved quality: optimizing product designs and processes to produce higher-quality products at lower cost.
• Solving complex problems: solving limited-resource allocation, production planning, and scheduling problems that often involve many variables and constraints.
Basic concept
Some of its main roles include:
• Manufacturing and production: optimizing production schedules to maximize the use of machines and labor, and to minimize production time and cost.
• Supply chain: optimizing inventory management, shipment scheduling, and the selection of production or distribution facility locations to minimize shipping cost and time.
• System and process design: using optimization to design systems or processes that are efficient in both technical and economic terms.
• Resource management: maximizing the use of limited resources, such as raw materials, labor, or machines, in production.
Basic concept
Mastery of mathematical optimization is essential for an Industrial Engineering professional because it:
• Provides a theoretical foundation for decision making based on data and mathematical analysis.
• Enables the modeling of complex industrial problems involving many variables and constraints.
• Provides tools for designing efficient solutions that save resources in terms of time, labor, and cost.
• Improves a company's competitiveness by optimizing processes and minimizing waste.
Basic concept
Flowchart (management science process):
Observation → Problem definition → Model construction (management science techniques) → Solution → Information → Implementation, with feedback between the stages.
02 Functions
Mathematical Function
Objective function: f(x)
Decision variables: (x1, x2, …, xn)
Constraints

Methods in optimization:
• Graphical methods – require reliable software
• Analytical methods – classical mathematical theory, solved with calculus (differential equations)
• Numerical methods – used when the case study poses big challenges (the function is difficult to differentiate)
Mathematical Function
Mathematical function types in optimization:
In solving mathematical optimization problems, there are three classifications of functions: 1. linear and non-linear, 2. unimodal and multimodal, 3. convex and concave.
(Diagram: Function → Linear or Non-linear; Non-linear → Unimodal (concave up = convex, concave down = concave) or Multimodal.)
Mathematical Function
• Linear and Non-linear
Linear function → ax + b (polynomial of degree 1)
Quadratic function → x² (polynomial of degree 2, non-linear)
Cubic function → x³ (polynomial of degree 3, non-linear)
Exponential function → aˣ (non-linear)
Hyperbolic function → a/x (non-linear)

Cases:
1. 2x − 3 → linear
2. x² + 3 → quadratic (non-linear)
3. 3/x → hyperbolic (non-linear)
Mathematical Function
• Linear and Non-linear
(Plots: exponential and hyperbolic functions.)
Mathematical Function
Non-linear:
• Unimodal (only one extreme point) and multimodal (more than one extreme point)
Mathematical Function
Non-linear → Unimodal
Convex (U) is for minimization and concave (∩) is for maximization
Strictly convex (V) and strictly concave (Λ)
Mathematical Function
Non-linear → Unimodal
Example:
1. f(x) = 3x² → f'(x) = 6x → f''(x) = 6, which is greater than 0 → strictly convex
Mathematical Function
Non-linear → Unimodal
In the range −2 <= x <= 2
Example:
1. f(x) = 2x³ + x² − 10x → f'(x) = 6x² + 2x − 10 → f''(x) = 12x + 2. Substituting the lower and upper bounds of the range, −2 and 2, gives f''(−2) = −22 < 0 and f''(2) = 26 > 0, so f'' changes sign → non-convex (indefinite) on the range.
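The interval test above can also be checked numerically. Below is an illustrative Python sketch (not part of the original slides; the helper names are made up) that approximates f''(x) with central differences and samples the sign across the range:

```python
def second_derivative(f, x, h=1e-5):
    """Central-difference approximation of f''(x)."""
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / h**2

def convexity_on_interval(f, lo, hi, samples=101, eps=1e-3):
    """Classify f on [lo, hi] by the sign of f'' at sample points."""
    signs = set()
    for i in range(samples):
        x = lo + (hi - lo) * i / (samples - 1)
        d2 = second_derivative(f, x)
        if d2 > eps:
            signs.add("+")
        elif d2 < -eps:
            signs.add("-")
    if signs == {"+"}:
        return "convex"
    if signs == {"-"}:
        return "concave"
    return "indefinite (non-convex)"

print(convexity_on_interval(lambda x: 3 * x**2, -2, 2))                  # f'' = 6 > 0 everywhere
print(convexity_on_interval(lambda x: 2 * x**3 + x**2 - 10 * x, -2, 2))  # f'' = 12x + 2 changes sign
```

This reproduces both slide examples: 3x² is convex on the range, while 2x³ + x² − 10x is indefinite.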
Mathematical Modelling
Gradient of a straight line:
m = (y2 − y1) / (x2 − x1) = dy/dx = Δy/Δx
(Figure: a straight line through the points (x1, y1) and (x2, y2).)
Mathematical Modelling
Differential Equation (derivative from first principles):
dy/dx = lim(h→0) Δy/Δx = lim(h→0) [f(x + h) − f(x)] / h

For f(x) = x^n, f'(x) = ?
f'(x) = lim(h→0) [(x + h)^n − x^n] / h = lim(h→0) [x^n + n·x^(n−1)·h + … + h^n − x^n] / h = n·x^(n−1)
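The limit definition can be demonstrated numerically. A small illustrative Python sketch (not from the slides; names are hypothetical) showing the difference quotient approaching n·x^(n−1) for f(x) = x^n:

```python
def forward_difference(f, x, h):
    """Difference quotient (f(x + h) - f(x)) / h from the limit definition."""
    return (f(x + h) - f(x)) / h

n, x = 5, 2.0
f = lambda t: t**n
exact = n * x**(n - 1)   # d/dx x^n = n x^(n-1); here 5 * 2^4 = 80
for h in (1e-1, 1e-3, 1e-6):
    approx = forward_difference(f, x, h)
    print(h, approx, abs(approx - exact))
```

As h shrinks, the quotient converges to the exact derivative, mirroring the h → 0 limit above.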
Ordinary Differential Equations
• Where do ODEs arise?
• Notation and Definitions
• Solution methods for 1st order ODEs

Practice (differentiate each function):
1. f(x) = x⁶ − 5x⁴ + 2x³ − 7x + 11
2. f(x) = 5x⁶ + 2x⁴ − 3x² + 2x
3. f(x) = (4x² − 5x + 1)(x + 2)
4. f(x) = (3x² − 5x)(x³ + 2)
5. f(x) = (x² − 2x + 5) / (x² + 2x − 3)
6. f(x) = 2 / (2x − 3)
7. f(x) = (3x² − 2)⁶
8. f(x) = (x² − 4) / x
9. f(x) = ((6x + 2)²)^(1/3)
10. f(x) = e^(12x − 7)
Order
• The order of a differential equation is just the order of the highest derivative used.
If y is differentiated with respect to x:
dy/dx = 5x + 3 → 1st order
d²y/dx² + dy/dx = 0 → 2nd order
dx/dy = x · d³x/dy³ → 3rd order
Other references write the first, second, third, … derivatives as y', y'', y''', y⁽ⁿ⁾ or f'(x), f''(x), and so on.
Degree
• The degree is the highest power of the highest-order derivative used.
If y is differentiated with respect to x:
dy/dx = 5x + 3 → 1st order, 1st degree
d²y/dx² + dy/dx = 0 → 2nd order, 1st degree
dx/dy = x · d³x/dy³ → 3rd order, 1st degree
Linearity
• The important issue is how the unknown y appears in the equation. A
linear equation involves the dependent variable (y) and its derivatives
by themselves. There must be no "unusual" nonlinear functions of y or
its derivatives.
• A linear equation must have constant coefficients, or coefficients
which depend on the independent variable (t). If y or its derivatives
appear in the coefficient the equation is non-linear.
Linearity - Examples
dy/dt + y = 0 is linear
dx/dt + x² = 0 is non-linear
dy/dt + t² = 0 is linear
y · dy/dt + t² = 0 is non-linear
Linearity – Summary
Linear: 2y | dy/dt | (2 + 3 sin t) y | t · dy/dt
Non-linear: y² or sin(y) | y · dy/dt | (2 − 3y²) y | (dy/dt)²
04 Mathematical Optimization (Classical Optimization Theory) to Solve Non-Linear Programming (NLP)
Mathematical Optimization Classic Theory
Approaches to a Non-Linear Programming Problem (NLPP): graphical, analytical, and numerical.
Note
Optimization is not root finding.
(Figure: a curve marking its maximum, minimum, inflection point, and roots.)
Mathematical Optimization Classic Theory
NLPP without constraint
1 variable
Example:
f(x) = 12x⁵ − 45x⁴ + 40x³ + 5
1. Max or min of the function?
a. Take the first derivative of the function (necessary condition):
f'(x) = 60x⁴ − 180x³ + 120x²
f'(x) = 60x²(x² − 3x + 2)
f'(x) = 60x²(x − 1)(x − 2)
Stationary points → x = 0, x = 1, x = 2
Mathematical Optimization Classic Theory
NLPP without constraint
1 variable
Example:
f(x) = 12x⁵ − 45x⁴ + 40x³ + 5
1. Max or min of the function?
b. Sufficient condition (second derivative):
f'(x) = 60x⁴ − 180x³ + 120x²
f''(x) = 240x³ − 540x² + 240x
f''(1) = −60 < 0 → x = 1 maximizes the initial function f(x); f(1) = 12
f''(2) = 240 > 0 → x = 2 minimizes the initial function f(x); f(2) = −11
f''(0) = 0 → a higher-order test is needed:
f⁽³⁾(x) = 720x² − 1080x + 240 = 60(12x² − 18x + 4)
f⁽³⁾(0) = 240 (n = 3 is odd and the result is non-zero, indicating a saddle/inflection point)
For this specific case: if f''(x*) = 0, differentiate again until f⁽ⁿ⁾(x*) is non-zero. If n is even, then f⁽ⁿ⁾(x*) > 0 → x* minimizes f(x), and f⁽ⁿ⁾(x*) < 0 → x* maximizes f(x). If n is odd, x* is not a max/min but can be defined as a saddle point or inflection point.
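The necessary/sufficient-condition test above can be sketched in code. An illustrative Python version (the derivatives are copied by hand from the slide; the function names are made up):

```python
# Derivatives of f(x) = 12x^5 - 45x^4 + 40x^3 + 5, taken from the slide.
def f(x):  return 12*x**5 - 45*x**4 + 40*x**3 + 5
def f2(x): return 240*x**3 - 540*x**2 + 240*x      # f''(x)
def f3(x): return 720*x**2 - 1080*x + 240          # f'''(x)

def classify(x):
    """Classify a stationary point by the first non-zero higher derivative."""
    d2 = f2(x)
    if d2 > 0:
        return "minimum"
    if d2 < 0:
        return "maximum"
    # f''(x*) = 0: look at the next derivative; n = 3 is odd here,
    # so a non-zero value means a saddle / inflection point.
    return "saddle/inflection" if f3(x) != 0 else "inconclusive"

for xs in (0, 1, 2):
    print(xs, classify(xs), f(xs))
```

This reproduces the slide's conclusions: x = 0 is a saddle/inflection point, x = 1 a maximum with f(1) = 12, and x = 2 a minimum with f(2) = −11.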
Mathematical Optimization Classic Theory
NLPP without constraint
1 variable
Example:
f(x) = 12x⁵ − 45x⁴ + 40x³ + 5
(Plot of f(x) marking the maximum, minimum, and inflection point.)
Source: https://www.wolframalpha.com/
Mathematical Optimization Classic Theory
NLPP without constraint
1 variable
Example:
f(x) = 12x⁵ − 45x⁴ + 40x³ + 5
%matlab code
clear
clc
fplot(@(x)(12*(x.^5)-45*(x.^4)+40*(x.^3)+5), [-0.5 2.4])
xline(0); xline(1); xline(2);
yline(0);
yline(12);
title(''); ylabel('f(x)'); xlabel('x');
Mathematical Optimization Classic Theory
NLPP without constraint
Multivariable
Recap of the single-variable rule: if f''(x*) = 0, differentiate again until f⁽ⁿ⁾(x*) is non-zero. If n is even, then f⁽ⁿ⁾(x*) > 0 → x* minimizes f(x), and f⁽ⁿ⁾(x*) < 0 → x* maximizes f(x). If n is odd, x* is not a max/min but can be defined as a saddle point or inflection point.
Mathematical Optimization Classic Theory
NLPP without constraint
Multivariable
Example:
f(x1, x2) = 3x1² + 2x2² + 4x1·x2 − 6x1 − 8x2 + 6
∇f(x) = [6x1 + 4x2 − 6, 4x1 + 4x2 − 8]ᵀ
Setting ∇f(x) = 0 gives the stationary point x* = [−1, 3].
Hessian matrix:
H = [6 4; 4 4]
Calculate the determinants: det H1 = |h11| = 6 > 0
det H2 = det [h11 h12; h21 h22] = (6 × 4) − (4 × 4) = 8 > 0
The Hessian matrix is positive definite, thus x* = [−1, 3] is a minimum point.
Then, substituting [−1, 3] into the initial function gives f(−1, 3) = −3.
Mathematical Optimization Classic Theory
NLPP without constraint
Multivariable
f(x, y) = 3x² + 2y² + 4xy − 6x − 8y + 6
(Contour plot: contour(X,Y,Z);)
Mathematical Optimization Classic Theory
NLPP without constraint
Multivariable
f(x, y) = 3x² + 2y² + 4xy − 6x − 8y + 6
(Surface plot of the function.)
Mathematical Optimization Classic Theory
NLPP without constraint
Multivariable
f(x, y) = 3x² + 2y² + 4xy − 6x − 8y + 6
%matlab code
%multivariable1
xl=-10; xu=10;
yl=-40; yu=40;
x = linspace(xl,xu);
y = linspace(yl,yu);
[X, Y]=meshgrid(x, y);
Z=(3.*X.^2)+(2.*Y.^2)+(4.*X.*Y)-(6.*X)-(8.*Y)+6;
surf(X, Y, Z); % to add contours, change surf to surfc
xlabel('x'); ylabel('y'); zlabel('z');
set(gca, 'FontSize', 16, 'LineWidth', 2);
Mathematical Optimization Classic Theory
NLPP without constraint
Multivariable
So, from the previous example, the procedure to solve an unconstrained multivariable NLPP is:
1. Define the function f(x1, x2, …, xn)
2. Find the stationary point: ∂f/∂x1 = 0, ∂f/∂x2 = 0
3. Find the Hessian matrix:
H = [∂²f/∂x1², ∂²f/∂x1∂x2; ∂²f/∂x2∂x1, ∂²f/∂x2²] = [h11 h12; h21 h22]
4. Find the principal minors of H:
Δ1 = |h11|; Δ2 = det [h11 h12; h21 h22]; Δ3 = det [h11 h12 h13; h21 h22 h23; h31 h32 h33]
5. If Δ1, Δ2, and Δ3 are all positive, then we get a minimum at the stationary point.
If Δ1 is negative, Δ2 is positive, and Δ3 is negative, then we get a maximum at the stationary point.
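Steps 3 to 5 can be sketched as code. An illustrative pure-Python version (the helper names are invented here), demonstrated on the Hessian from the previous example:

```python
def det(m):
    """Determinant by cofactor expansion (fine for the small Hessians here)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum((-1)**j * m[0][j] * det([row[:j] + row[j+1:] for row in m[1:]])
               for j in range(n))

def classify_hessian(H):
    """Steps 4-5: leading principal minors of H at a stationary point."""
    minors = [det([row[:k] for row in H[:k]]) for k in range(1, len(H) + 1)]
    if all(d > 0 for d in minors):
        return minors, "minimum (positive definite)"
    if all(d > 0 if k % 2 == 0 else d < 0 for k, d in enumerate(minors, 1)):
        return minors, "maximum (negative definite)"
    return minors, "indefinite / inconclusive"

# Hessian of f(x1, x2) = 3x1^2 + 2x2^2 + 4x1x2 - 6x1 - 8x2 + 6 (from the slides)
minors, kind = classify_hessian([[6, 4], [4, 4]])
print(minors, kind)   # [6, 8] minimum (positive definite)
```

The same routine also handles the 3×3 case in step 4, since the determinant is computed recursively.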
Mathematical Optimization Classic Theory
NLPP without constraint
Multivariable
Practice:
1. Find the relative maximum or minimum of the function
f(x1, x2, x3) = x1² + x2² + x3² − 4x1 − 8x2 − 12x3 + 100
2. Optimize:
f(x1, x2, x3) = x1² + x2² + x3² − 6x1 − 8x2 − 10x3
Group homework:
Find and solve 5 practical problems (case studies) for unconstrained NLPP: single-variable (1 problem) and multivariable (4 problems).
Prepare them in PowerPoint. Note: use the MathType software for the mathematical formulations.
Mathematical Optimization Classic Theory
NLPP without constraint
Multivariable
(Worked slides: function, partial derivatives, global minimum.)
Optimization Classic Theory
NLPP with constraint
With Linear Equality
(Lagrangian Multiplier)
Optimization Classic Theory
NLPP with constraint
With Linear Equality (Lagrangian Multiplier)
Case study:
A manufacturing company tries to minimize cost due to complicated economic conditions. Currently, the total cost (TC) function of the manufacturer is:
TC = 5x² − 2x + 6y² + 4.5y − xy
subject to the constraint 2x + 3y = 15.
Optimization Classic Theory
TC = 5x² − 2x + 6y² + 4.5y − xy + λ(2x + 3y − 15)
   = 5x² − 2x + 6y² + 4.5y − xy + 2λx + 3λy − 15λ
∂TC/∂x = 10x − 2 − y + 2λ = 0 (1)
∂TC/∂y = 12y + 4.5 − x + 3λ = 0 (2)
∂TC/∂λ = 2x + 3y − 15 = 0 (3)
Optimization Classic Theory
10 x − y − 2.0 + 2 = 0 | 3 | 30 x − 3 y − 6.0 + 6 = 0
04 − x + 12 y + 4.5 + 3 = 0 | 2 | −2 x + 24 y + 9.0 + 6 = 0 (-)
32 x − 27 y − 15 + 0.0 = 0 (4)
59
Optimization Classic Theory
2 x + 3 y − 15 = 0 | 16 | 32 x + 48 y − 240 = 0
32 x + 27 y − 15 = 0 | 1| 32 x + 27 y − 15 = 0 (-)
75y − 225 = 0
04 75 y = 225
2 x + 3 y − 15 = 0
y = 225 / 75 = 3 2 x + 3(3) − 15 = 0
(substitute y=3 to (3))
2x + 9 − 15 = 0
x=6/2=3
60
Optimization Classic Theory
x = 3, y = 3
TC = 5x² − 2x + 6y² + 4.5y − xy
TC = 5(3²) − 2(3) + 6(3²) + 4.5(3) − (3 × 3)
TC = 97.5
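The three stationarity equations (1)-(3) can be solved mechanically. An illustrative Python sketch (not from the slides; helper names are invented) that solves the linear system exactly with rational arithmetic:

```python
from fractions import Fraction as F

def solve3(A, b):
    """Gauss-Jordan elimination for a 3x3 system, exact with fractions."""
    A = [[F(v) for v in row] + [F(c)] for row, c in zip(A, b)]
    for i in range(3):
        p = next(r for r in range(i, 3) if A[r][i] != 0)   # pivot row
        A[i], A[p] = A[p], A[i]
        A[i] = [v / A[i][i] for v in A[i]]                  # normalize pivot
        for r in range(3):
            if r != i and A[r][i] != 0:
                A[r] = [v - A[r][i] * w for v, w in zip(A[r], A[i])]
    return [row[3] for row in A]

# Stationarity conditions of TC + lambda*(2x + 3y - 15):
#   10x -  y + 2*lam =  2
#   -x + 12y + 3*lam = -4.5
#   2x +  3y         = 15
x, y, lam = solve3([[10, -1, 2], [-1, 12, 3], [2, 3, 0]], [2, F(-9, 2), 15])
TC = 5*x**2 - 2*x + 6*y**2 + F(9, 2)*y - x*y
print(x, y, TC)   # 3 3 195/2
```

This confirms the elimination above: x = 3, y = 3, and TC = 195/2 = 97.5.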
Mathematical
Optimization Classic Theory
NLPP with constraint
With Linear Equality
(Lagrangian Multiplier)
05 Numerical Methods
Mathematical Optimization Classic Theory
Approaches to a Non-Linear Programming Problem (NLPP): graphical, analytical, and numerical, applied to NLPP without constraint and NLPP with constraint.
Mathematical Optimization Classic Theory
Newton – Raphson
The Newton-Raphson method is based on the principle that if the initial guess of the root of f(x) = 0 is at xi, then if one draws the tangent to the curve at f(xi), the point xi+1 where the tangent crosses the x-axis is an improved estimate of the root (Figure 1).
f'(xi) = tan θ = (f(xi) − 0) / (xi − xi+1),
which gives
xi+1 = xi − f(xi) / f'(xi)
(Figure 1: geometric illustration of the Newton-Raphson method.)
Mathematical Optimization Classic Theory
Newton – Raphson
The steps of the Newton-Raphson method to find the root of an equation f(x) = 0 are:
1. Evaluate f'(x) symbolically.
2. Use an initial guess of the root, xi, to estimate the new value of the root, xi+1, as
xi+1 = xi − f(xi) / f'(xi)
3. Find the absolute relative approximate error |εa| as
|εa| = |(xi+1 − xi) / xi+1|
4. Compare the absolute relative approximate error with the pre-specified relative error tolerance εs. If |εa| > εs, go to Step 2; otherwise stop the algorithm. Also, check whether the number of iterations has exceeded the maximum number allowed. If so, terminate the algorithm and notify the user.
Mathematical Optimization Classic Theory
Newton – Raphson: xi+1 = xi − f(xi) / f'(xi)
1 variable
Example:
f(x) = x³ + 2x − 2
f'(x) = 3x² + 2
Let i = 0 → x0 = 1, thus f(x0) = 1, f'(x0) = 5
x1 = 1 − (1/5) = 0.8, thus f(x1) = ?, f'(x1) = ?

i | xi | f(xi) | f'(xi) | xi+1
0 | 1 | 1 | 5 | 0.8
1 | 0.8 | 0.112 | 3.92 | 0.771429
2 | 0.771429 | 0.001936 | 3.785306 | 0.770917
3 | 0.770917 | 6.05E-07 | 3.78294 | 0.770917
4 | 0.770917 | 5.91E-14 | 3.782939 | 0.770917
5 | 0.770917 | 0 | 3.782939 | 0.770917
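The iteration in the table can be reproduced with a short program. An illustrative Python sketch (the function names are made up; the stopping rule follows the relative-error test from the steps above):

```python
def newton_raphson(f, fprime, x0, tol=1e-8, max_iter=50):
    """Newton-Raphson iteration x_{i+1} = x_i - f(x_i)/f'(x_i)."""
    x = x0
    for _ in range(max_iter):
        x_new = x - f(x) / fprime(x)
        # absolute relative approximate error |e_a| from the slides
        if abs((x_new - x) / x_new) < tol:
            return x_new
        x = x_new
    raise RuntimeError("did not converge within max_iter iterations")

# Example from the slide: f(x) = x^3 + 2x - 2, x0 = 1
root = newton_raphson(lambda x: x**3 + 2*x - 2,
                      lambda x: 3*x**2 + 2, 1.0)
print(round(root, 6))   # 0.770917
```

The result matches the table: the iterates 1 → 0.8 → 0.771429 → 0.770917 settle on the root.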
Mathematical Optimization Classic Theory
Golden Section
Case study: Find the minimum of the following function between x = 0 and x = 4,
f(x) = x² − 6x + 15
Mathematical Optimization Classic Theory
Golden Section (continued)
f(x) = x² − 6x + 15
fplot(@(x)((x.^2)-(6.*x)+15), [0.0 10.0]);
Mathematical Optimization Classic Theory
Golden Section
Case study: Find the minimum of the following function between x = 0 and x = 4,
f(x) = x² − 6x + 15

Iteration | a | b | d | x1 | x2 | f(x1) | f(x2)
1 | 0 | 10 | 6.18034 | 6.18034 | 3.81966 | 16.11456 | 6.671843
2 | 0 | 6.18034 | 3.81966 | 3.81966 | 2.36068 | 6.671843 | 6.40873
3 | 0 | 3.81966 | 2.36068 | 2.36068 | 1.45898 | 6.40873 | 8.374742
4 | 1.45898 | 3.81966 | 1.45898 | 2.917961 | 2.36068 | 6.00673 | 6.40873
5 | 2.36068 | 3.81966 | 0.9017 | 3.262379 | 2.917961 | 6.068843 | 6.00673
6 | 2.36068 | 3.262379 | 0.557281 | 2.917961 | 2.705098 | 6.00673 | 6.086967
7 | 2.705098 | 3.262379 | 0.344419 | 3.049517 | 2.917961 | 6.002452 | 6.00673
8 | 2.917961 | 3.262379 | 0.212862 | 3.130823 | 3.049517 | 6.017115 | 6.002452
9 | 2.917961 | 3.130823 | 0.131556 | 3.049517 | 2.999267 | 6.002452 | 6.000001
10 | 2.917961 | 3.049517 | 0.081306 | 2.999267 | 2.968211 | 6.000001 | 6.001011
11 | 2.968211 | 3.049517 | 0.05025 | 3.018461 | 2.999267 | 6.000341 | 6.000001
12 | 2.968211 | 3.018461 | 0.031056 | 2.999267 | 2.987405 | 6.000001 | 6.000159
13 | 2.987405 | 3.018461 | 0.019194 | 3.006598 | 2.999267 | 6.000044 | 6.000001
14 | 2.987405 | 3.006598 | 0.011862 | 2.999267 | 2.994736 | 6.000001 | 6.000028
15 | 2.994736 | 3.006598 | 0.007331 | 3.002067 | 2.999267 | 6.000004 | 6.000001
16 | 2.994736 | 3.002067 | 0.004531 | 2.999267 | 2.997536 | 6.000001 | 6.000006
17 | 2.997536 | 3.002067 | 0.0028 | 3.000337 | 2.999267 | 6 | 6.000001
18 | 2.999267 | 3.002067 | 0.001731 | 3.000998 | 3.000337 | 6.000001 | 6
19 | 2.999267 | 3.000998 | 0.00107 | 3.000337 | 2.999928 | 6 | 6
20 | 2.999267 | 3.000337 | 0.000661 | 2.999928 | 2.999676 | 6 | 6
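The table above can be reproduced with a short routine. An illustrative Python sketch of golden-section search (the helper names are invented; the bracket [0, 10] follows the table):

```python
import math

def golden_section_min(f, a, b, tol=1e-4, max_iter=200):
    """Golden-section search for the minimum of a unimodal f on [a, b]."""
    R = (math.sqrt(5) - 1) / 2           # golden ratio, ~0.618034
    for _ in range(max_iter):
        d = R * (b - a)
        x1, x2 = a + d, b - d            # interior points, x2 < x1
        if f(x1) < f(x2):
            a = x2                       # minimum lies in [x2, b]
        else:
            b = x1                       # minimum lies in [a, x1]
        if b - a < tol:
            break
    return (a + b) / 2

xmin = golden_section_min(lambda x: x**2 - 6*x + 15, 0, 10)
print(xmin)   # converges to ~3, where f(3) = 6
```

The iterates match the table: the interval [a, b] shrinks by the golden ratio each step and closes in on x = 3.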
Mathematical
Optimization Classic Theory
Golden Section
Practice
f(x) = 20x − 2x² + 10
Mathematical
Optimization Classic Theory
Steepest Ascent/Descent
This is an iterative method, also known as the Gradient Ascent/Descent Method.
Steepest ascent → for maximizing a function; steepest descent → for minimizing a function.
It performs a series of transformation steps that change a multivariable function into a single-variable function based on the gradient of the search direction. This optimum-search step is then carried out iteratively until the expected level of convergence is obtained.
It requires reliable software for the iteration process.
Optimum search:
As an illustration, consider a function f(x, y) whose optimum (max/min) point is to be determined, with initial values x = x0 and y = y0. Then, in the first iteration step, the new x and y values can be determined by:
x = x0 + (∂f/∂x)|(x0, y0) · h and y = y0 + (∂f/∂y)|(x0, y0) · h,
where ∂f/∂x and ∂f/∂y are the partial derivatives of the function f(x, y) with respect to x and y.
(NLPP without constraint, multivariable)
Mathematical Optimization Classic Theory
Steepest Ascent/Descent
In this case, the gradient vector of the function can be formulated as:
∇f = (∂f/∂x) i + (∂f/∂y) j
Thus, the function of two variables x and y, f(x, y), is transformed into a function of one variable h, g(h). The values of x and y obtained then become x0 and y0 for the next iteration. Continue the iterations until convergence.
(NLPP without constraint, multivariable)
Mathematical Optimization Classic Theory
Steepest Ascent
Example:
Use steepest ascent to determine the optimal (maximum) point of the function below, with initial values x0 = −1 and y0 = 1:
f(x, y) = 2xy + 2x − x² − 2y²
Steps:
1. Determine the partial derivatives of the function:
∂f/∂x = 2y + 2 − 2x and ∂f/∂y = 2x − 4y
(NLPP without constraint, multivariable)
Mathematical Optimization Classic Theory
Steepest Ascent
Steps (continued): f(x, y) = 2xy + 2x − x² − 2y²
2. For the first iteration with the initial values x0 = −1 and y0 = 1, substitute them into the initial function: f(x, y) = 2(−1)(1) + 2(−1) − (−1)² − 2(1)² = −7
3. Evaluate the partial derivatives at x0 and y0:
∂f/∂x|(x0, y0) = 2(1) + 2 − 2(−1) = 6 and ∂f/∂y|(x0, y0) = 2(−1) − 4(1) = −6
Thus, the gradient vector can be written as: ∇f = (∂f/∂x) i + (∂f/∂y) j = 6i + (−6j)
x = x0 + (∂f/∂x)|(x0, y0) · h = −1 + 6h and y = y0 + (∂f/∂y)|(x0, y0) · h = 1 + (−6h)
Iteration table (iterations 6 to 9):
i | x | y | ∂f/∂x | ∂f/∂y | h | x_new | y_new | f
6 | 1.98 | 1.00 | 0.0480 | −0.0480 | 0.20 | 1.99 | 0.99 | 1.9994
7 | 1.99 | 0.99 | 0.0096 | 0.0096 | 1.00 | 2.00 | 1.00 | 1.9999
8 | 2.00 | 1.00 | 0.0096 | −0.0096 | 0.20 | 2.00 | 1.00 | 2.0000
9 | 2.00 | 1.00 | 0.0019 | 0.0019 | 1.00 | 2.00 | 1.00 | 2.0000
The iterations converge to the maximum at x = 2, y = 1, with f(2, 1) = 2.
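The procedure can be sketched in code. An illustrative Python version (not from the slides; a numeric ternary search over h stands in for the analytic maximization of g(h), and the helper names are made up):

```python
def line_max(phi, lo, hi, iters=100):
    """Ternary search for the maximizer of a unimodal phi on [lo, hi]."""
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if phi(m1) < phi(m2):
            lo = m1
        else:
            hi = m2
    return (lo + hi) / 2

def steepest_ascent(f, grad_f, x, y, iters=20):
    """Each step maximizes g(h) = f(x + h*df/dx, y + h*df/dy) over h."""
    for _ in range(iters):
        gx, gy = grad_f(x, y)
        h = line_max(lambda t: f(x + gx * t, y + gy * t), 0.0, 1.5)
        x, y = x + gx * h, y + gy * h
    return x, y

f = lambda x, y: 2*x*y + 2*x - x**2 - 2*y**2
grad = lambda x, y: (2*y + 2 - 2*x, 2*x - 4*y)   # partial derivatives from the slide
x, y = steepest_ascent(f, grad, -1.0, 1.0)
print(round(x, 4), round(y, 4), round(f(x, y), 4))
```

Starting from (−1, 1), the step sizes alternate between roughly 0.2 and 1.0 as in the table, and the iterates converge to the maximum at (2, 1) with f = 2.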
Practice:
f(x, y) = (1/3)x² + (1/3)y² + (1/5)xy
with the initial values x0 = −4 and y0 = 4
Group homework:
Find and solve 5 practical problems (case studies) for NLPP with each of the numerical methods above (for each numerical approach).
Prepare them in PowerPoint. Note: use the MathType software for the mathematical formulations.
Mathematical
Optimization Theory
• ….
• Bayesian
• Quadratic Programming
• Sequential Quadratic Programming (SQP)
• Etc…
Modern optimization methods:
❑ Genetic Algorithm
❑ Simulated Annealing
❑ Particle Swarm Optimization
❑ Ant Colony Optimization
❑ Optimization of Fuzzy Systems
❑ Neural Network Based Optimization
❑ Bee Colony Optimization
06 Example research
Publications
Adopting SQP (Sequential Quadratic Programming) to solve the non-linear mathematical optimization model.
Wu, Chien-Wei, Darmawan, A., & Liu, Shih-Wen*. (2023). Stage-independent multiple sampling plan by variables
inspection for lot determination based on the process capability index Cpk. International Journal of Production Research,
61(10), 3171-3183. https://doi.org/10.1080/00207543.2022.2078745 (SCI & Scopus indexed)
Link pdf: https://drive.google.com/file/d/1-cLO66_Ld6DQdFXu5tfi1DlPDSAoX7jy/view?usp=sharing
Wu, Chien-Wei*, Darmawan, A. (2023). A Modified Sampling Scheme for Lot Sentencing Based on the Third-Generation
Capability Index. Annals of Operations Research. https://doi.org/10.1007/s10479-023-05328-z (SCI & Scopus indexed)
Link pdf: https://drive.google.com/file/d/1t3wytmZoESKNRJRjEOf2wTK1byit-GhZ/view?usp=sharing
Wu, Chien-Wei*, Darmawan, A., & Liu, Shih-Wen. (2024). Developing a Stage-Independent Multiple Sampling Plan (SIMSP) Loss-based Capability Index for Lot Disposition. Journal of the Operational Research Society. (SCI & Scopus indexed)
Link: https://www.tandfonline.com/doi/abs/10.1080/01605682.2024.2363264
Darmawan, A., Wu, C. W., Wang, Z. H., & Chiang, P. J. (2024). Developing variables two-plan sampling scheme with
consideration of process loss for lot sentencing. Quality Engineering, 37(2), 273–291.
https://doi.org/10.1080/08982112.2024.2381012
Thank you