E7 2021 Lecture26
Fall 2021
Instructor: Shaofan Li
Final Exam Logistics
10. Johannes Yu's Office Hour: Friday (Dec. 10th), 12 pm – 1 pm
https://berkeley.zoom.us/j/97283953545
11. German Perea Lopez's Office Hour: Monday (Dec. 6th), 10 am – 12 pm
https://berkeley.zoom.us/j/94001979907
Meeting ID: 940 0197 9907
12. Gaofeng Su's Office Hour: Monday (Dec. 6th), 12 pm – 2 pm
https://berkeley.zoom.us/j/3358509866
GSI RR Week Office Hour Schedule
Session 1. Caglar Tamur: Monday (Dec. 6th), 1:00 pm – 2:30 pm, 502 Davis Hall
Session 4. Caglar Tamur: Wednesday (Dec. 8th), 3:30 pm – 5:00 pm, 502 Davis Hall
Final Exam Coverage
Material before the midterm: 40%; material after the midterm: 60%.
1. Root finding:
Bisection method and Newton-Raphson method (single variable only).
Their convergence rates, advantages, and shortcomings.
2. Linear algebra:
Vector norms; Matrix multiplication; Linear independence, rank of a matrix;
Linear systems of equations: under-determined, over-determined, the condition under which a linear system has a unique solution, and the pseudoinverse for over-determined systems (no derivations).
3. Regression
Two-data-set (x, y) regression only. Know how the pseudoinverse of an over-determined system is derived from least-squares minimization, but there is no need to memorize the derivation.
Know how to use polyfit. (No nonlinear regression).
4. Interpolation
Linear interpolation, polynomial interpolation, and Lagrange interpolation
(no cubic interpolation). Know how to use polyval.
5. Gradient descent
Formulas of gradient descent and ascent and their meanings.
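Item 1 above can be sketched in code. The course works in MATLAB, but as an illustration here is a minimal Python version of both root-finding methods; the test function f(x) = x² − 2 and the tolerances are my own choices, not from the lecture:

```python
def bisection(f, a, b, tol=1e-10):
    """Halve the bracket [a, b] until it is shorter than tol.
    Converges linearly: the bracket width is cut in half each step."""
    fa = f(a)
    while (b - a) > tol:
        m = 0.5 * (a + b)
        if fa * f(m) <= 0:
            b = m            # sign change in [a, m]: root is there
        else:
            a, fa = m, f(m)  # otherwise the root is in [m, b]
    return 0.5 * (a + b)

def newton(f, dfdx, x0, tol=1e-10, max_iter=50):
    """Newton-Raphson: x_{n+1} = x_n - f(x_n)/f'(x_n).
    Converges quadratically near a simple root, but needs the
    derivative and a good initial guess."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / dfdx(x)
        x -= step
        if abs(step) < tol:
            break
    return x

f = lambda x: x**2 - 2          # root is sqrt(2)
root_b = bisection(f, 0.0, 2.0)
root_n = newton(f, lambda x: 2*x, 1.0)
```

Bisection needs only a sign-changing bracket but is slow; Newton-Raphson is fast near the root but can diverge from a poor initial guess, which is exactly the trade-off listed under item 1.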
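Items 3 and 4 mention MATLAB's polyfit and polyval; NumPy provides functions with the same names. A minimal sketch that also shows the pseudoinverse view of the fit (the sample data are invented for illustration):

```python
import numpy as np

# Exact samples of y = 2x + 1; a degree-1 polyfit (least-squares
# line) should recover the coefficients exactly.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0

c = np.polyfit(x, y, 1)         # least-squares fit -> [slope, intercept]
y_at_half = np.polyval(c, 0.5)  # evaluate fitted polynomial at x = 0.5

# The same fit written out via the pseudoinverse of the
# over-determined system A c = y, mathematically equivalent to
# the least-squares problem polyfit solves:
A = np.vstack([x, np.ones_like(x)]).T
c_pinv = np.linalg.pinv(A) @ y
```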
For the linear system A x = y:
1) Unique solution: rank(A) = rank([A, y]) = n
   x = inv(A)*y, or equivalently x = A\y
2) Overdetermined system: no exact solution in general; the 'pseudoinverse' gives the least-squares solution
   x = pinv(A)*y
3) Underdetermined system: the 'pseudoinverse' gives a solution
   x = pinv(A)*y
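The cases above can be checked numerically. The lecture uses MATLAB; here is an equivalent NumPy sketch, with toy matrices of my own choosing:

```python
import numpy as np

# Unique solution: square A with rank(A) = n  (MATLAB: x = A\y)
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
y = np.array([3.0, 5.0])
x_unique = np.linalg.solve(A, y)

# Consistency check: rank([A, y]) == rank(A) == n  <=>  unique solution
rank_A = np.linalg.matrix_rank(A)
rank_Ay = np.linalg.matrix_rank(np.column_stack([A, y]))

# Overdetermined (3 equations, 2 unknowns): no exact solution in
# general, so take the least-squares answer  (MATLAB: pinv(A)*y)
A_over = np.array([[1.0, 0.0],
                   [0.0, 1.0],
                   [1.0, 1.0]])
y_over = np.array([1.0, 1.0, 3.0])
x_ls = np.linalg.pinv(A_over) @ y_over
```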
How will we test linear algebra on the final exam?
Independent columns
Independent rows
This is how we shall test linear algebra on the final exam!
Gradient Descent
• An optimization algorithm to find the minimum of an objective function.
• Applicable to many engineering problems
• Most used optimization algorithm in Machine Learning!
– Used to choose model parameters by minimizing the cost (error) function
The objective (loss) function f(x, y) increases fastest in the direction of its gradient, so it decreases fastest in the opposite direction.
Start from an initial guess and move in the opposite direction of the gradient.
For example, for f(x, y) = x²y² at the point (2, 1):
∂f/∂x = 2xy² = 2(2)(1)² = 4,  ∂f/∂y = 2x²y = 2(2)²(1) = 8
∇f = 4i + 8j
How do we choose the learning rate?
Example
Determine the minimum of the function
f(x, y) = x² + y² + 2x + 4
Use the point (2, 1) as the initial estimate of the optimal solution.
Solution: Start from the initial point (2, 1) and iterate x_{n+1} = x_n − γ ∇f(x_n).
Iteration 1: To calculate the gradient, the partial derivatives must be evaluated as
∂f/∂x = 2x + 2 = 2(2) + 2 = 6,  ∂f/∂y = 2y = 2(1) = 2
∇f = 6i + 2j
Now the function f(x, y) can be expressed along the direction of the gradient as
g(h) = f(x₀ − h ∂f/∂x, y₀ − h ∂f/∂y) = f(2 − 6h, 1 − 2h) = (2 − 6h)² + (1 − 2h)² + 2(2 − 6h) + 4
g(h) = 40h² − 40h + 13
Iteration 1 continued:
This is a simple function, and its minimizer is easy to determine by setting the first derivative to zero: g′(h) = 80h − 40 = 0 gives h* = 0.5.
x = 2 − 6(0.5) = −1
y = 1 − 2(0.5) = 0
Iteration 2: Evaluating the gradient at the new point,
∂f/∂x = 2x + 2 = 2(−1) + 2 = 0,  ∂f/∂y = 2y = 2(0) = 0
∇f = 0i + 0j
The gradient vanishes, so the iteration has converged to the minimum at (−1, 0). Since f(x, y) = (x + 1)² + y² + 3 has circular contours, steepest descent with an exact line search reaches the minimum in a single step from any starting point.
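The update rule from this example can also be run with a fixed learning rate instead of an exact line search. A minimal Python sketch (the lecture uses MATLAB; the learning rate 0.1 and the iteration count are my own choices):

```python
import numpy as np

def grad_f(p):
    """Gradient of f(x, y) = x^2 + y^2 + 2x + 4."""
    x, y = p
    return np.array([2*x + 2, 2*y])

p = np.array([2.0, 1.0])  # initial guess from the example
gamma = 0.1               # fixed learning rate (my choice)
for _ in range(200):
    p = p - gamma * grad_f(p)  # x_{n+1} = x_n - gamma * grad f(x_n)
```

With a fixed rate the iterates approach the minimizer (−1, 0) geometrically rather than in one exact-line-search step, which illustrates why the choice of learning rate matters.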
Which method is this?  y_{i+1} = y_i + f(x_i, y_i) h
A) Euler method
B) Predictor-corrector method (Heun)
C) Midpoint method
D) Simpson’s method
E) I’m not sure
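For reference, the update in the quiz is the forward Euler formula (answer A). A minimal Python sketch; the test problem y′ = y with y(0) = 1 and the step sizes are my own choices:

```python
import math

def euler(f, x0, y0, h, n):
    """Forward Euler: y_{i+1} = y_i + f(x_i, y_i) * h."""
    x, y = x0, y0
    for _ in range(n):
        y = y + f(x, y) * h
        x = x + h
    return y

# Solve y' = y, y(0) = 1 on [0, 1]; the exact answer is e.
err_coarse = abs(euler(lambda x, y: y, 0.0, 1.0, 0.1, 10) - math.e)
err_fine = abs(euler(lambda x, y: y, 0.0, 1.0, 0.01, 100) - math.e)
# Reducing h by a factor of 10 cuts the error by roughly 10x,
# the signature of a first-order method.
```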
Simpson’s Rule
Approximate the area by fitting quadratic polynomials through 3 consecutive data points (x₀, x₁, x₂).
To approximate the integral over (a, b), divide the interval into segments of equal width.
Derivation of the Fourth-Order Runge-Kutta (RK4) Method
We use Simpson’s 1/3 rule to integrate the ODE y′ = f(t, y) over one step, which gives
y_{i+1} = y_i + (h/6)(k₁ + 2k₂ + 2k₃ + k₄),  where k₁ = f(t_i, y_i).
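A minimal Python sketch of one RK4 step as in the formula above, using the standard intermediate slopes k₁ through k₄; the test problem y′ = y is my own choice:

```python
import math

def rk4_step(f, t, y, h):
    """One RK4 step: a Simpson-style weighted average of four slopes."""
    k1 = f(t, y)                      # slope at the start of the step
    k2 = f(t + h/2, y + h*k1/2)       # slope at the midpoint, using k1
    k3 = f(t + h/2, y + h*k2/2)       # slope at the midpoint, using k2
    k4 = f(t + h, y + h*k3)           # slope at the end of the step
    return y + (h/6) * (k1 + 2*k2 + 2*k3 + k4)

# Integrate y' = y from y(0) = 1 up to t = 1; the exact answer is e.
y, t, h = 1.0, 0.0, 0.1
for _ in range(10):
    y = rk4_step(lambda t, y: y, t, y, h)
    t += h
```

Note the Simpson weights 1, 2+2, 1 on the four slopes, mirroring the 1, 4, 1 weights of Simpson's 1/3 rule with the midpoint slope split between k₂ and k₃.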
Shaofan Li
University of California at Berkeley
Your pain has a purpose. Your problems, struggles, headaches, and hassles cooperate toward one end: mastery of computer programming, numerical analysis, and a letter grade of A in this class.
Mathematical Principles of Equilibrium
Subjects that we shall learn: truss structures, machines, strains.
Examples of Structures
Questions?