
E7 – Lecture Series

Lecture 26 Final Review

Fall 2021
Instructor: Shaofan Li
Final Exam Logistics

• Examination time: Thursday (December 16th) afternoon, 3:00 pm – 6:00 pm;
• Students with DSP accommodations: 3:00 pm – 9:00 pm;
• There will be no make-up exam;
• The exam will be conducted on Zoom;
• 50 quiz questions;
• There will be five Zoom sessions:
  Session 1: Lab 013 + Lab 014
  Session 2: Lab 016 + Lab 017
  Session 3: Lab 018 + Lab 019 + Lab 020
  Session 4: Lab 021 + Lab 022
  Session 5: Students with disabilities
My Office Hour

Monday (Dec. 6th) 5:30pm-7:30pm


Wednesday(Dec. 8th) 5:30pm-7:30pm

The Zoom link is available on bCourses.


GSI RRR Week Office Hour Schedule

1. Qi Zheng: Monday (Dec. 6th), 3 pm – 5 pm. https://berkeley.zoom.us/j/94116411601
2. Dennis Chiu: Wednesday (Dec. 8th), 10 am – 12 pm. https://berkeley.zoom.us/j/3838592491
3. Zhiyong Jiang: Monday (Dec. 6th), 4 pm – 6 pm. https://berkeley.zoom.us/j/97960411999?pwd=dVJIU2ExNjlJWDA3STFOMXl6amF2Zz09
4. Chao Wang: Thursday (Dec. 9th), 4 pm – 6 pm. https://berkeley.zoom.us/j/95603951439
5. Sania Khan: Tuesday (Dec. 7th), 4 pm – 6 pm. https://berkeley.zoom.us/j/97283953545
6. Abigayle Hodson: Tuesday (Dec. 7th), 12 pm – 2 pm. https://berkeley.zoom.us/j/91609267790?pwd=bUxSMGd5S1I1Z3BUR3JNMnpVRWtaZz09
7. Qijun Chen: Wednesday (Dec. 8th), 3 pm – 5 pm. https://berkeley.zoom.us/j/2584194988
8. Sasha Frondeville: Tuesday (Dec. 7th), 10 am – 12 pm. https://berkeley.zoom.us/j/97494497379?pwd=N0pucm1xUU5pNWhqcEo3K0puTlFFUT09
9. Candace Yee: Monday (Dec. 6th), 12 pm – 2 pm. https://berkeley.zoom.us/j/3358509866
10. Johannes Yu: Friday (Dec. 10th), 12 pm – 1 pm. https://berkeley.zoom.us/j/97283953545
11. German Perea Lopez: Monday (Dec. 6th), 10 am – 12 pm. https://berkeley.zoom.us/j/94001979907 (Meeting ID: 940 0197 9907)
12. Gaofeng Su: Monday (Dec. 6th), 12 pm – 2 pm. https://berkeley.zoom.us/j/3358509866

RRR Week Problem Solving Session Schedule

Session 1. Caglar Tamur: Monday (Dec. 6th), 1:00 pm – 2:30 pm, 502 Davis Hall;
Session 2. Johannes Yu: Monday (Dec. 6th), 3:30 pm – 5:00 pm, 502 Davis Hall;
Session 3. Shaofan Li: Wednesday (Dec. 8th), 1:00 pm – 3:00 pm, 502 Davis Hall;
Session 4. Caglar Tamur: Wednesday (Dec. 8th), 3:30 pm – 5:00 pm, 502 Davis Hall.
Final Exam Coverage
Before the midterm: 40%; after the midterm: 60%.

1. Root finding:
Bisection method and Newton-Raphson method (single variable only);
their convergence rates, advantages, and shortcomings.
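The two root-finding methods above can be sketched as follows. This is a minimal illustration in Python (the course itself uses MATLAB); the test function f(x) = x² − 2, whose positive root is √2, and the tolerances are my own choices, not from the lecture.

```python
# Bisection: halve a sign-change bracket [a, b]; converges linearly.
# Newton-Raphson: x <- x - f(x)/f'(x); converges quadratically near a simple root.

def bisection(f, a, b, tol=1e-10):
    fa = f(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        if fa * f(m) <= 0:      # root remains in [a, m]
            b = m
        else:                   # root remains in [m, b]
            a, fa = m, f(m)
    return 0.5 * (a + b)

def newton(f, dfdx, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = f(x) / dfdx(x)
        x -= step
        if abs(step) < tol:
            break
    return x

f = lambda x: x**2 - 2
root_bis = bisection(f, 1.0, 2.0)
root_newton = newton(f, lambda x: 2 * x, 1.0)
```

Note the trade-off the coverage list asks about: bisection always converges once a bracket is found but is slow; Newton-Raphson is fast but needs the derivative and a good starting guess.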
2. Linear algebra:
Vector norms; matrix multiplication; linear independence; rank of a matrix;
linear systems of equations: under-determined and over-determined systems, the condition that a
linear system has a unique solution, and the pseudoinverse for an over-determined system (no derivations).

3. Regression:
Two-set data regression only. Know how the pseudoinverse of an over-determined system
is derived from the least-squares minimization, but there is no need to memorize the derivation.
Know how to use polyfit. (No nonlinear regression.)

4. Interpolation:
Linear interpolation, polynomial interpolation, and Lagrangian interpolation
(no cubic interpolation). Know how to use polyval.
5. Gradient descent:
Formulas of gradient descent and ascent and their meanings.

6. Finite Difference Methods:
Forward difference, backward difference, and central difference algorithms,
and their error estimates expressed using Big-O notation.
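The three difference formulas can be sketched as below. A small Python illustration (the course uses MATLAB); the test function sin(x), the point x = 1, and the step h are my own choices. Forward and backward differences are O(h), the central difference is O(h²), so its error should be far smaller for the same h.

```python
# Finite-difference approximations of f'(x), tested against cos(1).
import math

def forward_diff(f, x, h):   # (f(x+h) - f(x)) / h, error O(h)
    return (f(x + h) - f(x)) / h

def backward_diff(f, x, h):  # (f(x) - f(x-h)) / h, error O(h)
    return (f(x) - f(x - h)) / h

def central_diff(f, x, h):   # (f(x+h) - f(x-h)) / (2h), error O(h^2)
    return (f(x + h) - f(x - h)) / (2 * h)

x, h = 1.0, 1e-4
exact = math.cos(x)
err_forward = abs(forward_diff(math.sin, x, h) - exact)
err_backward = abs(backward_diff(math.sin, x, h) - exact)
err_central = abs(central_diff(math.sin, x, h) - exact)
```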

7. Numerical Integration Methods:
Left and right Riemann methods, the midpoint rule, the trapezoidal rule, and Simpson's 1/3 rule;
their error estimates in terms of Big-O notation.
Know the derivations of the left and right Riemann methods and the midpoint rule.
Simple-mean Monte Carlo integration algorithm and its error estimate in terms of
Big-O notation (no derivation).
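The composite versions of three of the rules above can be sketched as follows. This is a hedged Python illustration (the course uses MATLAB); the integrand x² on [0, 1], whose exact integral is 1/3, and the segment count n are my own choices. Since Simpson's 1/3 rule is exact for quadratics, it should hit the answer to machine precision, while the midpoint error is about half the trapezoidal error with the opposite sign.

```python
# Composite midpoint, trapezoidal, and Simpson's 1/3 rules over [a, b] with n segments.

def midpoint_rule(f, a, b, n):
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

def trapezoidal_rule(f, a, b, n):
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

def simpson_rule(f, a, b, n):
    """Composite Simpson's 1/3 rule; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

f = lambda x: x * x
mid = midpoint_rule(f, 0.0, 1.0, 100)
trap = trapezoidal_rule(f, 0.0, 1.0, 100)
simp = simpson_rule(f, 0.0, 1.0, 100)
```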

8. Numerical Methods for ODEs:
Forward Euler (explicit), backward Euler (implicit), central difference, and the second-order
and fourth-order Runge-Kutta methods, and their convergence rates in terms of Big-O notation.
Know how to use ode45.
Review Previous Lesson Before Class Starts

Augmented matrix: for the system A(m×n) x(n×1) = y(m×1), the augmented matrix [A, y] is
'matrix A concatenated with vector y'.

1) rank([A, y]) ≠ rank(A): No solution (overdetermined, inconsistent system).
   x = A\y and x = pinv(A)*y return the least-squares 'pseudoinverse' solution;
   x = inv(A)*y does not apply.

2) rank([A, y]) = rank(A) and rank(A) = n: Unique solution.
   x = A\y, x = pinv(A)*y, or (for a square A) x = inv(A)*y.

3) rank([A, y]) = rank(A) and rank(A) < n: Infinitely many solutions (underdetermined system).
   x = A\y gives 'a solution'; x = pinv(A)*y gives the minimum-norm 'pseudoinverse' solution.
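The three rank cases above can be sketched in code. This is a Python/NumPy illustration (the course uses MATLAB): np.linalg.lstsq or pinv plays the role of A\y and pinv(A), and np.linalg.solve the role of inv(A)*y for a square system. The three small example systems are my own choices, not from the lecture.

```python
# Classify A x = y by comparing rank(A) with rank([A, y]) and with n.
import numpy as np

def classify(A, y):
    rA = np.linalg.matrix_rank(A)
    rAy = np.linalg.matrix_rank(np.column_stack([A, y]))
    n = A.shape[1]
    if rAy != rA:
        return "no solution"                 # inconsistent (overdetermined)
    return "unique" if rA == n else "infinitely many"

# 1) Inconsistent overdetermined system: x = 1 and x = 3 cannot both hold.
A1, y1 = np.array([[1.0], [1.0]]), np.array([1.0, 3.0])
case1 = classify(A1, y1)
x1 = np.linalg.pinv(A1) @ y1                 # least-squares answer: x = 2

# 2) Square full-rank system: unique solution.
A2, y2 = np.array([[2.0, 0.0], [0.0, 3.0]]), np.array([4.0, 9.0])
case2 = classify(A2, y2)
x2 = np.linalg.solve(A2, y2)                 # x = [2, 3]

# 3) Underdetermined system x + y = 2: pinv picks the minimum-norm solution.
A3, y3 = np.array([[1.0, 1.0]]), np.array([2.0])
case3 = classify(A3, y3)
x3 = np.linalg.pinv(A3) @ y3                 # minimum-norm solution [1, 1]
```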
How will we test linear algebra in the final exam?
Independent columns
Independent rows
This is how we shall test linear algebra in the final exam!
Gradient Descent
• An optimization algorithm to find the minimum of an objective function.
• Applicable to many engineering problems.
• The most used optimization algorithm in Machine Learning!
  – Used to choose model parameters by minimizing the cost (error) function.

The objective (loss) function f(x, y) decreases fastest in the direction opposite to its gradient.
Start from an initial guess and move in the opposite direction of the gradient:

x_{n+1} = x_n − γ ∇f(x_n)

∇f(x) = gradient of the objective function
γ = learning rate (step size)
Multivariable Calculus Revisited: Gradient

∇f = (∂f/∂x) i + (∂f/∂y) j

Example: Calculate the gradient to determine the direction of the steepest slope
at the point (2, 1) for the function f(x, y) = x²y².

Solution: To calculate the gradient we need to evaluate the partial derivatives

∂f/∂x = 2xy² = 2(2)(1)² = 4,   ∂f/∂y = 2x²y = 2(2)²(1) = 8,

which determine the gradient at the point (2, 1) as

∇f = 4i + 8j.
How to choose the learning rate?

Example
Determine the minimum of the function

f(x, y) = x² + y² + 2x + 4.

Use the point (2, 1) as the initial estimate of the optimal solution, choosing each
step size by an exact line search along the negative gradient: x_{n+1} = x_n − γ ∇f(x_n).

Iteration 1: To calculate the gradient, the partial derivatives are evaluated as

∂f/∂x = 2x + 2 = 2(2) + 2 = 6,   ∂f/∂y = 2y = 2(1) = 2,

∇f = 6i + 2j.

The function f(x, y) can now be expressed along the direction of the negative gradient as

f(x₀ − h ∂f/∂x, y₀ − h ∂f/∂y) = f(2 − 6h, 1 − 2h) = (2 − 6h)² + (1 − 2h)² + 2(2 − 6h) + 4,

g(h) = 40h² − 40h + 13.

This is a simple quadratic, and its minimizer is found by setting the first derivative
to zero: g'(h) = 80h − 40 = 0, so h* = 0.5. This means that traveling a step size of
h = 0.5 along the negative gradient reaches the minimum value of the function in this
direction. Substituting back gives new values for x and y:

x = 2 − 6(0.5) = −1
y = 1 − 2(0.5) = 0

Note that f(2, 1) = 13 while f(−1, 0) = 3.

Iteration 2: The new initial point is (−1, 0). We calculate the gradient at this point as

∂f/∂x = 2x + 2 = 2(−1) + 2 = 0,   ∂f/∂y = 2y = 2(0) = 0,

∇f = 0i + 0j.

This indicates that the current location is a local optimum along this gradient and no
improvement can be gained by moving in any direction. The minimum of the function is at
the point (−1, 0).

(For this quadratic the first exact line search happens to land on the minimizer; with a
fixed learning rate γ the iterates would instead approach (−1, 0) gradually.)
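The same minimization can be sketched with a fixed learning rate instead of a line search. A minimal Python illustration (the course uses MATLAB) on the example function f(x, y) = x² + y² + 2x + 4, starting from (2, 1); the rate γ = 0.1 and the step count are my own choices, not from the lecture.

```python
# Fixed-step gradient descent on f(x, y) = x^2 + y^2 + 2x + 4, minimum at (-1, 0).

def grad_f(x, y):
    """Gradient of f: (df/dx, df/dy) = (2x + 2, 2y)."""
    return (2 * x + 2, 2 * y)

def gradient_descent(x, y, gamma=0.1, steps=200):
    for _ in range(steps):
        gx, gy = grad_f(x, y)
        x, y = x - gamma * gx, y - gamma * gy   # move against the gradient
    return x, y

xmin, ymin = gradient_descent(2.0, 1.0)
fmin = xmin**2 + ymin**2 + 2 * xmin + 4
```

With γ = 0.1 each coordinate error shrinks by a factor 0.8 per step, so the iterates converge to (−1, 0) where f = 3; a γ that is too large would make the iteration diverge, which is the point of the "how to choose the learning rate" question.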
Euler's Method
Where is the slope f(x_i, y_i) assumed to be valid in Euler's method?

y_{i+1} = y_i + f(x_i, y_i) h

A) At the initial point
B) At the end point
C) At the midpoint
D) Over the whole interval
Consider this fourth-order Runge-Kutta formula.

Which method that you learned is this equivalent to?

A) Euler method
B) Predictor-corrector method (Heun)
C) Midpoint method
D) Simpson's method
E) I'm not sure
Simpson's Rule
Approximate the area by fitting a quadratic polynomial through 3 consecutive data points
x0, x1, x2, spaced h apart:

∫ from x0 to x2 of f(x) dx ≈ (h/3) [f(x0) + 4 f(x1) + f(x2)]

To approximate the integral over (a, b), divide the interval into segments of equal width.
Derivation of the Fourth-Order Runge-Kutta (RK4) Method
We use Simpson's 1/3 rule to integrate this equation:

y_{i+1} = y_i + (h/6) (k1 + 2 k2 + 2 k3 + k4)

k1 = f(t_i, y_i)
k2 = f(t_i + h/2, y_i + (h/2) k1)
k3 = f(t_i + h/2, y_i + (h/2) k2)
k4 = f(t_i + h, y_i + h k3)
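The RK4 update above can be sketched directly in code. A minimal Python illustration (the course uses MATLAB's ode45 for this); the test problem y' = y with y(0) = 1, whose exact solution is e^t, and the step count are my own choices, not from the lecture.

```python
# One RK4 step and a fixed-step driver, tested on y' = y, y(0) = 1, so y(1) = e.
import math

def rk4_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)   # Simpson-like weighting

def solve(f, t0, y0, t_end, n):
    h = (t_end - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y = rk4_step(f, t, y, h)
        t += h
    return y

y_num = solve(lambda t, y: y, 0.0, 1.0, 1.0, 100)    # approximates y(1) = e
```

With global error O(h^4), 100 steps already reproduce e to far better than single-precision accuracy, which is why RK4 is the workhorse explicit method in the coverage list.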
Welcome to C30/ME85 Spring 2022
Introduction to Solid Mechanics

Shaofan Li
University of California at Berkeley
Your pain has a purpose. Your problems, struggles, headaches, and hassles
cooperate towards one end --- mastery of computer programming, numerical analysis,
and a letter Grade A in this class.
Mathematical Principles of Equilibrium
Subjects that we shall learn: truss structures, machines, examples of structures, and strains.
Questions?
