
CIEM5810

Engineering Risk, Reliability and Decision


Instructor: Prof. Anthony Leung

Topic 4: Monte Carlo simulation


Introduction

Let y = g(x) denote a performance function, where x is a vector of random variables. To evaluate the probability of y < 0, a procedure may be:

1. Draw n samples of x based on its PDF, denoted x_1, x_2, …, x_n
2. Evaluate y at these points, obtaining n samples of y
3. Suppose that, out of the n samples of y, there are n_f samples with values less than zero. The probability of y < 0 is then n_f / n

This is called Monte Carlo Simulation (MCS), which is powerful for reliability analysis. The key is how to draw samples of x based on its PDF.

A key limitation of MCS is that it may require a large number of evaluations of g(x) and is hence computationally expensive. This can be improved by importance sampling.
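A minimal Python sketch of the three-step procedure (the lecture's own implementations are in Excel/Matlab; here the performance function and distributions are borrowed from Example 9 later in the notes, taking the second parameter of N(·,·) as the standard deviation):

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # Illustrative performance function g(x) = x1*x2 - x3 (from Example 9)
    return x[:, 0] * x[:, 1] - x[:, 2]

n = 100_000
# Step 1: draw n samples of x from its (assumed) joint PDF
x = rng.normal(loc=[10.0, 3.0, 25.0], scale=[2.0, 2.0, 6.0], size=(n, 3))
# Step 2: evaluate y = g(x) at the sampled points
y = g(x)
# Step 3: P(y < 0) is estimated by the fraction of failed samples, n_f / n
p_f = np.mean(y < 0)
print(p_f)
```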
Monte Carlo Simulation

Generation of random variables

1. Inverse transformation algorithm – a CDF-based method


2. Acceptance-rejection algorithm – a PDF-based method
3. Markov Chain Monte Carlo to aid the A-R algorithm
Inverse transformation

Let:

1. F(x) denote the cumulative distribution function (CDF) of x
2. u be uniformly distributed between 0 and 1
3. u_i denote a sample of u and x_i denote a sample of x; then

x_i = F^{-1}(u_i)

Limitation: it is a CDF-based method; when x_i = F^{-1}(u_i) is difficult to calculate, it is hard to implement.
Example 1

Suppose the crack width of a soil follows the exponential distribution with CDF:

F_X(x) = 1 − exp(−λx)

where λ = 0.2 mm⁻¹. Generate samples of x using the inverse transformation.

Excel implementation

Solution:
1. Generate 10 random samples of u by rand()
2. Calculate 10 samples of x from the inverse CDF:
   u = F_X(x) = 1 − exp(−λx)  ⇒  x = F_X⁻¹(u) = −ln(1 − u)/λ
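A Python sketch of the same two steps (rand() is replaced by a uniform random generator; the inverse CDF is the closed-form expression above):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 0.2                    # rate parameter, 1/mm (from Example 1)

u = rng.uniform(size=10)     # Step 1: 10 samples of u ~ U(0, 1), like rand()
x = -np.log(1.0 - u) / lam   # Step 2: x = F^{-1}(u) = -ln(1 - u)/lambda
print(x)                     # simulated crack widths in mm
```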
Example 1 – Results

[Figure: simulated crack width samples]
Example 2

10 samples are generated for a normal distribution with μ = 10 and σ = 5:

1. Generate 10 random samples of u by rand()
2. Calculate 10 samples of x by the Excel function norminv, which is the inverse of the CDF of the normal distribution
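The same idea in Python, with scipy's norm.ppf playing the role of Excel's norminv (a sketch with μ = 10 and σ = 5 as above):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

u = rng.uniform(size=10)          # 10 samples of u, as with rand()
x = norm.ppf(u, loc=10, scale=5)  # inverse normal CDF, like norminv(u, 10, 5)
print(x)
```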
Example 3 – WeChat Red Packet

Send CNY 200 to 8 friends; assume everyone's red packet money follows the same probability distribution. Let M_i be the red packet amount obtained by the i-th friend:

Σ_{i=1}^{8} M_i = 200

Solution 1: Take u_i as the proportion of the pocket money
Solution 2: Assume M_i follows a lognormal (LN) distribution with mean = 25 and COV = 0.3
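A Python sketch of both solutions. Assumptions: in Solution 1 the uniform proportions are normalised so the amounts sum to CNY 200, and in Solution 2 the lognormal draws are rescaled in the same way to enforce the total; the lognormal parameters are back-calculated from the mean of 25 and COV of 0.3:

```python
import numpy as np

rng = np.random.default_rng(0)
total, n_friends = 200.0, 8

# Solution 1: u_i as the proportion of the pocket money
u = rng.uniform(size=n_friends)
m1 = total * u / u.sum()                     # amounts sum to CNY 200

# Solution 2: M_i ~ lognormal with mean 25 and COV 0.3, rescaled to the total
mean, cov = 25.0, 0.3
sigma_ln = np.sqrt(np.log(1.0 + cov**2))     # lognormal parameters from mean/COV
mu_ln = np.log(mean) - 0.5 * sigma_ln**2
m2 = rng.lognormal(mu_ln, sigma_ln, size=n_friends)
m2 *= total / m2.sum()                       # enforce the CNY 200 constraint (an assumption)

print(m1.round(2), m2.round(2))
```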


Acceptance-rejection method (A-R)

A PDF-based method

Let f_X(x) and s_X(x) denote two PDFs, where samples of s_X(x) can be generated conveniently. The A-R algorithm proceeds as follows:

1. Generate a sample x* from s_X(x)
2. Draw a sample of u, where u is uniformly distributed between 0 and 1; if u is smaller than f_X(x*) / (c · s_X(x*)), accept x* as a sample of f_X(x)
3. Go to step 1 until enough samples are drawn

Note: c should satisfy the inequality c ≥ max f_X(x)/s_X(x)
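A generic Python sketch of these three steps; the target PDF f_pdf, the sampling PDF s_pdf with its sampler, and the constant c are supplied by the caller (Example 4 below gives a concrete choice):

```python
import numpy as np

def accept_reject(f_pdf, s_pdf, s_sampler, c, n_samples, rng):
    """Draw n_samples from f_pdf using the acceptance-rejection algorithm."""
    out = []
    while len(out) < n_samples:
        x_star = s_sampler(rng)                      # step 1: sample x* from s_X(x)
        u = rng.uniform()                            # step 2: u ~ U(0, 1)
        if u < f_pdf(x_star) / (c * s_pdf(x_star)):
            out.append(x_star)                       # accept x* as a sample of f_X(x)
    return np.array(out)                             # step 3: repeat until enough samples
```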


Acceptance-rejection method (A-R)

Proof:

Based on Bayes' theorem: P(x | accept) = P(accept | x) P(x) / P(accept)

By the A-R acceptance rule, the probability of x being accepted is: P(accept | x) = f_X(x) / (c · s_X(x))

Since x is a sample of s_X(x), the chance of obtaining x is: P(x) = s_X(x)

Based on the total probability theorem:

P(accept) = ∫ P(accept | x) P(x) dx = ∫ [f_X(x)/(c · s_X(x))] s_X(x) dx = (1/c) ∫ f_X(x) dx = 1/c

Hence, P(x | accept) = f_X(x). Thus, samples generated by the A-R algorithm are samples of the PDF f_X(x).
Example 4

The cohesion of a soil (x) can sometimes be modelled as a Beta random variable with the following PDF:

f_X(x) = (x − a)^{q−1} (b − x)^{r−1} / [B(q, r) (b − a)^{q+r−1}]

where a = 10, b = 20, q = 6 and r = 2, and the beta function is:

B(q, r) = ∫_0^1 x^{q−1} (1 − x)^{r−1} dx

In Excel, the PDF of the beta distribution can be evaluated using betadist(x, q, r, 0, a, b).

Generate samples of the cohesion using the A-R algorithm.

Example 4

(Smartly) choose a sampling function s_X(x):

s_X(x) = 1/(b − a),  a ≤ x ≤ b

The figure below shows f_X(x)/s_X(x); its maximum value is smaller than 3, so c can be taken as 3 to ensure the inequality c ≥ max f_X(x)/s_X(x).

[Figure: f_X(x)/s_X(x) over [a, b], plotted in Matlab]
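A self-contained Python sketch of the A-R sampling for this example, with the uniform sampling function on [a, b] and c = 3 (the Beta PDF is written out directly rather than calling Excel's betadist; the proposals are generated in one vectorised batch, which is equivalent to looping over steps 1–3 of the algorithm above):

```python
import numpy as np
from math import gamma

a, b, q, r = 10.0, 20.0, 6.0, 2.0
B_qr = gamma(q) * gamma(r) / gamma(q + r)          # beta function B(q, r)

def f_pdf(x):
    # Target Beta PDF on [a, b] with shape parameters q and r
    return (x - a)**(q - 1) * (b - x)**(r - 1) / (B_qr * (b - a)**(q + r - 1))

rng = np.random.default_rng(0)
c = 3.0
s_pdf = 1.0 / (b - a)                              # uniform sampling PDF on [a, b]
x_star = rng.uniform(a, b, size=20_000)            # step 1: candidates from s_X(x)
u = rng.uniform(size=20_000)                       # step 2: u ~ U(0, 1)
samples = x_star[u < f_pdf(x_star) / (c * s_pdf)]  # keep accepted candidates only

print(len(samples), samples.mean())  # mean should be near a + (b - a)*q/(q + r) = 17.5
```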
Markov Chain Monte Carlo (MCMC)

In the A-R algorithm, the determination of the constant c may be difficult.

MCMC offers an alternative that generates a sequence of samples one by one without such a constant, by constructing a Markov chain that has the target distribution as its equilibrium distribution.

Various algorithms exist for MCMC, among which the widely used Metropolis-Hastings algorithm is introduced here.

Let f(x) be the target PDF and g(x) be a proposal PDF that suggests a candidate for the next sample. A symmetric proposal PDF, i.e. g(x|y) = g(y|x), such as a uniform or a normal distribution, is normally a good choice.
Markov Chain Monte Carlo (MCMC)

The Metropolis-Hastings algorithm can be implemented as follows:

1. Provide an initial sample, X_k
2. Generate a candidate for the next sample, X_c, from the proposal PDF g(X_c | X_k)
3. Calculate the acceptance ratio:

   α = [f(X_c) g(X_k | X_c)] / [f(X_k) g(X_c | X_k)] = f(X_c)/f(X_k)   (for a symmetric proposal)

4. Generate U ~ U(0, 1); if U ≤ α, accept the candidate by setting X_{k+1} = X_c; otherwise, reject the candidate and set X_{k+1} = X_k instead
5. Repeat steps (2)–(4) until sufficient samples are drawn

Compared with the A-R algorithm, which discards the rejected samples, all samples in MCMC are used.
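A minimal Python sketch of steps 1–5, assuming a symmetric normal proposal so that the g-ratio cancels as noted in step 3; the target PDF and step size below are illustrative (the exponential PDF of Example 1 is used as the target):

```python
import numpy as np

def metropolis_hastings(f, x0, n_samples, step=1.0, rng=None):
    """Sample from a target PDF f (known up to a constant) with a symmetric normal proposal."""
    if rng is None:
        rng = np.random.default_rng(0)
    samples = np.empty(n_samples)
    x_k = x0                                    # step 1: initial sample
    for i in range(n_samples):
        x_c = rng.normal(x_k, step)             # step 2: candidate from g(Xc | Xk)
        alpha = f(x_c) / f(x_k)                 # step 3: acceptance ratio (symmetric g)
        if rng.uniform() <= alpha:              # step 4: accept the candidate ...
            x_k = x_c
        samples[i] = x_k                        # ... or keep X_{k+1} = X_k
    return samples                              # step 5: all samples are used

# Illustrative target: the exponential PDF of Example 1
lam = 0.2
samples = metropolis_hastings(lambda x: lam * np.exp(-lam * x) if x > 0 else 0.0,
                              x0=5.0, n_samples=5000)
print(samples.mean())   # should approach the exponential mean 1/lam = 5
```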
Revisit of Example 3

Excel implementation
Monte Carlo Simulation

Generation of random vectors


Vectors with independent variables

When X_1, X_2, …, X_n are statistically independent (SI), the following relationship holds:

f_{X_1, X_2, …, X_n}(x_1, x_2, …, x_n) = Π_{i=1}^{n} f_{X_i}(x_i)

The above equation shows that the joint PDF of SI variables is the product of the PDFs of these variables.

In such a case, we can generate the samples for each variable separately; the collection of these samples constitutes the samples of the random vector (X_1, X_2, …, X_n).
Example 5

Suppose the cohesion (X_1) and friction angle (X_2) of a soil both follow the normal distribution with μ_{X1} = 20 kPa, σ_{X1} = 5 kPa, μ_{X2} = 30°, σ_{X2} = 6°. Suppose X_1 and X_2 are SI. Generate samples of X_1 and X_2.

Matlab implementation
Example 5

Excel implementation
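A Python counterpart of these implementations (each variable is sampled independently; the given second parameters are taken as standard deviations):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

c = rng.normal(20.0, 5.0, size=n)    # cohesion X1 ~ N(20 kPa, 5 kPa)
phi = rng.normal(30.0, 6.0, size=n)  # friction angle X2 ~ N(30 deg, 6 deg)
samples = np.column_stack([c, phi])  # each row is one sample of (X1, X2)
print(samples[:5])
```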
Vectors with correlated variables

When X_1, X_2, …, X_n are not SI, they can be transformed into uncorrelated variables Z_1, Z_2, …, Z_n (e.g., through an eigen-decomposition of their covariance matrix, revisited below); samples of the Z's can then be generated independently and transformed back into samples of X_1, X_2, …, X_n.
Revisit: eigenvalue and eigenvector

Basics
Revisit: eigenvalue and eigenvector

Excel implementation

1. Open ‘matrix.xla’ & Enable Macros

2. Add-ins → Macros → Eigen-solving


Revisit: eigenvalue and eigenvector

Excel implementation

3. Calculate eigenvalues 4. Calculate eigenvectors


Revisit of Example 5

Suppose the cohesion (X_1) and friction angle (X_2) of a soil both follow the normal distribution with μ_{X1} = 20 kPa, σ_{X1} = 5 kPa, μ_{X2} = 30°, σ_{X2} = 6°. Suppose X_1 and X_2 have a correlation coefficient of −0.5. Generate samples of X_1 and X_2.

Matlab implementation
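A Python sketch that uses the eigen-decomposition revisited above: independent standard normal samples are scaled by the square roots of the eigenvalues, rotated by the eigenvectors and shifted by the means (a Cholesky factorisation, or numpy's built-in multivariate normal sampler, would serve equally well):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

mu = np.array([20.0, 30.0])                         # means of X1 (kPa) and X2 (deg)
sd = np.array([5.0, 6.0])                           # standard deviations
rho = -0.5                                          # correlation coefficient
C = np.array([[sd[0]**2, rho * sd[0] * sd[1]],
              [rho * sd[0] * sd[1], sd[1]**2]])     # covariance matrix

eigval, eigvec = np.linalg.eigh(C)                  # C = V diag(lambda) V^T
z = rng.standard_normal((n, 2))                     # independent standard normal samples
x = mu + (z * np.sqrt(eigval)) @ eigvec.T           # correlated samples of (X1, X2)

print(np.corrcoef(x.T)[0, 1])                       # should be close to -0.5
```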
Monte Carlo Simulation

Reliability analysis with MCS


Average of a function

Let f(x) denote the PDF of x and a(x) a function of x. The mean of a(x) can be written as:

E[a] = ∫ a(x) f(x) dx = ∫ h(x) dx

where h(x) = a(x) f(x). This shows that an integral can also be considered as the average of a function.

Hence the average of the function can be evaluated by:

∫ a(x) f(x) dx = E[a] ≈ (1/n) Σ_{i=1}^{n} a(x_i)

where x_1, …, x_n are samples drawn from f(x).
Example 6

Suppose x follows a uniform distribution between 0 and 10, and a(x) = x². Evaluate the mean of a(x) based on Excel.
Example 7

Suppose x_1 ~ N(0, 2) and x_2 ~ N(2, 1). Let a(x_1, x_2) = x_1 x_2 + x_1. Evaluate the mean of a(x_1, x_2) based on Excel. Assume x_1 and x_2 are s.i.
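A Python sketch of Examples 6 and 7, taking the second parameter of N(·,·) as the standard deviation; the Monte Carlo mean is simply the sample average of a(x_i) over draws from the PDF of x:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Example 6: x ~ U(0, 10), a(x) = x^2; exact mean is 100/3
x = rng.uniform(0.0, 10.0, size=n)
print(np.mean(x**2))

# Example 7: x1 ~ N(0, 2), x2 ~ N(2, 1), s.i., a(x1, x2) = x1*x2 + x1; exact mean is 0
x1 = rng.normal(0.0, 2.0, size=n)
x2 = rng.normal(2.0, 1.0, size=n)
print(np.mean(x1 * x2 + x1))
```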
Integration with Monte Carlo simulation

Consider the following integral:

H = ∫ h(x) dx

Let s_X(x) denote a sampling PDF. H can also be written as:

H = ∫ [h(x)/s_X(x)] s_X(x) dx = E[h(x)/s_X(x)]

Therefore, H is the average value of h(x)/s_X(x) with respect to s_X(x).

Let x_i denote a sample from s_X(x). H can be approximated as:

Ĥ = (1/N) Σ_{i=1}^{N} h(x_i)/s_X(x_i)

The variance of Ĥ is:

Var(Ĥ) = (1/N) [ (1/N) Σ_{i=1}^{N} (h(x_i)/s_X(x_i))² − Ĥ² ]
Example 8

Evaluate the following integral based on MCS:

H = ∫_0^1 x/(1 + x²) dx

Let the sampling function s(x) be a uniform distribution between 0 and 1. In such a case, h(x) and s(x) can be written as:

h(x) = x/(1 + x²),  s(x) = 1,  0 < x < 1

Excel implementation
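A Python sketch of the estimator Ĥ and its variance for this example, consistent with the stated sampling function s(x) = 1 on (0, 1):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

x = rng.uniform(0.0, 1.0, size=N)            # samples from s(x) = 1 on (0, 1)
ratio = (x / (1.0 + x**2)) / 1.0             # h(x_i) / s(x_i)

H_hat = ratio.mean()                         # H ≈ (1/N) * sum h(x_i)/s(x_i)
var_H = (np.mean(ratio**2) - H_hat**2) / N   # variance of the estimator
print(H_hat, np.sqrt(var_H))                 # exact value of ∫_0^1 x/(1+x^2) dx is 0.5*ln(2) ≈ 0.347
```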
Failure probability estimation with MCS

The failure probability is the integration of the PDF over the failure domain:

p_f = ∫…∫ I(x) f(x) dx,  where I(x) = 1 if x is in the failure region and I(x) = 0 otherwise

Based on Monte Carlo simulation, the failure probability can be estimated as:

p̂_f ≈ (1/N) Σ_{i=1}^{N} I(x_i)

with statistical error (coefficient of variation):

cov(p̂_f) ≈ sqrt[(1 − p̂_f)/(N p̂_f)]
Example 9

Consider g(x) = x_1 x_2 − x_3, where x_1 ~ N(10, 2), x_2 ~ N(3, 2) and x_3 ~ N(25, 6). Suppose the random variables are statistically independent; evaluate p_f.

p_f is sensitive to the number of samples, and its COV reduces as the number of samples increases.
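A Python sketch of the failure probability estimate (taking the second parameter of N(·,·) as the standard deviation and defining failure as g(x) < 0), repeated for several sample sizes to illustrate the sensitivity:

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_pf(N):
    # x1 ~ N(10, 2), x2 ~ N(3, 2), x3 ~ N(25, 6), statistically independent
    x1 = rng.normal(10.0, 2.0, size=N)
    x2 = rng.normal(3.0, 2.0, size=N)
    x3 = rng.normal(25.0, 6.0, size=N)
    fail = (x1 * x2 - x3) < 0                # indicator I(x): g(x) < 0 means failure
    pf = fail.mean()
    cov = np.sqrt((1.0 - pf) / (N * pf))     # statistical error of the estimate
    return pf, cov

for N in (1_000, 10_000, 100_000):           # pf is sensitive to the sample number
    print(N, estimate_pf(N))
```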
Required number of samples

Recall that the COV of p̂_f estimated by MCS depends on the number of samples:

cov(p̂_f) ≈ sqrt[(1 − p̂_f)/(N p̂_f)]

Based on this, a relationship for determining the required N exists.
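Rearranging the COV expression above gives the required sample size for a target coefficient of variation (a reconstruction of the implied relationship):

```latex
\operatorname{cov}(\hat{p}_f) \approx \sqrt{\frac{1-\hat{p}_f}{N\,\hat{p}_f}}
\quad\Longrightarrow\quad
N \approx \frac{1-\hat{p}_f}{\hat{p}_f \, \operatorname{cov}^2(\hat{p}_f)}
```

For instance, estimating a failure probability of about 10⁻³ with a COV of 10% would require roughly N ≈ 10⁵ samples.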


Example 10

Shallow foundation problem

Recalling Terzaghi's bearing capacity theory: q_ult = 0.5 γ_s B N_γ + c′ N_c + γ_s D_f N_q

where N_γ = 1.8 (N_q − 1) tan φ′,  N_q = tan²(π/4 + φ′/2) e^{π tan φ′},  N_c = (N_q − 1) cot φ′

Suppose:

c ~ N(20, 5), φ ~ N(30, 6), ρ_{cφ} = −0.5

D_f = 0.5 m, γ_s = 20 kN/m³, B = 2 m, q = 500 kN/m²

What is the reliability of the shallow foundation?
Example 10

Excel implementation
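For reference, a Python sketch of the whole calculation. Assumptions: failure occurs when q_ult < q, the correlated pair (c′, φ′) is generated through a Cholesky factor of the covariance matrix, and the second distribution parameters are standard deviations:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Correlated c ~ N(20, 5) kPa and phi ~ N(30, 6) deg with rho = -0.5
mu = np.array([20.0, 30.0])
sd = np.array([5.0, 6.0])
rho = -0.5
C = np.diag(sd) @ np.array([[1.0, rho], [rho, 1.0]]) @ np.diag(sd)
L = np.linalg.cholesky(C)
x = mu + rng.standard_normal((N, 2)) @ L.T
c, phi = x[:, 0], np.radians(x[:, 1])

# Terzaghi bearing capacity factors and ultimate capacity
gamma_s, B, Df, q = 20.0, 2.0, 0.5, 500.0         # kN/m3, m, m, kN/m2
Nq = np.tan(np.pi / 4 + phi / 2) ** 2 * np.exp(np.pi * np.tan(phi))
Nc = (Nq - 1.0) / np.tan(phi)
Ngamma = 1.8 * (Nq - 1.0) * np.tan(phi)
q_ult = 0.5 * gamma_s * B * Ngamma + c * Nc + gamma_s * Df * Nq

pf = np.mean(q_ult < q)               # failure: capacity below applied pressure (assumed criterion)
print(pf, 1.0 - pf)                   # failure probability and reliability
```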
Monte Carlo Simulation

Importance sampling

Recall that:

Var(Ĥ) = (1/N) [ (1/N) Σ_{i=1}^{N} (h(x_i)/s_X(x_i))² − Ĥ² ]

The accuracy of MCS therefore depends on the sampling function s_X(x); it can be increased through an efficient sampling function.

Rubinstein (1981) showed that the optimal sampling function that minimises Var(Ĥ) is:

s_X(x) = k |h(x)|

where k is a normalising constant, k = 1 / ∫ |h(x)| dx.

In practice, one can choose a sampling function with a shape close to h(x) to minimise the sampling error: the importance sampling method (Shinozuka, 1983).
Application in calculating p_f

Applying the importance sampling method to the failure probability calculation:

p_f = ∫ I(x) w(x) s_X(x) dx,  where w(x) = f_X(x)/s_X(x)

The estimated failure probability and its variance are:

p̂_f ≈ (1/N) Σ_{i=1}^{N} I(x_i) w(x_i)

var(p̂_f) ≈ (1/N) [ (1/N) Σ_{i=1}^{N} (I(x_i) w(x_i))² − p̂_f² ]

Harbitz (1986) suggests that one can first find the design point and then centre the sampling function at the design point, with its covariance matrix being the same as that of the uncertain variables.

Therefore, s_X(x) can be a multivariate normal distribution with mean x_D (the design point) and covariance matrix C.
Example 11

Consider g(x) = x_1 x_2 − x_3, where x_1 ~ N(10, 2), x_2 ~ N(3, 2) and x_3 ~ N(25, 6). Suppose the random variables are statistically independent.

An AFORM analysis shows that the design point (or MPP; cf. Topic 3, slides 10–26) is

x_D = (9.888, 2.568, 25.393)

Evaluate p_f based on the importance sampling method.

Again, solve by Excel implementation.
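A Python sketch of the importance sampling estimate, with the sampling density centred at the design point and carrying the same (diagonal, since the variables are s.i.) covariance as the original variables, per Harbitz's suggestion; the original standard deviations are taken as 2, 2 and 6:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
N = 1_000

mu = np.array([10.0, 3.0, 25.0])         # original means
sd = np.array([2.0, 2.0, 6.0])           # original standard deviations (s.i.)
x_D = np.array([9.888, 2.568, 25.393])   # design point from AFORM

# Sampling density s_X: independent normals centred at the design point
x = x_D + sd * rng.standard_normal((N, 3))

# Importance weights w = f_X(x) / s_X(x) (products of univariate normal PDFs)
f = norm.pdf(x, loc=mu, scale=sd).prod(axis=1)
s = norm.pdf(x, loc=x_D, scale=sd).prod(axis=1)
w = f / s

I_fail = (x[:, 0] * x[:, 1] - x[:, 2]) < 0           # indicator of failure g(x) < 0
pf_hat = np.mean(I_fail * w)
var_hat = (np.mean((I_fail * w) ** 2) - pf_hat ** 2) / N
print(pf_hat, np.sqrt(var_hat) / pf_hat)             # estimate and its COV
```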


Example 12

Calculate the failure probability of the shallow foundation in Example 10 using Harbitz (1986)'s importance sampling method.

Step 1: Find the design point (or MPP) by AFORM
Example 12

Step 2: Determine the failure probability

Based on 1000 samples, p_f is 5.94% with a COV of 4.35%.

This is more accurate than conventional MCS when the same number of samples (1000) is used.
End of lecture note
