
Global Sensitivity Analysis

Motivation
• Consider a numerical model of a structure

  [Diagram: inputs (A, E, f_y, P, …) → model → outputs (midspan displacement; failure index (0,1))]

• Will all the input parameters contribute to the response?
• Which input factors are more influential than others?
Applications of GSA
• To gain insights
  • How different parameters and their interactions affect a system

  "The parameters related to the local site diminution effect (…) can be in general neglected; (…) temporal envelope function parameters have a considerable contribution towards the total risk only for lower moment magnitudes, especially (…)" (Vetter & Taflanidis, 2011)
Applications of GSA
• To gain insights
  • How different parameters and their interactions affect a system
• Dimensionality reduction
  • By identifying uninfluential (redundant) factors
• Informed decision making
  • To find parameters for which new data acquisition reduces the target uncertainty the most
  • To identify the most effective decision options
• Model diagnostics
  • After developing a model, one may compare GSA results with expert knowledge
Local Sensitivity Analysis
• Rate of change (slope) of the model output with respect to an input

  $$S_i^D(\boldsymbol{X}) = \frac{\partial g(\boldsymbol{X})}{\partial X_i}$$

  [Figure: $g(x)$ with the tangent slope evaluated at a reference point $x^*$]

• Studies the impact of small perturbations on the model outputs
• Evaluated at a reference point $\boldsymbol{X}^*$ (e.g., in structural reliability analysis)
• One-factor-at-a-time evaluation
• Used in reliability analysis / optimization
Local Sensitivity Analysis
• How can we explore the entire variability space? Is the average of gradients a good measure?
• Consider an example:

  $$Y = g(X_1, X_2) = 2X_1 + X_2, \qquad X_1 \sim N(0, 0.5^2), \quad X_2 \sim N(0, 5^2)$$

  Among $X_1$ and $X_2$, which variable is more "important"?
Local Sensitivity Analysis

  $$Y = g(X_1, X_2) = 2X_1 + X_2, \qquad X_1 \sim N(0, 0.5^2), \quad X_2 \sim N(0, 5^2)$$

• If we judge importance by the partial-derivative measure, $X_1$ appears more important (its partial derivative, 2, is twice that of $X_2$).
• But if we inspect the scatter plots, $X_2$ seems to dominate the response.

  [Scatter plots: $Y$ vs. $X_1$ and $Y$ vs. $X_2$]
Local Sensitivity Analysis
• Sigma-normalized derivative

  $$S_i^{SD}(\boldsymbol{X}) = \frac{\sigma_{X_i}}{\sigma_Y}\,\frac{\partial g(\boldsymbol{X})}{\partial X_i}$$

  By this measure, $X_2$ is five times more important than $X_1$.

  [Scatter plots: $Y$ vs. $X_1$ and $Y$ vs. $X_2$]
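To make this concrete, here is a minimal Python/NumPy sketch (not part of the original slides) that evaluates the raw partial derivatives and the sigma-normalized measures for the example above; the sample size is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Example from the slides: Y = g(X1, X2) = 2*X1 + X2
x1 = rng.normal(0.0, 0.5, N)   # X1 ~ N(0, 0.5^2)
x2 = rng.normal(0.0, 5.0, N)   # X2 ~ N(0, 5^2)
y = 2.0 * x1 + x2

# Raw partial derivatives (constant for a linear model): suggest X1 matters more
dg_dx1, dg_dx2 = 2.0, 1.0

# Sigma-normalized derivatives: scale each slope by sigma_Xi / sigma_Y
sigma_y = y.std()
S1_sd = 0.5 / sigma_y * dg_dx1
S2_sd = 5.0 / sigma_y * dg_dx2

print(f"partial derivatives: {dg_dx1}, {dg_dx2}")
print(f"sigma-normalized:    {S1_sd:.3f}, {S2_sd:.3f}  (ratio ~ {S2_sd/S1_sd:.1f})")
```

The ratio of roughly five matches the statement on the slide.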
Local Sensitivity Analysis
• "Partial derivative" in the standard random variable domain
  • When the random variables are independent, each variable can be transformed to a standard normal variable, $Z_i = T_i(X_i)$.
• Example: FORM analysis

  Importance vector (normalized negative gradient at the design point $\boldsymbol{z}^*$):

  $$\boldsymbol{\alpha} = -\frac{\nabla G(\boldsymbol{z}^*)}{\lVert \nabla G(\boldsymbol{z}^*) \rVert}, \qquad \text{note } \boldsymbol{\alpha} = \frac{\boldsymbol{z}^*}{\beta}$$

  (For dependent variables, the importance measure must be modified.)
Variance-based Sensitivity
• Intuition behind the Sobol indices

  Low sensitivity: $\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid x_i]$ is almost constant across different $x_i$ values, so $\mathrm{Var}_{x_i}\big[\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big]$ is almost zero.

  High sensitivity: $\mathbb{E}_{\mathbf{x}_{\bar j}}[Y \mid x_j]$ depends on $x_j$, so $\mathrm{Var}_{x_j}\big[\mathbb{E}_{\mathbf{x}_{\bar j}}[Y \mid x_j]\big]$ is larger.
Variance Decomposition
• $\mathrm{Var}_{x_i}\big[\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big]$ is a measure of sensitivity
• The Law of Total Variance

  $$\mathrm{Var}[Y] = \underbrace{\mathrm{Var}_{x_i}\big[\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big]}_{\text{explained by } x_i} + \underbrace{\mathbb{E}_{x_i}\big[\mathrm{Var}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big]}_{\text{not explained by } x_i}$$

  The first term is the expected reduction in variance that would be obtained if $x_i$ could be fixed.

  [Figure: scatter of $Y$ vs. $x_i$]
Variance Decomposition
• $\mathrm{Var}_{x_i}\big[\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big]$ is a measure of sensitivity
• The Law of Total Variance

  $$\mathrm{Var}[Y] = \mathrm{Var}_{x_i}\big[\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big] + \mathbb{E}_{x_i}\big[\mathrm{Var}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big]$$

• Derivation

  $$\begin{aligned}
  \mathrm{Var}[Y] &= \mathbb{E}[Y^2] - \mathbb{E}[Y]^2 \\
  &= \mathbb{E}_{x_i}\big[\mathbb{E}_{\mathbf{x}_{\bar i}}[Y^2 \mid x_i]\big] - \big(\mathbb{E}_{x_i}\big[\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big]\big)^2 \\
  &= \mathbb{E}_{x_i}\big[\mathrm{Var}_{\mathbf{x}_{\bar i}}[Y \mid x_i] + \mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid x_i]^2\big] - \big(\mathbb{E}_{x_i}\big[\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big]\big)^2 \\
  &= \mathbb{E}_{x_i}\big[\mathrm{Var}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big] + \mathbb{E}_{x_i}\big[\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid x_i]^2\big] - \big(\mathbb{E}_{x_i}\big[\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big]\big)^2 \\
  &= \mathbb{E}_{x_i}\big[\mathrm{Var}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big] + \mathrm{Var}_{x_i}\big[\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big]
  \end{aligned}$$
Variance Decomposition
• $\mathrm{Var}_{x_i}\big[\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big]$ is a measure of sensitivity
• The Law of Total Variance

  $$\mathrm{Var}[Y] = \mathrm{Var}_{x_i}\big[\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big] + \mathbb{E}_{x_i}\big[\mathrm{Var}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big]$$

  Both terms are non-negative. Dividing by $\mathrm{Var}[Y]$:

  $$1 = \frac{\mathrm{Var}_{x_i}\big[\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big]}{\mathrm{Var}[Y]} + \frac{\mathbb{E}_{x_i}\big[\mathrm{Var}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big]}{\mathrm{Var}[Y]}$$

  The first ratio is the sensitivity index; it lies in the range $[0, 1]$.
Variance Decomposition
• The Law of Total Variance

  $$1 = \frac{\mathrm{Var}_{x_i}\big[\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big]}{\mathrm{Var}[Y]} + \frac{\mathbb{E}_{x_i}\big[\mathrm{Var}_{\mathbf{x}_{\bar i}}[Y \mid x_i]\big]}{\mathrm{Var}[Y]}$$

• Sobol Sensitivity Index

  $$S_i = \frac{\mathrm{Var}\big[\mathbb{E}[Y \mid x_i]\big]}{\mathrm{Var}[Y]} = 1 - \frac{\mathbb{E}\big[\mathrm{Var}[Y \mid x_i]\big]}{\mathrm{Var}[Y]}$$

• Also called the main-effect index or first-order index
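As a quick worked check (based on the earlier linear example, not on a separate slide), the first-order indices of $Y = 2X_1 + X_2$ with $X_1 \sim N(0, 0.5^2)$ and $X_2 \sim N(0, 5^2)$ follow in closed form:

  $$\mathbb{E}[Y \mid X_1] = 2X_1,\quad \mathrm{Var}\big[\mathbb{E}[Y \mid X_1]\big] = 4(0.5^2) = 1, \qquad \mathbb{E}[Y \mid X_2] = X_2,\quad \mathrm{Var}\big[\mathbb{E}[Y \mid X_2]\big] = 25$$

  $$\mathrm{Var}[Y] = 4(0.5^2) + 5^2 = 26 \quad\Rightarrow\quad S_1 = \tfrac{1}{26} \approx 0.04, \qquad S_2 = \tfrac{25}{26} \approx 0.96$$

The two indices sum to one because the model is additive, and the ranking agrees with the sigma-normalized measure (whose square equals the Sobol index for a linear model).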


Second-order Sensitivity Measures

  $$S_{ij} = \frac{\mathrm{Var}_{x_i x_j}\big[\mathbb{E}_{\mathbf{x}_{\overline{ij}}}[Y \mid X_i, X_j]\big]}{\mathrm{Var}[Y]} - S_i - S_j$$

  The first term is the joint contribution of $X_i$ and $X_j$; subtracting the individual contributions $S_i$ and $S_j$ leaves the pure interaction effect.

  [Venn diagram: overlapping sets for $X_i$ and $X_j$ with regions labeled $S_i$, $S_{ij}$, $S_j$]

  $S_{ij}$ captures the pure interaction effect.
Interaction Effect
• Interaction effect: the relationship between $X_i$ and $Y$ is affected by $X_j$

  [Figure: three panels of $Y$ vs. $X_i$ for two values $X_j^{(1)}$ and $X_j^{(2)}$; no interaction ($S_{ij} = 0$) vs. interaction between $X_i$ and $X_j$ ($S_{ij} > 0$)]

• Nonadditive terms create the interaction

  $$g_A(X_1, X_2) = 3X_1^3 + \log X_2 \qquad \text{no interaction, } S_{12} = 0$$

  $$g_B(X_1, X_2) = 3X_1^3 + \log X_2 + X_1 X_2 \qquad \text{interaction between } X_1 \text{ and } X_2,\; S_{12} > 0$$
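A quick numerical check (a sketch, not from the slides): for two independent inputs, $S_{12} = 1 - S_1 - S_2$, so the interaction can be exposed by estimating the first-order variances on a dense grid. The uniform ranges $[0.5, 1.5]$ are an arbitrary choice made here so that $\log X_2$ is well defined.

```python
import numpy as np

# Dense tensor grid standing in for independent uniform inputs on [0.5, 1.5]
# (hypothetical ranges, chosen so that log(X2) is defined)
n = 1001
x1 = np.linspace(0.5, 1.5, n)
x2 = np.linspace(0.5, 1.5, n)
X1, X2 = np.meshgrid(x1, x2, indexing="ij")

def second_order_index(g):
    Y = g(X1, X2)
    V = Y.var()                      # total variance
    V1 = Y.mean(axis=1).var()        # Var_x1[ E[Y | x1] ]
    V2 = Y.mean(axis=0).var()        # Var_x2[ E[Y | x2] ]
    return 1.0 - V1 / V - V2 / V     # for two independent inputs, S12 = 1 - S1 - S2

g_A = lambda a, b: 3 * a**3 + np.log(b)
g_B = lambda a, b: 3 * a**3 + np.log(b) + a * b

print("g_A: S12 ~", second_order_index(g_A))   # ~0 (additive, no interaction)
print("g_B: S12 ~", second_order_index(g_B))   # small but > 0 (X1*X2 interaction)
```

For these ranges the cubic term dominates the variance, so $S_{12}$ of $g_B$ is positive but small; the qualitative contrast with $g_A$ (zero up to round-off) is the point.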
Higher-order Sensitivity Indices
When the random variables are independent, the following holds:

  $$1 = \sum_i S_i + \sum_{i<j} S_{ij} + \dots + S_{1,2,\dots,d}$$

Consider an example $Y = g(X_1, X_2, X_3)$:

  [Venn diagram: three sets for $X_1$, $X_2$, $X_3$ whose regions $S_1$, $S_2$, $S_3$, $S_{12}$, $S_{13}$, $S_{23}$, $S_{123}$ sum to 1]
Total-effect Index

  $$S_i^T = 1 - \frac{\mathrm{Var}_{\mathbf{X}_{\bar i}}\big[\mathbb{E}_{x_i}[Y \mid \mathbf{X}_{\bar i}]\big]}{\mathrm{Var}[Y]}$$

  Here the conditioning is on all variables but $X_i$. $S_i^T$ accounts for all the interaction effects associated with a variable $X_i$.
Total-effect Index
• For example, consider a function $Y = g(X_1, X_2, X_3)$.

  The total-effect index for $X_1$ is

  $$S_1^T = 1 - S_{23} - S_2 - S_3$$

  When the variables are independent,

  $$S_1^T = S_1 + S_{12} + S_{13} + S_{123}$$

  [Venn diagram: the main index $S_1$ vs. the total index covering $S_1$, $S_{12}$, $S_{13}$, $S_{123}$]
Analysis of Variance (ANOVA) Decomposition
Consider uncorrelated $\boldsymbol{X}$ distributed within a unit hypercube, $Y = g(\boldsymbol{X})$.
The function can be expanded as

  $$Y = g_0 + \sum_i g_i(X_i) + \sum_{i<j} g_{ij}(X_i, X_j) + \dots + g_{12\dots d}(X_1, X_2, \dots, X_d)$$

This expansion is called the ANOVA representation if

  $$\int_0^1 g_{\boldsymbol{u}}(\boldsymbol{X}_{\boldsymbol{u}})\, dx_k = 0, \qquad k \in \boldsymbol{u}$$

for any $\boldsymbol{u} \subseteq \{1, 2, \dots, d\}$. For example,

  $$\int_0^1 g_{ij}(X_i, X_j)\, dX_i = 0 \quad \text{and} \quad \int_0^1 g_{ij}(X_i, X_j)\, dX_j = 0$$
Analysis of Variance (ANOVA) Decomposition

  $$Y = g_0 + \sum_i g_i(X_i) + \sum_{i<j} g_{ij}(X_i, X_j) + \dots + g_{12\dots d}(X_1, X_2, \dots, X_d)$$

Taking $\mathrm{Var}[\cdot]$ on both sides:

  $$\mathrm{Var}[Y] = V = \sum_i V_i + \sum_{i<j} V_{ij} + \dots + V_{12\dots d}$$

The proportion of variance attributed to $X_i$ (and to each higher-order term):

  $$1 = \sum_i \frac{V_i}{V} + \sum_{i<j} \frac{V_{ij}}{V} + \dots + \frac{V_{12\dots d}}{V}$$

These ratios are equivalent to the Sobol indices. Why?
Analysis of Variance (ANOVA) Decomposition

  $$Y = g_0 + \sum_i g_i(X_i) + \sum_{i<j} g_{ij}(X_i, X_j) + \dots + g_{12\dots d}(X_1, X_2, \dots, X_d)$$

Taking $\mathbb{E}[\cdot]$ on both sides, i.e., integrating over $\boldsymbol{X}$ (all non-constant terms integrate to zero):

  $$\mathbb{E}[Y] = g_0$$

Taking $\mathbb{E}[\cdot \mid \boldsymbol{X}_{\boldsymbol{u}}]$ on both sides, i.e., integrating over all variables but $\boldsymbol{u} \subseteq \{1, 2, \dots, d\}$:

  $$\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid X_i] = g_0 + g_i(X_i)$$

  $$\mathbb{E}_{\mathbf{x}_{\overline{ij}}}[Y \mid X_i, X_j] = g_0 + g_i(X_i) + g_j(X_j) + g_{ij}(X_i, X_j)$$

  $$\dots$$
Analysis of Variance (ANOVA) Decomposition

  $$\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid X_i] = g_0 + g_i(X_i)$$

  $$\mathbb{E}_{\mathbf{x}_{\overline{ij}}}[Y \mid X_i, X_j] = g_0 + g_i(X_i) + g_j(X_j) + g_{ij}(X_i, X_j)$$

Taking variances of these conditional expectations:

  $$\mathrm{Var}_{x_i}\big[\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid X_i]\big] = V_i$$

  $$\mathrm{Var}_{x_i x_j}\big[\mathbb{E}_{\mathbf{x}_{\overline{ij}}}[Y \mid X_i, X_j]\big] = V_i + V_j + V_{ij}$$

Dividing by $\mathrm{Var}[Y]$:

  $$\frac{\mathrm{Var}_{x_i}\big[\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid X_i]\big]}{\mathrm{Var}[Y]} = \frac{V_i}{V} = S_i$$

  $$\frac{\mathrm{Var}_{x_i x_j}\big[\mathbb{E}_{\mathbf{x}_{\overline{ij}}}[Y \mid X_i, X_j]\big]}{\mathrm{Var}[Y]} = \frac{V_i}{V} + \frac{V_j}{V} + \frac{V_{ij}}{V} = S_i + S_j + S_{ij}$$

  [Venn diagram: $S_i$, $S_{ij}$, $S_j$]
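A small worked example (chosen here for illustration; it is not on the original slides): take $g(X_1, X_2) = X_1 + X_2 + X_1 X_2$ with independent $X_1, X_2 \sim U(0,1)$. The ANOVA terms and partial variances follow directly from the conditional expectations:

  $$g_0 = \tfrac{5}{4}, \qquad g_1(X_1) = \tfrac{3}{2}X_1 - \tfrac{3}{4}, \qquad g_2(X_2) = \tfrac{3}{2}X_2 - \tfrac{3}{4}, \qquad g_{12}(X_1, X_2) = \big(X_1 - \tfrac{1}{2}\big)\big(X_2 - \tfrac{1}{2}\big)$$

  $$V_1 = V_2 = \frac{(3/2)^2}{12} = \frac{3}{16}, \qquad V_{12} = \frac{1}{144}, \qquad V = V_1 + V_2 + V_{12} = \frac{55}{144}$$

  $$S_1 = S_2 = \frac{27}{55} \approx 0.49, \qquad S_{12} = \frac{1}{55} \approx 0.02, \qquad S_1^T = S_1 + S_{12} = \frac{28}{55}$$

Each component integrates to zero over each of its own variables, the partial variances sum to $\mathrm{Var}[Y]$, and the indices sum to one, as required for independent inputs.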
ANOVA vs. The Law of Total Variance
Consider $X_1$.

The Law of Total Variance:

  $$\mathrm{Var}[Y] = \mathrm{Var}_{x_1}\big[\mathbb{E}_{\mathbf{x}_{\bar 1}}[Y \mid X_1]\big] + \mathbb{E}_{x_1}\big[\mathrm{Var}_{\mathbf{x}_{\bar 1}}[Y \mid X_1]\big]$$

ANOVA:

  $$\mathrm{Var}[Y] = \underbrace{V_1}_{=\,\mathrm{Var}_{x_1}\big[\mathbb{E}_{\mathbf{x}_{\bar 1}}[Y \mid X_1]\big]} + \sum_{i=2}^{d} V_i + \sum_{i<j} V_{ij} + \dots + V_{12\dots d}$$
Remarks
• When variables are correlated:
  • The Law of Total Variance does not require the assumption of independence, so the intuitive interpretation still holds.
  • ANOVA requires the assumption of independence; with correlated inputs the indices no longer sum to one in general:

  $$1 \neq \sum_i S_i + \sum_{i<j} S_{ij} + \dots + S_{1,2,\dots,d}$$
Remarks
• When $\boldsymbol{X}$ are independent random variables, the sensitivity indices are invariant to any one-to-one transformation of the inputs, $Z_i = T_i(X_i)$.

  [Diagram: transformed inputs $Z_i = T_i(X_i)$ feeding $Y = g(\boldsymbol{X})$, and a linear transform of the output $\tilde{Y} = aY + b$]

• The sensitivity indices are also invariant to a linear transform of the output, $\tilde{Y} = aY + b$.
Special case – Linear model $g(\boldsymbol{x})$
• For a linear model, the following give equivalent importance rankings (the Sobol index equals the squared sigma-normalized derivative / standardized regression coefficient):
  • Sigma-normalized derivative
  • Linear regression coefficients
  • Variance-based sensitivity indices

• Example – FORM limit-state surface

  $$G_{FORM}(\boldsymbol{z}) = \nabla G(\boldsymbol{z}^*)(\boldsymbol{z} - \boldsymbol{z}^*) = \frac{\partial G(\boldsymbol{z}^*)}{\partial z_1}(z_1 - z_1^*) + \dots + \frac{\partial G(\boldsymbol{z}^*)}{\partial z_d}(z_d - z_d^*)$$

  $$\mathrm{Var}[Y_{FORM}] = \lVert \nabla G(\boldsymbol{z}^*) \rVert^2$$

  $$\mathbb{E}[Y_{FORM} \mid z_i] = \frac{\partial G(\boldsymbol{z}^*)}{\partial z_i}\, z_i + \text{const.} \quad\Rightarrow\quad \mathrm{Var}\big[\mathbb{E}[Y_{FORM} \mid z_i]\big] = \left(\frac{\partial G(\boldsymbol{z}^*)}{\partial z_i}\right)^2$$

  $$S_i = \frac{\mathrm{Var}\big[\mathbb{E}[Y_{FORM} \mid z_i]\big]}{\mathrm{Var}[Y_{FORM}]} = \alpha_i^2$$
Algorithms: (1) Monte Carlo Estimation
Requires a two-fold (nested) integration for the "variance" and "mean" operations:

  $$S_i = \frac{\mathrm{Var}_{x_i}\big[\mathbb{E}_{\boldsymbol{x}_{\bar i}}[Y \mid x_i]\big]}{\mathrm{Var}[Y]}$$

  for n = 1:N
      sample x_i^(n)
      for m = 1:N
          sample the remaining inputs x_(~i)^(m)
          simulate y^(m,n) = g(x_i^(n), x_(~i)^(m))
      end
      E^(n) = E[Y | x_i^(n)] ≈ (1/N) * sum_{m=1}^{N} y^(m,n)
  end
  Var_{x_i}[ E[Y | x_i] ] ≈ sample variance of E^(1), …, E^(N)
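A minimal runnable sketch of this brute-force double-loop estimator (Python/NumPy assumed), applied to the earlier linear example so the result can be checked against $S_1 = 1/26$ and $S_2 = 25/26$:

```python
import numpy as np

rng = np.random.default_rng(1)

def g(x1, x2):
    return 2.0 * x1 + x2            # linear example from the slides

def sample_inputs(n):
    return rng.normal(0, 0.5, n), rng.normal(0, 5.0, n)

def first_order_double_loop(i, N=500):
    """Brute-force estimate of S_i = Var_{xi}[ E[Y | xi] ] / Var[Y]."""
    cond_means = np.empty(N)
    for n in range(N):                       # outer loop: fix x_i
        xi = sample_inputs(1)[i][0]
        x1, x2 = sample_inputs(N)            # inner loop: resample the others
        if i == 0:
            y = g(np.full(N, xi), x2)
        else:
            y = g(x1, np.full(N, xi))
        cond_means[n] = y.mean()             # E[Y | x_i^(n)]
    x1, x2 = sample_inputs(100_000)
    var_y = g(x1, x2).var()                  # crude estimate of Var[Y]
    return cond_means.var() / var_y

print("S1 ~", first_order_double_loop(0))    # exact value: 1/26 ~ 0.038
print("S2 ~", first_order_double_loop(1))    # exact value: 25/26 ~ 0.962
```

The cost is N*N model runs per input, which is exactly why the "smart" schemes on the next slide are preferred in practice.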
Algorithms: (2) Smart Monte Carlo
• Start with two independent N-sample sets (sample matrices):

  A: rows $\boldsymbol{x}^{(1)}, \dots, \boldsymbol{x}^{(N)}$ with outputs $y^{(1)}, \dots, y^{(N)}$
  B: rows $\tilde{\boldsymbol{x}}^{(1)}, \dots, \tilde{\boldsymbol{x}}^{(N)}$ with outputs $\tilde{y}^{(1)}, \dots, \tilde{y}^{(N)}$

• Designed sample sets are then built to estimate the Sobol indices of each input. For $X_1$, the matrix $A_{B_1}$ takes column 1 from one matrix and the remaining columns from the other, and its outputs $\hat{y}$ are evaluated.

  [Figure: Case 1 / Case 2 — designed resampling to evaluate $S_i$ and $S_j$ from the paired sample sets]

  (Saltelli, 2009)
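The sketch below implements one common set of estimators from this pick-freeze family (the Saltelli-type first-order estimator and the Jansen total-effect estimator); it is a generic illustration rather than the exact scheme on the slide, again using the linear example so the answers are known.

```python
import numpy as np

rng = np.random.default_rng(2)
N, d = 50_000, 2

def g(X):
    return 2.0 * X[:, 0] + X[:, 1]           # linear example from the slides

def sample(n):
    # independent inputs: X1 ~ N(0, 0.5^2), X2 ~ N(0, 5^2)
    return np.column_stack([rng.normal(0, 0.5, n), rng.normal(0, 5.0, n)])

A, B = sample(N), sample(N)                  # two independent sample matrices
yA, yB = g(A), g(B)
var_y = np.concatenate([yA, yB]).var()

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                      # column i taken from B, rest from A
    yABi = g(ABi)
    Si = np.mean(yB * (yABi - yA)) / var_y          # first-order estimator
    STi = 0.5 * np.mean((yA - yABi) ** 2) / var_y   # total-effect estimator
    print(f"X{i+1}: S ~ {Si:.3f}, ST ~ {STi:.3f}")
```

All d indices are obtained from N*(d+2) model runs instead of the N*N runs per input needed by the double loop.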
Algorithms: (3) Probability model-based GSA
• Only N Monte Carlo samples are required; existing samples (the table of $x_1, x_2, \dots, x_D, y$ rows) can be reused.
• Estimation algorithm:
  • Approximate the joint distribution $f(X_i, Y)$ using a Gaussian mixture model (GMM).
  • Estimate $\mathbb{E}[Y \mid X_i]$ from the fitted GMM $f(X_i, Y)$.
  • Repeat for different samples $X_i^{(n)}$ and take the sample variance to obtain $\mathrm{Var}_{x_i}\big[\mathbb{E}_{\mathbf{x}_{\bar i}}[Y \mid X_i^{(n)}]\big]$.
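A minimal sketch of this idea, assuming scikit-learn is available; the conditional mean of a bivariate GMM is assembled from its component means and covariances, and the model, component count, and sample size are illustrative choices rather than the method's reference implementation.

```python
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
N = 5_000

# Existing Monte Carlo samples (here: the linear example from earlier slides)
x1 = rng.normal(0, 0.5, N)
x2 = rng.normal(0, 5.0, N)
y = 2.0 * x1 + x2

def first_order_gmm(xi, y, n_components=5):
    """Estimate S_i by fitting a GMM to (X_i, Y) and evaluating E[Y | X_i]."""
    gmm = GaussianMixture(n_components=n_components, random_state=0)
    gmm.fit(np.column_stack([xi, y]))
    w, mu, cov = gmm.weights_, gmm.means_, gmm.covariances_

    def cond_mean(x):
        # responsibilities of each component given X_i = x
        px = np.array([wk * norm.pdf(x, m[0], np.sqrt(c[0, 0]))
                       for wk, m, c in zip(w, mu, cov)])
        px /= px.sum(axis=0)
        # component-wise conditional means of Y given X_i = x
        m_y = np.array([m[1] + c[0, 1] / c[0, 0] * (x - m[0])
                        for m, c in zip(mu, cov)])
        return np.sum(px * m_y, axis=0)

    return cond_mean(xi).var() / y.var()

print("S1 ~", first_order_gmm(x1, y))   # exact: 1/26 ~ 0.038
print("S2 ~", first_order_gmm(x2, y))   # exact: 25/26 ~ 0.962
```

No new model evaluations are needed beyond the N samples already in hand, which is the main appeal of this route.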
For Thursday class (4/21)
• Download quoFEM
• Register for a DesignSafe account
• Bring a laptop

• quoFEM: https://simcenter.designsafe-ci.org/research-tools/quofem-application/
• DesignSafe: https://www.designsafe-ci.org/account/register/
Variance-based Reliability Sensitivity Analysis
• Reliability-oriented sensitivity analysis
• Quantity of interest:

  $$q = \mathbb{1}\big[G(\boldsymbol{X})\big] = \begin{cases} 1, & G(\boldsymbol{X}) \le 0 \\ 0, & G(\boldsymbol{X}) > 0 \end{cases} \qquad \text{(Bernoulli)}$$

• $\mathbb{E}[q] = \mathbb{E}\big[\mathbb{1}[G(\boldsymbol{X})]\big] = P_f$
• $\mathrm{Var}[q] = \mathrm{Var}\big[\mathbb{1}[G(\boldsymbol{X})]\big] = P_f(1 - P_f)$
Reformulation of Sobol index
• Main Sobol index

  $$S_i = \frac{\mathrm{Var}_{X_i}\big[\mathbb{E}_{\boldsymbol{X}_{\bar i}}[q \mid X_i]\big]}{\mathrm{Var}[q]} = \frac{\mathrm{Var}_{X_i}\big[\mathbb{E}_{\boldsymbol{X}_{\bar i}}[q \mid X_i]\big]}{P_f(1 - P_f)}$$

• Similarly, the conditional expectation of the indicator is a conditional failure probability:

  $$\mathbb{E}_{\boldsymbol{X}_{\bar i}}[q \mid X_i] = P_{f|X_i}$$

  $$\mathrm{Var}_{X_i}\big[\mathbb{E}_{\boldsymbol{X}_{\bar i}}[q \mid X_i]\big] = \mathrm{Var}_{X_i}\big[P_{f|X_i}\big] = \mathbb{E}_{X_i}\big[P_{f|X_i}^2\big] - \big(\mathbb{E}_{X_i}[P_{f|X_i}]\big)^2 = \mathbb{E}_{X_i}\big[P_{f|X_i}^2\big] - P_f^2$$

  $$S_i = \frac{\mathrm{Var}_{X_i}\big[\mathbb{E}_{\boldsymbol{X}_{\bar i}}[q \mid X_i]\big]}{P_f(1 - P_f)} = \frac{\mathbb{E}_{X_i}\big[P_{f|X_i}^2\big] - P_f^2}{P_f(1 - P_f)}$$
Reformulation of Sobol index

  $$S_i = \frac{\mathbb{E}_{X_i}\big[P_{f|X_i}^2\big] - P_f^2}{P_f(1 - P_f)}$$

  • $P_f$ is the solution of the reliability analysis.
  • How about $P_{f|X_i}$?

Two different combinations of reliability analysis and variance-based sensitivity analysis:
1. Sobol indices as a "by-product" of reliability analysis
  • After FORM reliability analysis
  • After sampling-based reliability analysis
2. Sobol indices obtained "before" running the reliability analysis
  • Probability model-based GSA
Review of FORM – $\beta$ and $\boldsymbol{\alpha}$

FORM and Variance-based Sensitivity Analysis
• FORM limit state

  $$G_{FORM}(\boldsymbol{z}) = \nabla G(\boldsymbol{z}^*)(\boldsymbol{z} - \boldsymbol{z}^*), \quad \text{or equivalently (after normalizing by } \lVert \nabla G(\boldsymbol{z}^*) \rVert\text{)} \quad G_{FORM}(\boldsymbol{z}) = \beta - \boldsymbol{\alpha}\boldsymbol{z}$$

• Goal: derive $S_i$ in terms of $\alpha_i$ and $\beta$

  $$P_f = \mathbb{P}(\boldsymbol{\alpha}\boldsymbol{Z} \ge \beta) = \mathbb{P}(\alpha_1 Z_1 + \alpha_2 Z_2 + \dots + \alpha_d Z_d \ge \beta) = \mathbb{P}(\tilde{Z} \ge \beta) = \Phi(-\beta)$$

  where $\tilde{Z} = \boldsymbol{\alpha}\boldsymbol{Z}$ is standard normal.
FORM and Variance-based Sensitivity Analysis
• Goal: derive $S_i$ in terms of $\boldsymbol{\alpha}$ and $\beta$

  $$P_f = \mathbb{P}(\boldsymbol{\alpha}\boldsymbol{Z} \ge \beta) = \mathbb{P}(\alpha_1 Z_1 + \alpha_2 Z_2 + \dots + \alpha_d Z_d \ge \beta) = \mathbb{P}(\tilde{Z} \ge \beta) = \Phi(-\beta)$$

• Conditioning on $Z_i = z_i$:

  $$P_{f|Z_i} = \mathbb{P}\big(\boldsymbol{\alpha}_{\bar i}\boldsymbol{Z}_{\bar i} \ge \beta - \alpha_i z_i\big) = \mathbb{P}\!\left(\tilde{Z} \ge \frac{\beta - \alpha_i z_i}{\lVert \boldsymbol{\alpha}_{\bar i} \rVert}\right) = \Phi\!\left(-\frac{\beta - \alpha_i z_i}{\lVert \boldsymbol{\alpha}_{\bar i} \rVert}\right)$$
FORM and Variance-based Sensitivity Analysis

  $$\begin{aligned}
  \mathbb{E}_{z_i}\big[P_{f|Z_i}^2\big] &= \mathbb{E}_{z_i}\big[P_{f|z_i}\, P_{f|z_i}\big] \\
  &= \mathbb{E}_{z_i}\!\left[\Phi\!\left(\frac{\alpha_i z_i - \beta}{\lVert \boldsymbol{\alpha}_{\bar i} \rVert}\right) \Phi\!\left(\frac{\alpha_i z_i - \beta}{\lVert \boldsymbol{\alpha}_{\bar i} \rVert}\right)\right] \\
  &= \mathbb{E}_{z_i}\!\left[\mathbb{P}_{\tilde{\boldsymbol{z}}}\!\left(\tilde{Z}_1 \le \frac{\alpha_i z_i - \beta}{\lVert \boldsymbol{\alpha}_{\bar i} \rVert}\right) \mathbb{P}_{\tilde{\boldsymbol{z}}}\!\left(\tilde{Z}_2 \le \frac{\alpha_i z_i - \beta}{\lVert \boldsymbol{\alpha}_{\bar i} \rVert}\right)\right] \\
  &= \mathbb{E}_{z_i}\!\left[\mathbb{P}_{\tilde{\boldsymbol{z}}}\!\left(\tilde{Z}_1 \le \frac{\alpha_i z_i - \beta}{\lVert \boldsymbol{\alpha}_{\bar i} \rVert} \;\cap\; \tilde{Z}_2 \le \frac{\alpha_i z_i - \beta}{\lVert \boldsymbol{\alpha}_{\bar i} \rVert}\right)\right] \\
  &= \mathbb{P}_{z_i, \tilde{\boldsymbol{z}}}\!\left(\tilde{Z}_1 \le \frac{\alpha_i Z_i - \beta}{\lVert \boldsymbol{\alpha}_{\bar i} \rVert} \;\cap\; \tilde{Z}_2 \le \frac{\alpha_i Z_i - \beta}{\lVert \boldsymbol{\alpha}_{\bar i} \rVert}\right) \\
  &= \mathbb{P}\big(\tilde{Y}_1 \le -\beta \;\cap\; \tilde{Y}_2 \le -\beta\big) = \Phi_2\big({-\beta}, {-\beta};\, \alpha_i^2\big)
  \end{aligned}$$

  where $\tilde{Y}_1 = \tilde{Z}_1 \lVert \boldsymbol{\alpha}_{\bar i} \rVert - \alpha_i Z_i \sim N(0,1)$, $\tilde{Y}_2 = \tilde{Z}_2 \lVert \boldsymbol{\alpha}_{\bar i} \rVert - \alpha_i Z_i \sim N(0,1)$, and $\mathrm{corr}(\tilde{Y}_1, \tilde{Y}_2) = \alpha_i^2$.
FORM and Variance-based Sensitivity Analysis
• Main-effect Sobol index

  $$S_i = \frac{\mathbb{E}_{Z_i}\big[P_{f|Z_i}^2\big] - P_f^2}{P_f(1 - P_f)} = \frac{\Phi_2(-\beta, -\beta;\, \alpha_i^2) - P_f^2}{P_f(1 - P_f)} = \frac{1}{P_f(1 - P_f)} \int_0^{\alpha_i^2} \varphi_2(-\beta, -\beta;\, r)\, dr$$

• Total-effect Sobol index (Eq. (32) of the reference cited on the next slide)

  $$S_i^T = 1 - \frac{\mathbb{E}_{\boldsymbol{Z}_{\bar i}}\big[P_{f|\boldsymbol{Z}_{\bar i}}^2\big] - P_f^2}{P_f(1 - P_f)} = 1 - \frac{\Phi_2\big(-\beta, -\beta;\, \lVert \boldsymbol{\alpha}_{\bar i} \rVert^2\big) - P_f^2}{P_f(1 - P_f)} = \frac{1}{P_f(1 - P_f)} \int_{1-\alpha_i^2}^{1} \varphi_2(-\beta, -\beta;\, r)\, dr$$
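These closed forms are cheap to evaluate once $\beta$ and $\boldsymbol{\alpha}$ are available. Below is a minimal sketch assuming SciPy is available; the bivariate normal CDF $\Phi_2$ is evaluated with scipy.stats.multivariate_normal, and the $\boldsymbol{\alpha}$, $\beta$ values are hypothetical placeholders for actual FORM results.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

def bivnor(x, rho):
    """Bivariate standard-normal CDF Phi_2(x, x; rho)."""
    cov = [[1.0, rho], [rho, 1.0]]
    return multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([x, x])

def form_sobol(alpha, beta):
    """Main- and total-effect reliability Sobol indices from FORM results."""
    alpha = np.asarray(alpha, float)          # unit importance vector
    pf = norm.cdf(-beta)
    denom = pf * (1.0 - pf)
    S = np.array([(bivnor(-beta, a**2) - pf**2) / denom for a in alpha])
    ST = np.array([1.0 - (bivnor(-beta, 1.0 - a**2) - pf**2) / denom for a in alpha])
    return S, ST

# Hypothetical FORM output: beta = 2.5 and a three-variable unit alpha vector
alpha = np.array([0.8, 0.5, np.sqrt(1 - 0.8**2 - 0.5**2)])
S, ST = form_sobol(alpha, beta=2.5)
print("S  =", np.round(S, 4))
print("ST =", np.round(ST, 4))
```

Note that the main-effect indices of the indicator output generally sum to less than one even for a linear limit state; the remainder reflects interaction effects introduced by the indicator function.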
Example with two Random Variables

  [Figure not reproduced]
FORM and Variance-based Sensitivity Analysis

  [Figure not reproduced] (Papaioannou and Straub, 2021)
Sampling-based Reliability Analysis and $S_i$
Sampling-based Reliability Analysis and $S_i$
• Again, reformulation of the Sobol index:

  $$P_{f|X_i} = \mathbb{P}(\mathcal{F} \mid X_i) = \frac{\mathbb{P}(X_i \mid \mathcal{F})\,\mathbb{P}(\mathcal{F})}{\mathbb{P}(X_i)} = \frac{f_{X_i|\mathcal{F}}(X_i)\,dX_i \cdot P_f}{f(X_i)\,dX_i} = \frac{f_{X_i|\mathcal{F}}(X_i)\, P_f}{f(X_i)}$$

  Here $f_{X_i|\mathcal{F}}$ is the failure-conditional marginal density of $X_i$, closely related to the optimal and near-optimal importance sampling densities.
Sampling-based Reliability Analysis and $S_i$
• Approximation of $f_{X_i|\mathcal{F}}(X_i)$ using kernel density estimation or cross entropy-based distribution fitting:
  • Kernel density estimation
  • Mixture distribution fitting
• Estimation of the total-effect index is more challenging.

  (Choi and Song, 2017)
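As an illustration of the KDE route, the sketch below reuses plain Monte Carlo failure samples (for a hypothetical linear limit state in standard normal space, chosen here only so the FORM formulas above provide a reference) to build $f_{X_i|\mathcal{F}}$ and then evaluates $S_i = (\mathbb{E}[P_{f|X_i}^2] - P_f^2)/(P_f(1-P_f))$; SciPy's gaussian_kde is assumed available.

```python
import numpy as np
from scipy.stats import norm, gaussian_kde

rng = np.random.default_rng(4)
N, d, beta = 200_000, 2, 2.0
alpha = np.array([0.8, 0.6])                 # hypothetical unit importance vector

# Plain Monte Carlo reliability analysis for a linear limit state G(z) = beta - alpha.z
Z = rng.standard_normal((N, d))
fail = Z @ alpha >= beta                     # failure event F
pf = fail.mean()

for i in range(d):
    # f_{Xi|F}: KDE built from the failure samples of variable i
    kde = gaussian_kde(Z[fail, i])
    # P_{f|Xi} = f_{Xi|F}(x) * Pf / f(x), evaluated at all samples of X_i
    pf_cond = kde(Z[:, i]) * pf / norm.pdf(Z[:, i])
    Si = (np.mean(pf_cond**2) - pf**2) / (pf * (1 - pf))
    print(f"X{i+1}: S ~ {Si:.3f}")           # compare with the FORM closed form above
```

The indices come for free from the samples already generated for the reliability analysis, which is the "by-product" idea referred to earlier.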


Sensitivity Analysis before Reliability Analysis
• Probability model-based approach
• Let us define $Y = G(\boldsymbol{X})$.
• Approximate the joint distribution $f(X_i, Y)$ using a Gaussian mixture model (GMM) fitted to the $N$ available samples $(x_1, x_2, \dots, x_D, y)$.

  $$P_{f|X_i} = \mathbb{P}(Y \le 0 \mid X_i) = \frac{f(X_i, Y \le 0)}{f(X_i)}$$

  $$f(X_i, Y \le 0) = \int_{-\infty}^{0} f_{X_i, Y}(X_i, y)\, dy, \qquad f(X_i) = \int_{-\infty}^{\infty} f_{X_i, Y}(X_i, y)\, dy$$

• The mixture model "extrapolates" the samples → not accurate for rare events.
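For a GMM both integrals are available in closed form, since each component contributes a Gaussian conditional of $Y$ given $X_i$, so $P_{f|X_i}$ can be evaluated without extra model runs. A minimal sketch, again assuming scikit-learn and reusing the hypothetical linear limit state from the previous sketch:

```python
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
N, beta = 5_000, 2.0
alpha = np.array([0.8, 0.6])                 # hypothetical unit importance vector

X = rng.standard_normal((N, 2))
Y = beta - X @ alpha                         # limit-state values; failure when Y <= 0

def reliability_sobol_gmm(xi, y, n_components=5):
    gmm = GaussianMixture(n_components=n_components, random_state=0)
    gmm.fit(np.column_stack([xi, y]))
    w, mu, cov = gmm.weights_, gmm.means_, gmm.covariances_

    fx = np.zeros_like(xi)                   # f(X_i)
    fx_fail = np.zeros_like(xi)              # f(X_i, Y <= 0)
    for wk, m, c in zip(w, mu, cov):
        pdf_x = wk * norm.pdf(xi, m[0], np.sqrt(c[0, 0]))
        # conditional distribution of Y given X_i within this component is Gaussian
        m_y = m[1] + c[0, 1] / c[0, 0] * (xi - m[0])
        s_y = np.sqrt(c[1, 1] - c[0, 1] ** 2 / c[0, 0])
        fx += pdf_x
        fx_fail += pdf_x * norm.cdf(0.0, m_y, s_y)   # P(Y <= 0 | X_i, component)

    pf_cond = fx_fail / fx                   # P_{f|Xi} at each sample of X_i
    pf = pf_cond.mean()                      # GMM-based estimate of P_f
    return (np.mean(pf_cond**2) - pf**2) / (pf * (1 - pf))

for i in range(2):
    print(f"X{i+1}: S ~ {reliability_sobol_gmm(X[:, i], Y):.3f}")
```

With only a few thousand samples and a failure probability of roughly 0.02, the estimates can be rough, which is consistent with the rare-event caveat on the slide.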
Toy Example

  [Figures: results for N = 500 and N = 2000 samples]
Truss Model
• Input variables:
  • $x_1, x_2$: load 1 ($P_1$) and load 2 ($P_2$), with correlation 0.6
  • $x_3$–$x_{27}$: strength of each member, lognormal
• Output variable: limit-state function

  $$g(\boldsymbol{x}) = \min_{k=1,\dots,25} \big(\sigma_k^{thr} - \sigma_k(P_1, P_2)\big)$$

  [Figures: samples in $\boldsymbol{x}$-space and $\boldsymbol{u}$-space]
Examples
• Structural model: shear building (OpenSees)
• Input parameters:

  Name           Mean   C.O.V.
  w              100    0.1
  wR             50     0.1
  k              326    0.1
  Fy             50     0.1
  alpha          0.2    0.1
  factor (PGA)   0.1    0.1

  [Sketch: shear building with story weights wR, w6, w5, w4, w3, w2; Steel02 material]

• Excitation: Rinaldi near-field record
Nonlinear behavior
• Hysteresis curves for the Rinaldi record (UQ runs)

  [Figures: hysteresis curves at each story (wR, w6, w5, w4, w3, w2)]
Examples

  [Figures: sensitivity results for four output quantities: node 2 displacement, node 2 acceleration, node 7 displacement, node 7 acceleration]
Examples: Parameter selection in FE model updating
GSA is performed for 15 parameters.

  "(…) Parameters 3, 6, 7, 8, 13, and 14 should be excluded from the parameter candidates because they have little influence over the objective function."

Wan, H.P. and Ren, W.X., 2015. Parameter selection in finite-element-model updating by global sensitivity analysis using Gaussian process metamodel. Journal of Structural Engineering, 141(6), p.04014164.
Examples: Reducing the complexity of multi-objective optimization
Multi-objective Water Distribution System Optimization (New York Tunnels rehabilitation, 21 components)
• 21 pipes in the system
• 16 retrofitting options for each pipe:
  • 15 available diameters, ranging from 0.914 to 5.182 m
  • Do nothing
• Conflicting objectives:
  • Cost: capital cost (pipes, tanks, and pumps) + operating cost during a design period
  • Performance: e.g., surplus power energy per unit weight
• Goal: maximize the performance, minimize the cost

Fu, G., Kapelan, Z. and Reed, P., 2012. Reducing the complexity of multiobjective water distribution system optimization through global sensitivity analysis. Journal of Water Resources Planning and Management, 138(3), pp.196-207.
Examples: Multi-objective Water Distribution System Optimization

  [Figures: first-order and total-order indices identifying critical components; best Pareto fronts and convergence rates for the simplified and preconditioned optimization problems]

Fu, G., Kapelan, Z. and Reed, P., 2012. Reducing the complexity of multiobjective water distribution system optimization through global sensitivity analysis. Journal of Water Resources Planning and Management, 138(3), pp.196-207.
