Sensitivity Analysis Lecture Slides
Motivation
• Consider a numerical model of a structure
[Diagram: inputs (A, E, f_y, P_2, …) → numerical model → outputs (midspan displacement, failure index ∈ {0, 1}, …)]
Applications of GSA
• To gain insights
• How different parameters and their interactions affect a system
• Dimensionality reduction
• By identifying uninfluential (redundant) factors
• Model diagnostics
• After developing a model, one may compare GSA results with expert knowledge
Local Sensitivity Analysis
• Rate of change (slope) of g at a point x*

  S_i^D(X) = ∂g(X)/∂X_i

• One-factor-at-a-time evaluation
• Used in reliability analysis / optimization
Local Sensitivity Analysis
• Example: Y = g(X1, X2) = 2X1 + X2, with X1 ~ N(0, 0.5²) and X2 ~ N(0, 5²)
[Scatter plots: Y vs. X1 and Y vs. X2]
Local Sensitivity Analysis
• Sigma-normalized derivative

  S_i^SD(X) = (σ_Xi / σ_Y) · ∂g(X)/∂X_i

[Scatter plots: Y vs. X1 and Y vs. X2]
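The sigma-normalized derivative can be evaluated in closed form for the linear example above; a minimal sketch (the analytic Var[Y] formula holds because the model is linear with independent inputs):

```python
import math

# Sigma-normalized derivative sensitivities for the linear example
# Y = g(X1, X2) = 2*X1 + X2, with X1 ~ N(0, 0.5^2), X2 ~ N(0, 5^2).
sigma_x = [0.5, 5.0]
grad = [2.0, 1.0]          # partial derivatives of g (constant for a linear model)

# For independent inputs and a linear model, Var[Y] = sum (dg/dxi * sigma_xi)^2
sigma_y = math.sqrt(sum((g * s) ** 2 for g, s in zip(grad, sigma_x)))

S_SD = [g * s / sigma_y for g, s in zip(grad, sigma_x)]
print(S_SD)  # X2 dominates despite its smaller derivative
```

Note that for a linear model the squared sigma-normalized derivatives sum to one, so they fully apportion the output variance.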
Local Sensitivity Analysis
• 'Partial derivative' in the standard random variable domain
• When the random variables are independent, each variable can be transformed to a standard normal variable, Z_i = T(X_i).
• Example: FORM analysis

  Importance vector (normalized gradient):

  α = −∇G(z*) / ‖∇G(z*)‖

  Note: α = −z*/β

[Figure annotation: the case of dependent variables]
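The two expressions for the importance vector can be checked numerically; a small sketch with an assumed (hypothetical) gradient at the design point:

```python
import math

# Hypothetical gradient of the limit-state function at the design point z*
grad_G = [3.0, -4.0]                       # assumed values for illustration only
norm = math.sqrt(sum(g * g for g in grad_G))

# Importance vector: alpha = -grad G(z*) / ||grad G(z*)||
alpha = [-g / norm for g in grad_G]
print(alpha)                               # a unit vector

# Equivalently alpha = -z*/beta, so the design point is z* = -beta * alpha
beta = 2.5                                 # assumed reliability index
z_star = [-beta * a for a in alpha]
print(z_star)
```

The second relation holds for the linearized limit state, since the design point lies at distance β from the origin along −α.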
Variance-based Sensitivity
• Intuition behind the Sobol indices
[Figure: scatter of Y against x_i]
Variance Decomposition
• Var_{x_i}[ E_{x_~i}[Y | x_i] ] is a measure of sensitivity
• The Law of Total Variance
• Derivation

  Var[Y] = E[Y²] − E[Y]²
         = E_{x_i}[ E_{x_~i}[Y² | x_i] ] − ( E_{x_i}[ E_{x_~i}[Y | x_i] ] )²
         = E_{x_i}[ Var_{x_~i}[Y | x_i] + E_{x_~i}[Y | x_i]² ] − ( E_{x_i}[ E_{x_~i}[Y | x_i] ] )²
         = E_{x_i}[ Var_{x_~i}[Y | x_i] ] + E_{x_i}[ E_{x_~i}[Y | x_i]² ] − ( E_{x_i}[ E_{x_~i}[Y | x_i] ] )²
         = E_{x_i}[ Var_{x_~i}[Y | x_i] ] + Var_{x_i}[ E_{x_~i}[Y | x_i] ]
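The decomposition can be verified by Monte Carlo for the linear example Y = 2X1 + X2, where the conditional moments are known in closed form (E[Y|x1] = 2x1 and Var[Y|x1] = 25):

```python
import random, statistics

random.seed(0)
N = 20000
# Y = 2*X1 + X2 with X1 ~ N(0, 0.5^2), X2 ~ N(0, 5^2)
x1 = [random.gauss(0, 0.5) for _ in range(N)]
# Conditional on x1, Y | x1 ~ N(2*x1, 5^2), so E[Y|x1] = 2*x1 and Var[Y|x1] = 25
cond_mean = [2.0 * v for v in x1]

var_of_mean = statistics.pvariance(cond_mean)   # Var_{x1}[ E[Y|x1] ], analytic 1
mean_of_var = 25.0                              # E_{x1}[ Var[Y|x1] ] (constant here)
total = var_of_mean + mean_of_var               # law of total variance: Var[Y] = 26

S1 = var_of_mean / total                        # first-order Sobol index of X1
print(round(total, 1), round(S1, 3))            # close to 26.0 and 1/26 = 0.038
```

The small index S1 matches the sigma-normalized picture: X1 explains only 1/26 of the output variance.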
Variance Decomposition
• The Law of Total Variance gives the first-order index

  S_i = 1 − E[ Var[Y | x_i] ] / Var[Y]

[Figure: scatter plots of Y vs. X_i for two levels X_j^(1), X_j^(2); left: no interaction (S_ij = 0); right: interaction between X_i and X_j (S_ij > 0)]

• For independent inputs, the indices sum to one:

  1 = Σ_i S_i + Σ_{i<j} S_ij + … + S_{1,2,…,d}

[Venn diagram over X_1, X_2, X_3: regions S_1, S_2, S_3, S_12, S_13, S_23, S_123 sum to 1]
Total-effect Index
• Obtained by conditioning on all variables but X_i:

  S_i^T = E[ Var[Y | X_~i] ] / Var[Y] = 1 − Var[ E[Y | X_~i] ] / Var[Y]

• S_i^T accounts for all the interaction effects associated with a variable X_i
Total-effect Index
• For example, consider a function Y = g(X1, X2, X3)
• The total-effect index for X1 is

  S_1^T = 1 − S_23 − S_2 − S_3

• When the variables are independent,

  S_1^T = S_1 + S_12 + S_13 + S_123

[Venn diagram: total index vs. main index S_1 for X_1, with regions S_2, S_3, S_12, S_13, S_23, S_123]
Analysis of Variance (ANOVA) Decomposition
Consider uncorrelated X distributed within a unit hyper-cube, Y = g(X).
The function can be expanded as

  g(X) = g_0 + Σ_i g_i(X_i) + Σ_{i<j} g_ij(X_i, X_j) + … + g_{1,2,…,d}(X_1, …, X_d)

where each component term integrates to zero over any of its own variables.
Taking E[·] on both sides, i.e. integrating over X:

  E[Y] = g_0

Taking E[· | X_u] on both sides, i.e. integrating over all but u ⊆ {1, 2, …, d}:

  E_{x_~i}[Y | X_i] = g_0 + g_i(X_i)
  …
Analysis of Variance (ANOVA) Decomposition

  E_{x_~i}[Y | X_i] = g_0 + g_i(X_i)
  E_{x_~ij}[Y | X_i, X_j] = g_0 + g_i(X_i) + g_j(X_j) + g_ij(X_i, X_j)
  …

Taking variances,

  Var_{x_i}[ E_{x_~i}[Y | X_i] ] = V_i
  Var_{x_i,x_j}[ E[Y | X_i, X_j] ] = V_i + V_j + V_ij
  …

Normalizing by Var[Y] = V,

  Var[ E[Y | X_i] ] / Var[Y] = V_i / V = S_i
  Var[ E[Y | X_i, X_j] ] / Var[Y] = V_i/V + V_j/V + V_ij/V = S_i + S_j + S_ij
  …
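The ANOVA terms can be checked directly on a small additive test function on the unit square (my choice of test function, not from the slides): Y = X1 + X2², for which g0 = 5/6, g1 = x1 − 1/2, g2 = x2² − 1/3, and there is no interaction term:

```python
import random, statistics

random.seed(1)
N = 200000
# Additive test function on the unit square: Y = g(X1, X2) = X1 + X2**2
# ANOVA terms: g0 = 5/6, g1 = x1 - 1/2, g2 = x2**2 - 1/3, g12 = 0
x1 = [random.random() for _ in range(N)]
x2 = [random.random() for _ in range(N)]
y = [a + b * b for a, b in zip(x1, x2)]

V = statistics.pvariance(y)                 # total variance, analytic 1/12 + 4/45
V1 = statistics.pvariance([v - 0.5 for v in x1])        # Var[g1] = 1/12
V2 = statistics.pvariance([v * v - 1 / 3 for v in x2])  # Var[g2] = 4/45
print(round(V1 / V, 2), round(V2 / V, 2))   # S1 + S2 ≈ 1 since g12 = 0
```

Because the function is additive, S1 + S2 recovers the whole variance, matching the "no interaction" case of the Venn diagram.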
ANOVA vs. The Law of Total Variance
For independent inputs, the first-order index from the Law of Total Variance, Var[E[Y | X_1]] / Var[Y], coincides with the ANOVA term V_1/V; the ANOVA expansion further splits the remaining variance among the interaction terms up to order d.
Remarks
• When variables are correlated
• The Law of Total Variance does not require the assumption of independence
• The intuitive interpretation still holds
Remarks
• When X are independent random variables, the sensitivity indices are invariant to any one-to-one transformation of the inputs, Z_i = T_i(X_i), and to a linear transformation of the output, Ỹ = aY + b
[Diagram: inputs X_1, …, X_4 mapped through one-to-one transforms to Z_1, …, Z_4; model Y = g(X); output linearly transformed to Ỹ = aY + b]
• Example: linearized FORM limit state

  Var[Y_FORM] = ‖∇G(z*)‖²

  E[Y_FORM | z_i] varies linearly in z_i with slope ∂G(z*)/∂z_i, so

  Var[ E[Y_FORM | z_i] ] = ( ∂G(z*)/∂z_i )²

  S_i = Var[ E[Y_FORM | z_i] ] / Var[Y_FORM] = α_i²
Algorithms: (1) Monte Carlo Estimation
Requires a two-fold (double-loop) integration for the "variance" and "mean" operations:

  For n = 1:N
    sample x_i^(n)
    estimate E[Y | x_i^(n)] by an inner Monte Carlo loop over the remaining inputs
  Take the sample variance of the N conditional means
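The double-loop estimator above can be sketched for the linear test model used earlier (the nested loops make the cost N_outer x N_inner model runs, which is why the smarter schemes on the next slides matter):

```python
import random, statistics

def g(x1, x2):
    # the linear test model from the earlier slides: Y = 2*X1 + X2
    return 2.0 * x1 + x2

random.seed(2)
N_outer, N_inner = 500, 500
cond_means = []
for _ in range(N_outer):
    xi = random.gauss(0, 0.5)                      # sample x_i^(n), here X1
    # inner loop: freeze x_i and average over the remaining inputs
    inner = [g(xi, random.gauss(0, 5.0)) for _ in range(N_inner)]
    cond_means.append(statistics.fmean(inner))     # estimate of E[Y | x_i^(n)]

var_y = 2.0 ** 2 * 0.5 ** 2 + 5.0 ** 2             # analytic Var[Y] = 26
S1 = statistics.pvariance(cond_means) / var_y
print(S1)  # near the analytic 1/26 = 0.038 (slightly biased up by inner-loop noise)
```

The residual inner-loop noise inflates the outer sample variance by roughly Var[Y|x_i]/N_inner, a known drawback of the brute-force scheme.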
Algorithms: (2) Smart Monte Carlo
• Start with two independent random N-sample sets

  A: x^(1), x^(2), …, x^(N) with outputs y_A^(1), y_A^(2), …, y_A^(N)
  B: x^(1), x^(2), …, x^(N) with outputs y_B^(1), y_B^(2), …, y_B^(N)

[Figure: scatter of the two sample sets in the (x_i, x_j) plane]
Algorithms: (2) Smart Monte Carlo
[Figure: sampling and estimation scheme (Saltelli, 2009)]
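A minimal sketch of the A/B matrix scheme, using Saltelli's first-order estimator (the specific estimator variant is my assumption; the slides only show the sample sets). The total cost is N(d + 2) model runs instead of a double loop:

```python
import random, statistics

def g(x):
    return 2.0 * x[0] + x[1]          # linear test model from the earlier slides

random.seed(3)
N, d, sigma = 20000, 2, [0.5, 5.0]
A = [[random.gauss(0, s) for s in sigma] for _ in range(N)]
B = [[random.gauss(0, s) for s in sigma] for _ in range(N)]
yA = [g(x) for x in A]
yB = [g(x) for x in B]
var_y = statistics.pvariance(yA + yB)

S = []
for i in range(d):
    # A_B^(i): matrix A with its i-th column replaced by B's i-th column
    yABi = [g(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B)]
    # first-order estimator: V_i ~ mean( yB * (yABi - yA) )
    Vi = statistics.fmean(yb * (yab - ya) for ya, yb, yab in zip(yA, yB, yABi))
    S.append(Vi / var_y)

print([round(s, 3) for s in S])       # analytic values: 1/26 = 0.038, 25/26 = 0.962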
Algorithms: (3) Probability model-based GSA
• Only the N MCS samples are required; existing samples can be reused!
• Estimation algorithm
• Approximate the joint distribution f(X_i, Y) from the samples (x_1, x_2, …, x_D, y) using a Gaussian mixture model (GMM)
• Estimate E[Y | X_i] from the conditional distribution of the fitted GMM f(X_i, Y)
• Repeat for different samples x_i^(n) to get the sample variance Var_{x_i}[ E[Y | X_i^(n)] ]
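A sketch of the idea in the special case of a single Gaussian component (K = 1), where the conditional mean of the fitted joint density is available in closed form; a full GMM generalizes this to a mixture-weighted sum of such conditional means:

```python
import random, statistics

# Probability-model-based estimator, K = 1 special case: fit one bivariate
# Gaussian to existing (x_i, y) samples, then use the closed-form conditional
# mean  E[Y|x] = m_y + rho*(s_y/s_x)*(x - m_x).
random.seed(4)
N = 20000
xs = [random.gauss(0, 0.5) for _ in range(N)]
ys = [2.0 * x + random.gauss(0, 5.0) for x in xs]    # Y = 2*X1 + X2, reused data

m_x, m_y = statistics.fmean(xs), statistics.fmean(ys)
s_x, s_y = statistics.pstdev(xs), statistics.pstdev(ys)
rho = statistics.fmean((x - m_x) * (y - m_y) for x, y in zip(xs, ys)) / (s_x * s_y)

cond_mean = [m_y + rho * (s_y / s_x) * (x - m_x) for x in xs]  # E[Y | x_i^(n)]
S1 = statistics.pvariance(cond_mean) / statistics.pvariance(ys)
print(S1)  # close to the analytic 1/26 = 0.038
```

For this linear Gaussian model K = 1 is exact (S1 reduces to rho²); nonlinear models need more mixture components to capture the conditional mean.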
For Thursday class (4/21)
• quoFEM: https://simcenter.designsafe-ci.org/research-tools/quofem-application/
• DesignSafe: https://www.designsafe-ci.org/account/register/
Variance-based Reliability Sensitivity Analysis
• Reliability-oriented sensitivity analysis
• Quantity of interest: the failure indicator

  q = 1[G(X)] = { 1 if G(X) ≤ 0;  0 if G(X) > 0 }

• q is Bernoulli-distributed:
• E[q] = E[ 1[G(X)] ] = P_f
• Var[q] = Var[ 1[G(X)] ] = P_f (1 − P_f)
Reformulation of Sobol index
• Main Sobol index

  S_i = Var_{X_i}[ E_{X_~i}[q | X_i] ] / Var[q] = Var_{X_i}[ E_{X_~i}[q | X_i] ] / ( P_f (1 − P_f) )

• Similarly,

  E_{x_~i}[q | X_i] = P_f|X_i
Review of FORM - β and α
FORM and Variance-based Sensitivity Analysis
• FORM limit state

  G_FORM(z) = ∇G(z*)ᵀ (z − z*)

  or, normalized,

  G_FORM(z) = β − αᵀz

  P_f = P(αᵀZ ≥ β) = P(α_1 Z_1 + α_2 Z_2 + ⋯ + α_d Z_d ≥ β) = P(Z̃ ≥ β) = Φ(−β)

  where Z̃ = αᵀZ is standard normal.
FORM and Variance-based Sensitivity Analysis

  P_f|Z_i = P( α_~iᵀ Z_~i ≥ β − α_i z_i ) = P( Z̃ ≥ (β − α_i z_i)/‖α_~i‖ ) = Φ( −(β − α_i z_i)/‖α_~i‖ )
FORM and Variance-based Sensitivity Analysis

  E_{z_i}[ P_f|Z_i² ]
    = E_{z_i}[ P_f|z_i · P_f|z_i ]
    = E_{z_i}[ Φ( (α_i z_i − β)/‖α_~i‖ ) · Φ( (α_i z_i − β)/‖α_~i‖ ) ]
    = E_{z_i}[ P_z( Z̃_1 ≤ (α_i z_i − β)/‖α_~i‖ ) · P_z( Z̃_2 ≤ (α_i z_i − β)/‖α_~i‖ ) ]
    = E_{z_i}[ P_z( Z̃_1 ≤ (α_i z_i − β)/‖α_~i‖ ∩ Z̃_2 ≤ (α_i z_i − β)/‖α_~i‖ ) ]
    = P_{z_i,z}( Z̃_1 ≤ (α_i Z_i − β)/‖α_~i‖ ∩ Z̃_2 ≤ (α_i Z_i − β)/‖α_~i‖ )
    = Φ₂( −β, −β; α_i² )

  S_i = ( E_{Z_i}[ P_f|Z_i² ] − P_f² ) / ( P_f (1 − P_f) )
      = ( Φ₂(−β, −β; α_i²) − P_f² ) / ( P_f (1 − P_f) )
      = ( 1 / ( P_f (1 − P_f) ) ) ∫₀^{α_i²} φ₂(−β, −β; r) dr
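The closed-form index can be cross-checked against a direct Monte Carlo evaluation of Var[P_f|Z_i]; a sketch with assumed FORM results β = 2 and α_i = 0.6 (hypothetical values, chosen only for illustration):

```python
import math, random, statistics

Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def phi2(x, y, r):
    # bivariate standard normal density with correlation r
    c = 1.0 - r * r
    return math.exp(-(x * x - 2 * r * x * y + y * y) / (2 * c)) / (2 * math.pi * math.sqrt(c))

beta, a_i = 2.0, 0.6                  # assumed FORM results for illustration
a_rest = math.sqrt(1.0 - a_i ** 2)    # ||alpha_~i|| for a unit alpha vector
Pf = Phi(-beta)

# Closed form: S_i = (1/(Pf(1-Pf))) * integral_0^{a_i^2} phi2(-beta,-beta;r) dr
M = 2000
h = a_i ** 2 / M
S_formula = sum(phi2(-beta, -beta, (k + 0.5) * h) for k in range(M)) * h / (Pf * (1 - Pf))

# Monte Carlo check via the conditional failure probability P_f|zi
random.seed(5)
p_cond = [Phi((a_i * random.gauss(0, 1) - beta) / a_rest) for _ in range(200000)]
S_mc = statistics.pvariance(p_cond) / (Pf * (1 - Pf))
print(round(S_formula, 3), round(S_mc, 3))  # the two estimates should agree closely
```

The integral form follows because ∂Φ₂(−β, −β; ρ)/∂ρ = φ₂(−β, −β; ρ) and Φ₂(−β, −β; 0) = P_f².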
Example with two Random Variables
Sampling-based Reliability Analysis and S_i
• Again, reformulation of the Sobol index

  P_f|X_i = P(F | X_i)
          = P(X_i | F) · P(F) / P(X_i)
          = ( f_{X_i|F}(X_i) dX_i · P_f ) / ( f(X_i) dX_i )
          = f_{X_i|F}(X_i) · P_f / f(X_i)

[Figure: near-optimal density vs. optimal density]
Sampling-based Reliability Analysis and S_i (Choi and Song, 2017)
• Approximation of f_{X_i|F}(X_i) using kernel density estimation or cross entropy-based distribution fitting
• Kernel density estimation

  P_f|X_i = P(Y ≤ 0 | X_i) = f(X_i, Y ≤ 0) / f(X_i)

  f(X_i, Y ≤ 0) = ∫_{−∞}^{0} f_{X_i,Y}(X_i, Y) dY

  f(X_i) = ∫_{−∞}^{∞} f_{X_i,Y}(X_i, Y) dY

[Figure: KDE estimates with N = 500 and N = 2000 samples]
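A sketch of the KDE route on a linear limit state in standard normal space, with assumed values β = 2 and α = (0.6, 0.8) (hypothetical, chosen so the exact conditional failure probability is known for comparison):

```python
import math, random, statistics

# Estimate P_f|Xi from one set of MCS samples via kernel density estimation
# of f_{Xi|F}, using the identity P_f|Xi = f_{Xi|F}(xi) * Pf / f(xi).
random.seed(6)
N, beta, a = 100000, 2.0, (0.6, 0.8)
z = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]
fail = [z1 for z1, z2 in z if a[0] * z1 + a[1] * z2 >= beta]   # z1 of failed samples
Pf = len(fail) / N

# Gaussian KDE for f_{Z1|F} with Silverman's rule-of-thumb bandwidth
h = 1.06 * statistics.pstdev(fail) * len(fail) ** -0.2
kde = lambda x: statistics.fmean(
    math.exp(-0.5 * ((x - s) / h) ** 2) / (h * math.sqrt(2 * math.pi)) for s in fail)
phi = lambda x: math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

# Conditional failure probability at a few points vs. the exact FORM result
pts = (0.0, 1.0, 2.0)
est = [Pf * kde(z1) / phi(z1) for z1 in pts]
exact = [0.5 * (1 + math.erf(((a[0] * z1 - beta) / a[1]) / math.sqrt(2))) for z1 in pts]
print([round(e, 4) for e in est], [round(e, 4) for e in exact])
```

Only the failed samples enter the KDE, which is why the method reuses an existing reliability run at no extra model-evaluation cost.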
Truss Model
Input variables
• x_1, x_2: load 1 (P_1) and load 2 (P_2), with correlation 0.6
• x_3 ~ x_27: strength of each member, lognormal
[Figures: samples in x-space and u-space]
Examples
• Structural model: Shear building (OpenSees)
• Input parameters

  Name          Mean  C.O.V
  w             100   0.1
  wR            50    0.1
  k             326   0.1
  Fy            50    0.1
  alpha         0.2   0.1
  factor (PGA)  0.1   0.1

• Excitation: Rinaldi near-field record
• Steel02 material
[Figure: shear-building model with story weights wR, w6, w5, w4, w3, w2]
Nonlinear behavior
• Hysteresis curves for the Rinaldi record UQ
[Figure: story-level hysteresis curves for stories w2 through wR]
Examples
[Figure panels: node2 disp, node2 acc, node7 disp, node7 acc]
Examples: Parameter selection in FE model updating
GSA is performed for 15 parameters.
"(…) Parameters 3, 6, 7, 8, 13, and 14 should be excluded from the parameter candidates because they have little influence over the objective function."
Wan, H.P. and Ren, W.X., 2015. Parameter selection in finite-element-model updating by global sensitivity analysis using Gaussian process metamodel. Journal of Structural Engineering, 141(6), p.04014164.
Examples: Reducing the complexity of multi-objective optimization
Multi-objective Water Distribution System Optimization
• 21 pipes in a system
• 16 retrofitting options for each pipe:
  - 15 available diameters ranging from 0.914 to 5.182 m
  - Do nothing
• Conflicting objectives:
  - Cost: capital cost (pipes, tanks, and pumps) + operating cost during a design period
  - Performance: e.g., surplus power energy per unit weight
Fu, G., Kapelan, Z. and Reed, P., 2012. Reducing the complexity of multiobjective water distribution system optimization through global sensitivity analysis. Journal of Water Resources Planning and Management, 138(3), pp.196-207.
Examples: Multi-objective Water Distribution System Optimization
[Figures: best Pareto fronts; first-order and total-order indices highlighting critical components; convergence rate of the simplified and preconditioned optimization problems]