Lecture 21
STOCHASTIC HYDROLOGY
Course Instructor: Prof. P. P. MUJUMDAR
Department of Civil Engg., IISc.
Summary of the previous lecture
CASE STUDIES - ARMA MODELS
Case study – 5
Sakleshpur Annual Rainfall Data (1901-2002)
[Figure: annual rainfall time series, rainfall in mm vs. time (years)]
Case study – 5 (Contd.)
Correlogram

$$c_k = \frac{1}{N}\sum_{t=1}^{n-k}\left(X_t - \bar{X}\right)\left(X_{t+k} - \bar{X}\right)$$

$$r_k = \frac{c_k}{c_0}, \qquad c_0 = S_X^2$$
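The correlogram above is straightforward to compute directly; a minimal sketch in Python (the function name `correlogram` and the use of NumPy are illustrative, not from the lecture):

```python
import numpy as np

def correlogram(x, max_lag):
    """Sample autocorrelation r_k = c_k / c_0, with
    c_k = (1/N) * sum_{t=1}^{N-k} (x_t - xbar)(x_{t+k} - xbar)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    dev = x - x.mean()
    c0 = np.sum(dev * dev) / N   # c_0 equals the sample variance S_X^2
    r = []
    for k in range(1, max_lag + 1):
        ck = np.sum(dev[:N - k] * dev[k:]) / N
        r.append(ck / c0)
    return np.array(r)
```

Plotting `r` against the lag k reproduces the correlogram shown for the Sakleshpur series.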
Case study – 5 (Contd.)
PAC function and power spectrum

Partial Auto Correlation (PAC) function, from the Yule-Walker equations:

$$P_p \, \phi_p = \rho_p$$

Power spectrum:

$$I(\omega_k) = \frac{N}{2}\left[\alpha_k^2 + \beta_k^2\right], \qquad \omega_k = \frac{2\pi k}{N}$$

$$\alpha_k = \frac{2}{N}\sum_{t=1}^{N} x_t \cos\left(2\pi f_k t\right), \qquad \beta_k = \frac{2}{N}\sum_{t=1}^{N} x_t \sin\left(2\pi f_k t\right)$$

[Plots: Auto Correlation function, Partial Auto Correlation function, and power spectrum (×10^5)]
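The power spectrum above can be sketched as follows; `periodogram` is an illustrative helper (an assumption, not a function named in the lecture), evaluating the harmonics k = 1 … N/2 with f_k = k/N:

```python
import numpy as np

def periodogram(x):
    """Power spectrum I(omega_k) = (N/2) * (alpha_k^2 + beta_k^2)
    at the harmonic frequencies f_k = k/N, omega_k = 2*pi*k/N."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    t = np.arange(1, N + 1)
    I = []
    for k in range(1, N // 2 + 1):
        fk = k / N
        alpha = (2.0 / N) * np.sum(x * np.cos(2 * np.pi * fk * t))
        beta  = (2.0 / N) * np.sum(x * np.sin(2 * np.pi * fk * t))
        I.append((N / 2.0) * (alpha**2 + beta**2))
    return np.array(I)
```

A pure cosine at harmonic k produces a single spike of height N/2 at that harmonic, which is the basis of the significance-of-periodicities check that follows.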
Case study – 5 (Contd.)
Model Likelihood
AR(1) 9.078037
AR(2) 8.562427
AR(3) 8.646781
AR(4) 9.691461
AR(5) 9.821681
AR(6) 9.436822
ARMA(1,1) 8.341717
ARMA(1,2) 8.217627
ARMA(2,1) 7.715415
ARMA(2,2) 5.278434
ARMA(3,1) 6.316174
ARMA(3,2) 6.390390
Case study – 5 (Contd.)
• ARMA(5,0) is selected with highest likelihood
value
• The parameters for the selected model are as
follows
φ1 = 0.40499
φ2 = 0.15223
φ3 = -0.02427
φ4 = -0.2222
φ5 = 0.083435
Constant = -0.000664
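Given the fitted coefficients, a synthetic series can be generated from the AR(5) recursion. A sketch, assuming a standardized series and unit-variance normal innovations (both assumptions; the slide states neither):

```python
import numpy as np

# Fitted ARMA(5,0) coefficients from the case study (series assumed
# standardized, so the constant is near zero).
phi = [0.40499, 0.15223, -0.02427, -0.2222, 0.083435]
const = -0.000664

def simulate_ar5(n, phi, const, sigma_e=1.0, seed=0, burn=200):
    """Generate X_t = const + sum_j phi_j * X_{t-j} + e_t,
    discarding a burn-in so the series forgets its zero start."""
    rng = np.random.default_rng(seed)
    p = len(phi)
    x = np.zeros(n + burn)
    for t in range(p, n + burn):
        x[t] = const + sum(phi[j] * x[t - 1 - j] for j in range(p)) \
               + rng.normal(0, sigma_e)
    return x[burn:]
```

Such synthetic traces are what make a fitted ARMA model useful for generating equally likely rainfall sequences.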
Case study – 5 (Contd.)
• Significance of residual mean
Case study – 5 (Contd.)
Significance of periodicities:
Case study – 5 (Contd.)
• Whittle's white noise test:
Case study – 5 (Contd.)
Model MSE
AR(1) 1.180837
AR(2) 1.169667
AR(3) 1.182210
AR(4) 1.168724
AR(5) 1.254929
AR(6) 1.289385
ARMA(1,1) 1.171668
ARMA(1,2) 1.156298
ARMA(2,1) 1.183397
ARMA(2,2) 1.256068
ARMA(3,1) 1.195626
ARMA(3,2) 27.466087
Case study – 5 (Contd.)
• ARMA(1,2) is selected, with the least MSE value for one-step forecasting
• The parameters of the selected model are as follows:
φ1 = 0.35271
θ1 = 0.017124
θ2 = -0.216745
Constant = -0.009267
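With the fitted ARMA(1,2) parameters, the one-step forecast can be written out directly. The MA sign convention used below (X_t = c + φ1 X_{t-1} + e_t − θ1 e_{t-1} − θ2 e_{t-2}, Box-Jenkins style) is an assumption, since the slide does not state it:

```python
# Fitted ARMA(1,2) parameters (one-step forecasting model).
phi1, theta1, theta2, const = 0.35271, 0.017124, -0.216745, -0.009267

def forecast_one_step(x_t, e_t, e_tm1):
    """One-step forecast X_hat_{t+1} = const + phi1*x_t
    - theta1*e_t - theta2*e_{t-1}; e's are past residuals.
    The minus signs follow the Box-Jenkins convention (assumed here)."""
    return const + phi1 * x_t - theta1 * e_t - theta2 * e_tm1
```

The MSE values in the table above come from comparing such forecasts against the observed values.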
Case study – 5 (Contd.)
• Significance of residual mean
Case study – 5 (Contd.)
Significance of periodicities:
Case study – 5 (Contd.)
• Whittle's white noise test:
SUMMARY OF CASE STUDIES
Summary of Case studies
Case study-1: Time series plots
[Figures: 1. daily rainfall data of Bangalore city and 2. monthly rainfall data of Bangalore city (time series plots); sample autocorrelation vs. lag for the daily data]
[Figures: 3. monthly stream flow data for Cauvery and 4. monthly stream flow data of a river (flow in cumec vs. time in months)]
$$P\left[X_t \mid X_{t-1}, X_{t-2}, \ldots, X_0\right] = P\left[X_t \mid X_{t-1}\right]$$
…. Single step Markov chain
Markov Chains
$$P\left[X_t = a_j \mid X_{t-1} = a_i\right]$$
$$\sum_{j=1}^{m} P_{ij} = 1 \qquad \forall\, i$$
Markov Chains
• p_j(0) is the probability of being in state j in period t = 0.
$$p^{(n)} = \left[\, p_1^{(n)} \;\; p_2^{(n)} \;\; \ldots \;\; p_m^{(n)} \,\right]$$ …. 1×m probability vector at time n
• If p(0) and the TPM P are given,
$$p^{(1)} = p^{(0)} \times P$$
Markov Chains
$$p^{(1)} = \left[\, p_1^{(0)} \;\; p_2^{(0)} \;\; \ldots \;\; p_m^{(0)} \,\right]
\begin{bmatrix}
P_{11} & P_{12} & P_{13} & \ldots & P_{1m} \\
P_{21} & P_{22} & P_{23} & \ldots & P_{2m} \\
P_{31} & & & & \\
\vdots & & & & \\
P_{m1} & P_{m2} & & & P_{mm}
\end{bmatrix}$$
Second element:
$$p_1^{(0)} P_{12} + p_2^{(0)} P_{22} + \ldots + p_m^{(0)} P_{m2}$$ …. Probability of going to state 2
And so on…
Markov Chains
Therefore,
$$p^{(2)} = p^{(1)} \times P = p^{(0)} \times P \times P = p^{(0)} \times P^2$$
In general,
$$p^{(n)} = p^{(0)} \times P^n$$
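The relation p(n) = p(0) × P^n maps directly onto a matrix power. A sketch using the 2-state TPM of Example 1, with an assumed initial vector p(0) = [1, 0] (chain starting in state 1):

```python
import numpy as np

# 2-state TPM (the matrix used in Example 1); p0 = [1, 0] is an
# assumed initial condition: the chain starts in state 1.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
p0 = np.array([1.0, 0.0])

# p^(n) = p^(0) x P^n, here for n = 2
p2 = p0 @ np.linalg.matrix_power(P, 2)
```

With this p(0), p(2) is simply the first row of P².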
Markov Chains
• As the process advances in time, p_j(n) becomes less dependent on p(0)
• The probability of being in state j after a large number of time steps becomes independent of the initial state of the process.
• The process reaches a steady state at large n:
$$p = p \times P$$
• As the process reaches steady state, the probability vector remains constant.
Example – 1
Consider the TPM for a 2-state first order homogeneous Markov chain as
$$TPM = \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix}$$
$$p^{(2)} = p^{(0)} \times P^2$$
Example – 1 (contd.)
$$p^{(2)} = \left[\, 0.7 \;\; 0.3 \,\right] \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix} = \left[\, 0.61 \;\; 0.39 \,\right]$$
The probability is 0.39
$$p^{(n)} = p^{(0)} \times P^n$$
Example – 1 (contd.)
$$P^2 = P \times P = \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix}\begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix} = \begin{bmatrix} 0.61 & 0.39 \\ 0.52 & 0.48 \end{bmatrix}$$
$$P^4 = P^2 \times P^2 = \begin{bmatrix} 0.5749 & 0.4251 \\ 0.5668 & 0.4332 \end{bmatrix}$$
$$P^8 = P^4 \times P^4 = \begin{bmatrix} 0.5715 & 0.4285 \\ 0.5714 & 0.4286 \end{bmatrix}$$
$$P^{16} = P^8 \times P^8 = \begin{bmatrix} 0.5714 & 0.4286 \\ 0.5714 & 0.4286 \end{bmatrix}$$
Example – 1 (contd.)
Steady state probability
$$p = \left[\, 0.5714 \;\; 0.4286 \,\right]$$
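The steady-state vector can be checked numerically in two ways: by powering P until its rows converge, as done above, or by solving p = pP together with the normalization Σ p_i = 1. A sketch using NumPy:

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Steady state from repeated multiplication: rows of P^16 converge to p.
p_power = np.linalg.matrix_power(P, 16)[0]

# Steady state directly: p = p P means (P^T - I) p = 0; replace the last
# (redundant) equation with the normalization sum(p) = 1 and solve.
A = np.vstack([(P.T - np.eye(2))[:-1], np.ones(2)])
b = np.array([0.0, 1.0])
p_exact = np.linalg.solve(A, b)
```

The exact solution is p = [4/7, 3/7], which rounds to the [0.5714, 0.4286] obtained from P^16.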