
INDIAN INSTITUTE OF SCIENCE

STOCHASTIC HYDROLOGY
Lecture – 21
Course Instructor: Prof. P. P. MUJUMDAR
Department of Civil Engg., IISc.
Summary of the previous lecture

– Case study – 3: Monthly streamflows at KRS reservoir
  • Validation of the model
– Case study – 4: Monthly streamflow of a river
  • Plots of time series, correlogram, partial
    autocorrelation function and power spectrum
  • Candidate ARMA models
    – Log likelihood
    – Mean square error
    – Validation test (residual mean)

2  
CASE STUDIES -
ARMA MODELS

3  
Case study – 5
Sakleshpur Annual Rainfall Data (1901-2002)

[Time series plot: rainfall in mm vs. time (years)]
4  
Case study – 5 (Contd.)
Correlogram
$$c_k = \frac{1}{N} \sum_{t=1}^{N-k} \left( X_t - \bar{X} \right)\left( X_{t+k} - \bar{X} \right)$$

$$r_k = \frac{c_k}{c_0}, \qquad c_0 = S_X^2$$
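A minimal Python sketch of these formulas, assuming `numpy` is available (function and variable names are illustrative, not from the lecture):

```python
import numpy as np

def correlogram(x, max_lag):
    """Sample autocorrelation r_k = c_k / c_0 for lags 1..max_lag,
    following the formulas above (c_0 equals the sample variance S_X^2)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    x_bar = x.mean()
    c0 = np.sum((x - x_bar) ** 2) / N            # c_0 = S_X^2
    r = np.empty(max_lag)
    for k in range(1, max_lag + 1):
        ck = np.sum((x[:N - k] - x_bar) * (x[k:] - x_bar)) / N
        r[k - 1] = ck / c0
    return r
```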

5  
Case study – 5 (Contd.)
PAC function and power spectrum

[Plots: partial autocorrelation function and power spectrum (y-axis scale ×10^5)]

Power spectrum (periodogram) ordinates:

$$I(\omega_k) = \frac{N}{2}\left[ \alpha_k^2 + \beta_k^2 \right], \qquad \omega_k = \frac{2\pi k}{N}$$

$$\alpha_k = \frac{2}{N} \sum_{t=1}^{N} x_t \cos\left( 2\pi f_k t \right), \qquad \beta_k = \frac{2}{N} \sum_{t=1}^{N} x_t \sin\left( 2\pi f_k t \right)$$

Partial autocorrelations from the Yule-Walker equations:

$$P_p \, \phi_p = \rho_p$$
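A corresponding sketch of the periodogram computation from these formulas, with frequencies $f_k = k/N$ (illustrative code, assuming `numpy`):

```python
import numpy as np

def periodogram(x):
    """Periodogram ordinates I(omega_k) = (N/2)(alpha_k^2 + beta_k^2)
    at frequencies f_k = k/N, per the formulas above."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    t = np.arange(1, N + 1)
    I = np.empty(N // 2)
    for k in range(1, N // 2 + 1):
        f_k = k / N
        alpha_k = (2.0 / N) * np.sum(x * np.cos(2 * np.pi * f_k * t))
        beta_k = (2.0 / N) * np.sum(x * np.sin(2 * np.pi * f_k * t))
        I[k - 1] = (N / 2.0) * (alpha_k ** 2 + beta_k ** 2)
    return I
```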

6  
Case study – 5 (Contd.)
Model Likelihood
AR(1) 9.078037
AR(2) 8.562427
AR(3) 8.646781
AR(4) 9.691461
AR(5) 9.821681
AR(6) 9.436822
ARMA(1,1) 8.341717
ARMA(1,2) 8.217627
ARMA(2,1) 7.715415
ARMA(2,2) 5.278434
ARMA(3,1) 6.316174
ARMA(3,2) 6.390390
7  
Case study – 5 (Contd.)
• ARMA(5,0) is selected, as it has the highest likelihood value
• The parameters for the selected model are as
follows
φ1 = 0.40499
φ2 = 0.15223
φ3 = -0.02427
φ4 = -0.2222
φ5 = 0.083435
Constant = -0.000664
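The likelihood values in the table come from the lecture's own computations; as a rough sketch of the same selection procedure, one could fit the candidate orders with `statsmodels` (an assumed tool, not the course's software) and keep the order with the highest log-likelihood:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Candidate (p, q) orders from the table above.
orders = [(p, 0) for p in range(1, 7)] + \
         [(1, 1), (1, 2), (2, 1), (2, 2), (3, 1), (3, 2)]

def select_by_likelihood(x, orders):
    """Fit each candidate ARMA(p, q) and return the order with the
    highest log-likelihood."""
    best_order, best_llf = None, -np.inf
    for p, q in orders:
        try:
            res = ARIMA(x, order=(p, 0, q)).fit()
        except Exception:
            continue                      # skip orders that fail to converge
        if res.llf > best_llf:
            best_order, best_llf = (p, q), res.llf
    return best_order, best_llf
```

The fitted parameter values will differ slightly from the slide's, since estimation details vary between packages.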

8  
Case study – 5 (Contd.)
• Significance of residual mean

Model        η(e)        t0.95(N)
ARMA(5,0)    0.000005    1.6601

Since η(e) < t0.95(N), the residual mean is not significantly different from zero.
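A hedged sketch of one common form of this test (the exact definition of η(e) used in the lecture may differ): compare |ē|·√N / s_e against the t critical value.

```python
import numpy as np
from scipy import stats

def residual_mean_test(e, alpha=0.05):
    """Test H0: residual mean = 0. Returns the statistic, the critical
    value t_{1-alpha}(N-1), and whether H0 is accepted."""
    e = np.asarray(e, dtype=float)
    N = len(e)
    eta = abs(e.mean()) * np.sqrt(N) / e.std(ddof=1)
    t_crit = stats.t.ppf(1 - alpha, df=N - 1)
    return eta, t_crit, eta < t_crit      # True: mean not significantly nonzero
```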

9  
Case study – 5 (Contd.)
Significance of periodicities:

Periodicity    η           F0.95(2, N-2)
1st            0.000       3.085
2nd            0.00432     3.085
3rd            0.0168      3.085
4th            0.0698      3.085
5th            0.000006    3.085
6th            0.117       3.085

In every case η is well below the critical F value, so none of the periodicities is significant.

10  
Case study – 5 (Contd.)
• Whittle's white noise test:

Model        η        F0.95(n1, N–n1)
ARMA(5,0)    0.163    1.783

Since η is below the critical value, the residuals pass the white noise test.

11  
Case study – 5 (Contd.)
Model MSE
AR(1) 1.180837
AR(2) 1.169667
AR(3) 1.182210
AR(4) 1.168724
AR(5) 1.254929
AR(6) 1.289385
ARMA(1,1) 1.171668
ARMA(1,2) 1.156298
ARMA(2,1) 1.183397
ARMA(2,2) 1.256068
ARMA(3,1) 1.195626
ARMA(3,2) 27.466087
12  
Case study – 5 (Contd.)
• ARMA(1, 2) is selected, as it has the least MSE value for one-step forecasting
• The parameters for the selected model are as
follows
φ1 = 0.35271

θ1 = 0.017124
θ2 = -0.216745
Constant = -0.009267
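As with the likelihood table, the MSE values above come from the lecture's own computations; a sketch of in-sample one-step-ahead forecast MSE with `statsmodels` (an assumed tool) might look like:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def one_step_mse(x, p, q):
    """In-sample one-step-ahead forecast MSE for an ARMA(p, q) fit."""
    x = np.asarray(x, dtype=float)
    res = ARIMA(x, order=(p, 0, q)).fit()
    pred = res.predict()                  # one-step-ahead in-sample predictions
    return float(np.mean((x - pred) ** 2))

# e.g. compare one_step_mse(x, 1, 2) against the other candidate orders
```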

13  
Case study – 5 (Contd.)
• Significance of residual mean

Model         η(e)       t0.95(N)
ARMA(1, 2)    -0.0026    1.6601

Since |η(e)| < t0.95(N), the residual mean is not significantly different from zero.

14  
Case study – 5 (Contd.)
Significance of periodicities:

Periodicity    η          F0.95(2, N-2)
1st            0.000      3.085
2nd            0.0006     3.085
3rd            0.0493     3.085
4th            0.0687     3.085
5th            0.0003     3.085
6th            0.0719     3.085

Again, η is below the critical F value in every case, so no periodicity is significant.

15  
Case study – 5 (Contd.)
• Whittle's white noise test:

Model         η         F0.95(n1, N–n1)
ARMA(1, 2)    0.3605    1.783

Since η is below the critical value, the residuals pass the white noise test.

16  
SUMMARY OF CASE STUDIES

17  
Summary of Case studies
Case study-1: Time series plot

[Time series plots: daily, monthly, and yearly rainfall data of Bangalore city]

18  
Summary of Case studies
Case study-1: Correlogram
[Correlograms (sample autocorrelation vs. lag): daily, monthly, and yearly rainfall data of Bangalore city]


19  
Summary of Case studies
Case study-1: Power spectrum

[Power spectra: daily, monthly, and yearly rainfall data of Bangalore city]


20  
Summary of Case studies
Time series plot
[Time series plots: 3. Monthly streamflow data for Cauvery (flow in cumec vs. time); 4. Monthly streamflow data of a river (flow in cumec vs. time in months); 5. Sakleshpur annual rainfall data]


21  
Summary of Case studies
Correlogram

[Correlograms: 3. Monthly streamflow data for Cauvery; 4. Monthly streamflow data of a river; 5. Sakleshpur annual rainfall data]


22  
Summary of Case studies
Power spectrum
[Power spectra: 3. Monthly streamflow data for Cauvery; 4. Monthly streamflow data of a river; 5. Sakleshpur annual rainfall data]


23  
Summary of Case studies
ARMA Models

3. Monthly streamflow data for Cauvery
   ARMA(4, 0) – for data generation
   ARMA(1, 0) – for one-step forecasting

4. Monthly streamflow data of a river
   ARMA(8, 0) – for both data generation and one-step forecasting

5. Sakleshpur annual rainfall data
   ARMA(5, 0) – for data generation
   ARMA(1, 2) – for one-step forecasting
24  
MARKOV CHAINS
Markov Chains

• A Markov chain is a stochastic process with the
  property that the value of the process Xt at time t
  depends only on its value at time t-1, and not on the
  sequence of earlier values (Xt-2, Xt-3, ..., X0) through
  which the process passed in arriving at Xt-1.

$$P\left[ X_t \mid X_{t-1}, X_{t-2}, \ldots, X_0 \right] = P\left[ X_t \mid X_{t-1} \right]$$

(single-step Markov chain)

26  
Markov Chains

P ⎡⎣ X t = a j X t −1 = ai ⎤⎦

• This conditional probability gives the probability that
  the process will be in state j at time t, given that it
  was in state i at time t-1.
• The conditional probability is independent of the
  states occupied prior to t-1.
• For example, if Xt-1 is a dry day, we would be
  interested in the probability that Xt is a dry day or a
  wet day.
• This probability is commonly called the transition
  probability.
27  
Markov Chains
$$P\left[ X_t = a_j \mid X_{t-1} = a_i \right] = P_{ij}^{t}$$

• Usually written as $P_{ij}^{t}$, indicating the probability of a
  step from ai to aj at time t.
• If Pij is independent of time, then the Markov chain
  is said to be homogeneous,

$$\text{i.e., } P_{ij}^{t} = P_{ij}^{t+\tau} \qquad \forall\, t \text{ and } \tau$$

  that is, the transition probabilities remain the same
  across time.
28  
Markov Chains
Transition Probability Matrix (TPM):

$$P = \begin{bmatrix}
P_{11} & P_{12} & P_{13} & \cdots & P_{1m} \\
P_{21} & P_{22} & P_{23} & \cdots & P_{2m} \\
P_{31} &        &        &        &        \\
\vdots &        &        & \ddots &        \\
P_{m1} & P_{m2} &        & \cdots & P_{mm}
\end{bmatrix}_{m \times m}$$

Rows are indexed by the state i at time t; columns by the state j at time t+1.
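To make the single-step dependence concrete, a small simulation sketch (illustrative code, assuming `numpy`): each new state is drawn using only the current state's row of the TPM.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_chain(P, p0, n_steps):
    """Simulate a homogeneous single-step Markov chain with TPM P
    (m x m) and initial probability vector p0 (length m)."""
    P = np.asarray(P)
    p0 = np.asarray(p0)
    state = rng.choice(len(p0), p=p0)               # draw X_0 from p(0)
    path = [state]
    for _ in range(n_steps):
        state = rng.choice(P.shape[0], p=P[state])  # depends on X_{t-1} only
        path.append(state)
    return path
```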
29  
Markov Chains
$$\sum_{j=1}^{m} P_{ij} = 1 \qquad \forall\, i$$

• Elements in any row of the TPM sum to unity
• The TPM can be estimated from observed data by
  enumerating the number of times the observed
  data went from state i to state j (see the sketch below)
• pj(n) is the probability of being in state j in time
  step n
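A minimal sketch of that counting procedure (illustrative names; states are coded 0..m-1):

```python
import numpy as np

def estimate_tpm(states, m):
    """Estimate the TPM by counting i -> j transitions in an observed
    state sequence and normalizing each row to sum to unity."""
    counts = np.zeros((m, m))
    for i, j in zip(states[:-1], states[1:]):
        counts[i, j] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    # Rows with no observed transitions are left as zeros.
    return np.divide(counts, row_sums,
                     out=np.zeros_like(counts), where=row_sums > 0)
```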

30  
Markov Chains
• pj(0) is the probability of being in state j in period t = 0.

$$p^{(0)} = \left[ p_1^{(0)} \;\; p_2^{(0)} \;\; \cdots \;\; p_m^{(0)} \right] \quad \text{.... probability vector } (1 \times m) \text{ at time } 0$$

$$p^{(n)} = \left[ p_1^{(n)} \;\; p_2^{(n)} \;\; \cdots \;\; p_m^{(n)} \right] \quad \text{.... probability vector } (1 \times m) \text{ at time } n$$

• If p(0) and the TPM are given,

$$p^{(1)} = p^{(0)} \times P$$

31  
Markov Chains
$$p^{(1)} = \left[ p_1^{(0)} \;\; p_2^{(0)} \;\; \cdots \;\; p_m^{(0)} \right]
\begin{bmatrix}
P_{11} & P_{12} & P_{13} & \cdots & P_{1m} \\
P_{21} & P_{22} & P_{23} & \cdots & P_{2m} \\
P_{31} &        &        &        &        \\
\vdots &        &        & \ddots &        \\
P_{m1} & P_{m2} &        & \cdots & P_{mm}
\end{bmatrix}$$

$$p_1^{(1)} = p_1^{(0)} P_{11} + p_2^{(0)} P_{21} + \cdots + p_m^{(0)} P_{m1} \quad \text{.... probability of going to state 1}$$

$$p_2^{(1)} = p_1^{(0)} P_{12} + p_2^{(0)} P_{22} + \cdots + p_m^{(0)} P_{m2} \quad \text{.... probability of going to state 2}$$

And so on...

32  
Markov Chains
Therefore

$$p^{(1)} = \left[ p_1^{(1)} \;\; p_2^{(1)} \;\; \cdots \;\; p_m^{(1)} \right] \quad (1 \times m)$$

$$p^{(2)} = p^{(1)} \times P = p^{(0)} \times P \times P = p^{(0)} \times P^2$$

In general,

$$p^{(n)} = p^{(0)} \times P^n$$
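Numerically, this is just a vector-matrix power; a quick check with `numpy` (using the TPM from Example 1 below):

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])                 # TPM from Example 1 below
p0 = np.array([1.0, 0.0])                  # start in state 1 with certainty

p_n = p0 @ np.linalg.matrix_power(P, 100)  # p(n) = p(0) P^n
print(p_n)                                 # approx. [0.5714, 0.4286]
```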
33  
Markov Chains
• As the process advances in time, pj(n) becomes less
  dependent on p(0)
• The probability of being in state j after a large
  number of time steps becomes independent of the
  initial state of the process.
• The process reaches a steady state at large n:

$$p = p \times P$$

• As the process reaches the steady state, the
  probability vector remains constant.

34  
Example – 1
Consider the TPM for a 2-state first-order homogeneous
Markov chain as

$$\text{TPM} = \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix}$$

State 1 is a non-rainy day and state 2 is a rainy day.

Obtain the
1. probability that day 1 is a non-rainy day, given day 0 is a rainy day
2. probability that day 2 is a rainy day, given day 0 is a non-rainy day
3. probability that day 100 is a rainy day, given day 0 is a non-rainy day
35  
Example – 1 (contd.)
1. Probability that day 1 is a non-rainy day, given day 0 is a rainy day:

$$\text{TPM} = \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix}$$

with rows and columns ordered (no rain, rain). The required probability is
P(no rain on day 1 | rain on day 0) = P21 = 0.4.

2. Probability that day 2 is a rainy day, given day 0 is a non-rainy day:

$$p^{(2)} = p^{(0)} \times P^2$$
36  
Example – 1 (contd.)
( 2) ⎡0.7 0.3⎤
p = [0.7 0.3] ⎢ ⎥
⎣ 0.4 0.6 ⎦
= [0.61 0.39]
The probability is 0.39

3. Probability that day 100 is a rainy day, given day 0 is a non-rainy day:

$$p^{(n)} = p^{(0)} \times P^n$$

37  
Example – 1 (contd.)
$$P^2 = P \times P = \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix} \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix} = \begin{bmatrix} 0.61 & 0.39 \\ 0.52 & 0.48 \end{bmatrix}$$

$$P^4 = P^2 \times P^2 = \begin{bmatrix} 0.5749 & 0.4251 \\ 0.5668 & 0.4332 \end{bmatrix}$$

$$P^8 = P^4 \times P^4 = \begin{bmatrix} 0.5715 & 0.4285 \\ 0.5714 & 0.4286 \end{bmatrix}$$

$$P^{16} = P^8 \times P^8 = \begin{bmatrix} 0.5714 & 0.4286 \\ 0.5714 & 0.4286 \end{bmatrix}$$

38  
Example – 1 (contd.)
Steady-state probability:

$$p = \begin{bmatrix} 0.5714 & 0.4286 \end{bmatrix}$$

For the steady state,

$$p = p \times P^n = \begin{bmatrix} 0.5714 & 0.4286 \end{bmatrix} \begin{bmatrix} 0.5714 & 0.4286 \\ 0.5714 & 0.4286 \end{bmatrix} = \begin{bmatrix} 0.5714 & 0.4286 \end{bmatrix}$$

So the probability that day 100 is a rainy day is approximately 0.4286.
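The same steady-state vector can be checked directly: it is the left eigenvector of P for eigenvalue 1, normalized to sum to one (a sketch with `numpy`):

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Left eigenvector of P for eigenvalue 1 = eigenvector of P.T.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi /= pi.sum()
print(pi)    # -> [0.5714  0.4286], matching the repeated squaring above
```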

39  
