Time Series Analysis

VI) Time series analysis

Step 1: Check whether there is a periodic component in the monthly temperature; if yes, remove it.
The monthly temperature data was checked for seasonality/periodicity using the web-based OS X
software for spectral estimation, prediction, and trend/noise extraction, with the seasonal cycle set to 12.
From the graph it is clear that there is a periodic component in the series. The software is located at:
http://smoothforecast.com/SmoothForecastPHP/index.php. In addition, a periodogram was
obtained from an advanced MS Excel time series analysis add-in, as shown below.

[Figure: Periodogram of the residual series, spectral power plotted against frequency on [0, Pi].]

[Figure: Residual series (T - Tbar), the stationary series, plotted against time step (0 to 140).]

It is clear from the plot of the monthly temperature data that there is a periodic variation. This
periodicity can be seen even more clearly by calculating and plotting the autocorrelation up to 12 lags or more.

The periodic component of the monthly temperature can be modelled with a sinusoidal model of the form

T̂(t) = a + b·sin[(2π/12)(t - c)]

where t is the time in months, i.e., the second column of the data file. The parameter a equals the mean
monthly temperature, b is an amplitude parameter, and c is a phase parameter. For our
monthly temperature data, brief manual calibration gives a = 5.32, b = 10 and c = 4.1. You
can use these values or derive your own.
The result of the sine-curve model is as follows, where the blue line is the observed temperature and the
red line the modelled temperature.

Removal of Periodicity:
Removing the periodic component from the temperature series is carried out by subtracting the fitted
sinusoid from the observations:

X(t) = T(t) - T̂(t)

The results are shown in the table below; the predicted values of temperature were computed using the
model:

T̂(t) = a + b·sin[(2π/12)(t - c)] = 5.32 + 10·sin[(2π/12)(t - 4.1)]
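As a minimal sketch (Python, using the calibrated values a = 5.32, b = 10 and c = 4.1 from the text; the function names are illustrative), the predicted seasonal values and residuals can be computed as:

```python
import math

A, B, C = 5.32, 10.0, 4.1  # calibrated mean, amplitude and phase from the text

def predicted(t):
    """Seasonal model T_hat(t) = a + b*sin((2*pi/12)*(t - c)), t = month number."""
    return A + B * math.sin(2 * math.pi / 12 * (t - C))

def residual(t, temp):
    """Stationary residual X(t) = T(t) - T_hat(t)."""
    return temp - predicted(t)

# January 1981: observed -4.99; the model gives about -4.67, residual about -0.32
print(round(predicted(1), 2), round(residual(1, -4.99), 2))
```

These values reproduce the first row of the table below.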

Predicted values and residuals after removal of the periodic component:

Year | Month | No | Temp T(t) | Predicted T̂(t) | Residual X(t) = T(t) - T̂(t)
1981 | 1 | 1 | -4.99 | -4.67 | -0.32
1981 | 2 | 2 | -3.84 | -3.59 | -0.25
1981 | 3 | 3 | -2.75 | -0.13 | -2.62
1981 | 4 | 4 | 3.83 | 4.80 | -0.97
1981 | 5 | 5 | 11.65 | 9.86 | 1.79
1981 | 6 | 6 | 13.1 | 13.71 | -0.61
1981 | 7 | 7 | 16.34 | 15.31 | 1.03
1981 | 8 | 8 | 14.44 | 14.23 | 0.21
1981 | 9 | 9 | 10.76 | 10.77 | -0.01
1981 | 10 | 10 | 5.26 | 5.84 | -0.58
1981 | 11 | 11 | -0.29 | 0.78 | -1.07
1981 | 12 | 12 | -8.34 | -3.07 | -5.27
1982 | 1 | 13 | -9.58 | -4.67 | -4.91
1982 | 2 | 14 | -4.76 | -3.59 | -1.17
1982 | 3 | 15 | 0.98 | -0.13 | 1.11
1982 | 4 | 16 | 4.01 | 4.80 | -0.79
1982 | 5 | 17 | 9.88 | 9.86 | 0.02
1982 | 6 | 18 | 12.51 | 13.71 | -1.20
1982 | 7 | 19 | 17.28 | 15.31 | 1.97
1982 | 8 | 20 | 16.02 | 14.23 | 1.79
1982 | 9 | 21 | 10.87 | 10.77 | 0.10
1982 | 10 | 22 | 6.08 | 5.84 | 0.24
1982 | 11 | 23 | 3.25 | 0.78 | 2.47
1982 | 12 | 24 | -1.12 | -3.07 | 1.95
1983 | 1 | 25 | -0.47 | -4.67 | 4.20
1983 | 2 | 26 | -5.2 | -3.59 | -1.61
1983 | 3 | 27 | -0.95 | -0.13 | -0.82
1983 | 4 | 28 | 3.87 | 4.80 | -0.93
1983 | 5 | 29 | 10.59 | 9.86 | 0.73
1983 | 6 | 30 | 14.08 | 13.71 | 0.37
1983 | 7 | 31 | 17.33 | 15.31 | 2.02
1983 | 8 | 32 | 16.17 | 14.23 | 1.94
1983 | 9 | 33 | 11.33 | 10.77 | 0.56
1983 | 10 | 34 | 6.48 | 5.84 | 0.64
1983 | 11 | 35 | 0.08 | 0.78 | -0.70
1983 | 12 | 36 | -2.56 | -3.07 | 0.51
1984 | 1 | 37 | -5.43 | -4.67 | -0.76
1984 | 2 | 38 | -3.19 | -3.59 | 0.40
1984 | 3 | 39 | -3.61 | -0.13 | -3.48
1984 | 4 | 40 | 4.92 | 4.80 | 0.12
1984 | 5 | 41 | 11.55 | 9.86 | 1.69
1984 | 6 | 42 | 13.57 | 13.71 | -0.14
1984 | 7 | 43 | 15.25 | 15.31 | -0.06
1984 | 8 | 44 | 15.37 | 14.23 | 1.14
1984 | 9 | 45 | 9.87 | 10.77 | -0.90
1984 | 10 | 46 | 7.8 | 5.84 | 1.96
1984 | 11 | 47 | 2.41 | 0.78 | 1.63
1984 | 12 | 48 | -0.15 | -3.07 | 2.92
1985 | 1 | 49 | -10.87 | -4.67 | -6.20
1985 | 2 | 50 | -13.57 | -3.59 | -9.98
1985 | 3 | 51 | -1.65 | -0.13 | -1.52
1985 | 4 | 52 | 1.43 | 4.80 | -3.37
1985 | 5 | 53 | 9.82 | 9.86 | -0.04
1985 | 6 | 54 | 14.03 | 13.71 | 0.32
1985 | 7 | 55 | 15.95 | 15.31 | 0.64
1985 | 8 | 56 | 14.96 | 14.23 | 0.73
1985 | 9 | 57 | 9.19 | 10.77 | -1.58
1985 | 10 | 58 | 6.75 | 5.84 | 0.91
1985 | 11 | 59 | -1.21 | 0.78 | -1.99
1985 | 12 | 60 | -7.55 | -3.07 | -4.48
1986 | 1 | 61 | -6.98 | -4.67 | -2.31
1986 | 2 | 62 | -10.25 | -3.59 | -6.66
1986 | 3 | 63 | 0.33 | -0.13 | 0.46
1986 | 4 | 64 | 1.91 | 4.80 | -2.89
1986 | 5 | 65 | 12.36 | 9.86 | 2.50
1986 | 6 | 66 | 16.1 | 13.71 | 2.39
1986 | 7 | 67 | 16.47 | 15.31 | 1.16
1986 | 8 | 68 | 12.77 | 14.23 | -1.46
1986 | 9 | 69 | 7.55 | 10.77 | -3.22
1986 | 10 | 70 | 5.92 | 5.84 | 0.08
1986 | 11 | 71 | 3.93 | 0.78 | 3.15
1986 | 12 | 72 | -2.86 | -3.07 | 0.21
1987 | 1 | 73 | -13.88 | -4.67 | -9.21
1987 | 2 | 74 | -5.73 | -3.59 | -2.14
1987 | 3 | 75 | -5.32 | -0.13 | -5.19
1987 | 4 | 76 | 4.32 | 4.80 | -0.48
1987 | 5 | 77 | 8.2 | 9.86 | -1.66
1987 | 6 | 78 | 11.85 | 13.71 | -1.86
1987 | 7 | 79 | 15.09 | 15.31 | -0.22
1987 | 8 | 80 | 12.19 | 14.23 | -2.04
1987 | 9 | 81 | 9.35 | 10.77 | -1.42
1987 | 10 | 82 | 7.29 | 5.84 | 1.45
1987 | 11 | 83 | 1.1 | 0.78 | 0.32
1987 | 12 | 84 | -2.89 | -3.07 | 0.18
1988 | 1 | 85 | -0.31 | -4.67 | 4.36
1988 | 2 | 86 | -1.54 | -3.59 | 2.05
1988 | 3 | 87 | -2.59 | -0.13 | -2.46
1988 | 4 | 88 | 2.99 | 4.80 | -1.81
1988 | 5 | 89 | 12 | 9.86 | 2.14
1988 | 6 | 90 | 15.78 | 13.71 | 2.07
1988 | 7 | 91 | 17.41 | 15.31 | 2.10
1988 | 8 | 92 | 14.59 | 14.23 | 0.36
1988 | 9 | 93 | 11.65 | 10.77 | 0.88
1988 | 10 | 94 | 4.31 | 5.84 | -1.53
1988 | 11 | 95 | -2.83 | 0.78 | -3.61
1988 | 12 | 96 | -4.71 | -3.07 | -1.64
1989 | 1 | 97 | 1.98 | -4.67 | 6.65
1989 | 2 | 98 | 1.79 | -3.59 | 5.38
1989 | 3 | 99 | 2.57 | -0.13 | 2.70
1989 | 4 | 100 | 4.89 | 4.80 | 0.09
1989 | 5 | 101 | 11.86 | 9.86 | 2.00
1989 | 6 | 102 | 14.7 | 13.71 | 0.99
1989 | 7 | 103 | 16.74 | 15.31 | 1.43
1989 | 8 | 104 | 14.61 | 14.23 | 0.38
1989 | 9 | 105 | 11.67 | 10.77 | 0.90
1989 | 10 | 106 | 6.24 | 5.84 | 0.40
1989 | 11 | 107 | 0.83 | 0.78 | 0.05
1989 | 12 | 108 | -4.38 | -3.07 | -1.31
1990 | 1 | 109 | -0.61 | -4.67 | 4.06
1990 | 2 | 110 | 2.96 | -3.59 | 6.55
1990 | 3 | 111 | 3.6 | -0.13 | 3.73
1990 | 4 | 112 | 6.33 | 4.80 | 1.53
1990 | 5 | 113 | 11.79 | 9.86 | 1.93
1990 | 6 | 114 | 14.63 | 13.71 | 0.92
1990 | 7 | 115 | 15.35 | 15.31 | 0.04
1990 | 8 | 116 | 15.9 | 14.23 | 1.67
1990 | 9 | 117 | 9.38 | 10.77 | -1.39
1990 | 10 | 118 | 5.89 | 5.84 | 0.05
1990 | 11 | 119 | -0.18 | 0.78 | -0.96
1990 | 12 | 120 | -0.98 | -3.07 | 2.09
1991 | 1 | 121 | -2.12 | -4.67 | 2.55
1991 | 2 | 122 | -4.25 | -3.59 | -0.66
1991 | 3 | 123 | 1.46 | -0.13 | 1.59
1991 | 4 | 124 | 5.03 | 4.80 | 0.23
1991 | 5 | 125 | 8.69 | 9.86 | -1.17
1991 | 6 | 126 | 11.66 | 13.71 | -2.05
1991 | 7 | 127 | 17.66 | 15.31 | 2.35
1991 | 8 | 128 | 16.44 | 14.23 | 2.21
1991 | 9 | 129 | 10.63 | 10.77 | -0.14
1991 | 10 | 130 | 6.18 | 5.84 | 0.34
1991 | 11 | 131 | 2.33 | 0.78 | 1.55
1991 | 12 | 132 | -0.68 | -3.07 | 2.39

Step 2: Use the linear regression method with temperature (with the periodic component removed) as the
dependent variable and the sequence number (third column of your data file) as the independent variable to
check whether there is a trend in the monthly temperature; if yes, remove the trend.
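A minimal ordinary-least-squares sketch of this step (Python; the variable names are illustrative, not from the original spreadsheet):

```python
def ols_fit(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
        sum((xi - xbar) ** 2 for xi in x)
    a = ybar - b * xbar
    return a, b

# Usage: x = sequence numbers 1..132, y = deseasonalised residuals;
# the document's fit yields a = -0.8679, b = 0.0132.
```

Applied to the residual series against the sequence numbers, this reproduces the regression coefficients quoted below.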

[Figure: Line fit plot of the stationary (residual) series against sequence number (X Variable 1), showing observed Y, predicted Y, and the fitted trend line y = 0.0132x - 0.8679, R² = 0.0396.]

From the regression plot, it is observed that a = -0.8679 and b = 0.0132. Thus, the regression
equation used to predict values of x (the residual time series) from the sequence number Ti is:

X̂t = a + b·Ti = -0.8679 + 0.0132·Ti

The tabulated values are as shown in the following table.

Testing for the existence of trend:

Hypothesis to be tested:
H0: b = 0 (no trend)
Ha: b ≠ 0 (trend present)

The test statistic is:

t = |b| / s_b

where

s_b² = s² / Σ(Ti - T̄)²  and  s² = Σet² / (n - 2)

Using the sums tabulated below (Σet² = 803.733, Σ(Ti - T̄)² = 191,653, n = 132), s² = 6.183,
s_b = 0.00568, and the test statistic is t = 0.0132/0.00568 ≈ 2.32.

From the t-distribution tables, we read off the value t0.975,130 = 1.960. This value is less than the
computed test statistic. Thus we reject the null hypothesis that there is no trend at the chosen
significance level.
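The arithmetic of the t-test can be sketched as follows (Python), using the sums tabulated later in this section (Σe² = 803.733, Σ(Ti - T̄)² = 191,653, n = 132):

```python
import math

b = 0.0132     # fitted slope
sse = 803.733  # sum of squared regression residuals
sxx = 191_653  # sum of (Ti - Tbar)^2
n = 132

s2 = sse / (n - 2)        # residual variance estimate, about 6.183
sb = math.sqrt(s2 / sxx)  # standard error of the slope, about 0.00568
t = abs(b) / sb           # test statistic, about 2.32 > 1.960
print(round(s2, 3), round(sb, 5), round(t, 2))
```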
Alternate Tests:
The same data was subjected to the alternate Mann-Kendall trend test to supplement the
regression method in assessing possible trends. The results are summarized below.

Mann-Kendall trend test / two-tailed test on the residual series:

Kendall's tau         0.135
S                     1170.000
Var(S)                258419.333
p-value (two-tailed)  0.021
Alpha                 0.05

The exact p-value could not be computed; an approximation has been used to compute the p-value.

Test interpretation:
H0: There is no trend in the series
Ha: There is a trend in the series
As the computed p-value is lower than the significance level alpha=0.05, one should reject the null
hypothesis H0, and accept the alternative hypothesis Ha.
The risk to reject the null hypothesis H0 while it is true is lower than 2.15%.
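The Mann-Kendall statistics above can be sketched as follows (Python; this simple version assumes no ties in the data, which the spreadsheet tool may handle differently):

```python
import itertools

def mann_kendall_s(x):
    """S = sum over all pairs i < j of sign(x[j] - x[i])."""
    return sum((xj > xi) - (xj < xi) for xi, xj in itertools.combinations(x, 2))

def var_s(n):
    """Variance of S for n untied observations."""
    return n * (n - 1) * (2 * n + 5) / 18

# Toy usage: a strictly increasing series of length 4 attains the maximum S = 6.
print(mann_kendall_s([1, 2, 3, 4]), var_s(4))
```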

TREND ANALYSIS USING THE LINEAR REGRESSION METHOD


With T̄ = 66.5 and predicted values X̂t = -0.8679 + 0.0132·Ti:

No (Ti) | Year | Month | Temp T(t) | Stationary series Xt | Predicted X̂t | et = Xt - X̂t | et² | (Ti - T̄)²
1 | 1981 | 1 | -4.99 | -0.324 | -0.8547 | 0.531 | 0.282 | 4290.25
2 | 1981 | 2 | -3.84 | -0.250 | -0.8415 | 0.591 | 0.350 | 4160.25
3 | 1981 | 3 | -2.75 | -2.624 | -0.8283 | -1.795 | 3.223 | 4032.25
4 | 1981 | 4 | 3.83 | -0.967 | -0.8151 | -0.152 | 0.023 | 3906.25
5 | 1981 | 5 | 11.65 | 1.790 | -0.8019 | 2.592 | 6.719 | 3782.25
6 | 1981 | 6 | 13.1 | -0.607 | -0.7887 | 0.182 | 0.033 | 3660.25
7 | 1981 | 7 | 16.34 | 1.034 | -0.7755 | 1.809 | 3.273 | 3540.25
8 | 1981 | 8 | 14.44 | 0.210 | -0.7623 | 0.972 | 0.945 | 3422.25
9 | 1981 | 9 | 10.76 | -0.007 | -0.7491 | 0.742 | 0.551 | 3306.25
10 | 1981 | 10 | 5.26 | -0.584 | -0.7359 | 0.152 | 0.023 | 3192.25
11 | 1981 | 11 | -0.29 | -1.071 | -0.7227 | -0.348 | 0.121 | 3080.25
12 | 1981 | 12 | -8.34 | -5.274 | -0.7095 | -4.564 | 20.832 | 2970.25
13 | 1982 | 1 | -9.58 | -4.914 | -0.6963 | -4.217 | 17.786 | 2862.25
14 | 1982 | 2 | -4.76 | -1.170 | -0.6831 | -0.487 | 0.237 | 2756.25
15 | 1982 | 3 | 0.98 | 1.106 | -0.6699 | 1.776 | 3.155 | 2652.25
16 | 1982 | 4 | 4.01 | -0.787 | -0.6567 | -0.130 | 0.017 | 2550.25
17 | 1982 | 5 | 9.88 | 0.020 | -0.6435 | 0.664 | 0.440 | 2450.25
18 | 1982 | 6 | 12.51 | -1.197 | -0.6303 | -0.566 | 0.321 | 2352.25
19 | 1982 | 7 | 17.28 | 1.974 | -0.6171 | 2.591 | 6.712 | 2256.25
20 | 1982 | 8 | 16.02 | 1.790 | -0.6039 | 2.394 | 5.730 | 2162.25
21 | 1982 | 9 | 10.87 | 0.103 | -0.5907 | 0.694 | 0.482 | 2070.25
22 | 1982 | 10 | 6.08 | 0.236 | -0.5775 | 0.814 | 0.662 | 1980.25
23 | 1982 | 11 | 3.25 | 2.469 | -0.5643 | 3.034 | 9.203 | 1892.25
24 | 1982 | 12 | -1.12 | 1.946 | -0.5511 | 2.497 | 6.237 | 1806.25
25 | 1983 | 1 | -0.47 | 4.196 | -0.5379 | 4.734 | 22.413 | 1722.25
26 | 1983 | 2 | -5.2 | -1.610 | -0.5247 | -1.085 | 1.178 | 1640.25
27 | 1983 | 3 | -0.95 | -0.824 | -0.5115 | -0.312 | 0.097 | 1560.25
28 | 1983 | 4 | 3.87 | -0.927 | -0.4983 | -0.428 | 0.183 | 1482.25
29 | 1983 | 5 | 10.59 | 0.730 | -0.4851 | 1.215 | 1.477 | 1406.25
30 | 1983 | 6 | 14.08 | 0.373 | -0.4719 | 0.845 | 0.715 | 1332.25
31 | 1983 | 7 | 17.33 | 2.024 | -0.4587 | 2.482 | 6.162 | 1260.25
32 | 1983 | 8 | 16.17 | 1.940 | -0.4455 | 2.385 | 5.690 | 1190.25
33 | 1983 | 9 | 11.33 | 0.563 | -0.4323 | 0.996 | 0.991 | 1122.25
34 | 1983 | 10 | 6.48 | 0.636 | -0.4191 | 1.055 | 1.113 | 1056.25
35 | 1983 | 11 | 0.08 | -0.701 | -0.4059 | -0.295 | 0.087 | 992.25
36 | 1983 | 12 | -2.56 | 0.506 | -0.3927 | 0.899 | 0.808 | 930.25
37 | 1984 | 1 | -5.43 | -0.764 | -0.3795 | -0.384 | 0.148 | 870.25
38 | 1984 | 2 | -3.19 | 0.400 | -0.3663 | 0.766 | 0.587 | 812.25
39 | 1984 | 3 | -3.61 | -3.484 | -0.3531 | -3.131 | 9.801 | 756.25
40 | 1984 | 4 | 4.92 | 0.123 | -0.3399 | 0.463 | 0.215 | 702.25
41 | 1984 | 5 | 11.55 | 1.690 | -0.3267 | 2.017 | 4.068 | 650.25
42 | 1984 | 6 | 13.57 | -0.137 | -0.3135 | 0.177 | 0.031 | 600.25
43 | 1984 | 7 | 15.25 | -0.056 | -0.3003 | 0.244 | 0.060 | 552.25
44 | 1984 | 8 | 15.37 | 1.140 | -0.2871 | 1.427 | 2.036 | 506.25
45 | 1984 | 9 | 9.87 | -0.897 | -0.2739 | -0.623 | 0.388 | 462.25
46 | 1984 | 10 | 7.8 | 1.956 | -0.2607 | 2.217 | 4.914 | 420.25
47 | 1984 | 11 | 2.41 | 1.629 | -0.2475 | 1.877 | 3.523 | 380.25
48 | 1984 | 12 | -0.15 | 2.916 | -0.2343 | 3.151 | 9.926 | 342.25
49 | 1985 | 1 | -10.87 | -6.204 | -0.2211 | -5.983 | 35.791 | 306.25
50 | 1985 | 2 | -13.57 | -9.980 | -0.2079 | -9.772 | 95.494 | 272.25
51 | 1985 | 3 | -1.65 | -1.524 | -0.1947 | -1.329 | 1.766 | 240.25
52 | 1985 | 4 | 1.43 | -3.367 | -0.1815 | -3.185 | 10.145 | 210.25
53 | 1985 | 5 | 9.82 | -0.040 | -0.1683 | 0.128 | 0.017 | 182.25
54 | 1985 | 6 | 14.03 | 0.323 | -0.1551 | 0.478 | 0.229 | 156.25
55 | 1985 | 7 | 15.95 | 0.644 | -0.1419 | 0.786 | 0.617 | 132.25
56 | 1985 | 8 | 14.96 | 0.730 | -0.1287 | 0.858 | 0.737 | 110.25
57 | 1985 | 9 | 9.19 | -1.577 | -0.1155 | -1.461 | 2.135 | 90.25
58 | 1985 | 10 | 6.75 | 0.906 | -0.1023 | 1.008 | 1.017 | 72.25
59 | 1985 | 11 | -1.21 | -1.991 | -0.0891 | -1.902 | 3.616 | 56.25
60 | 1985 | 12 | -7.55 | -4.484 | -0.0759 | -4.408 | 19.428 | 42.25
61 | 1986 | 1 | -6.98 | -2.314 | -0.0627 | -2.251 | 5.067 | 30.25
62 | 1986 | 2 | -10.25 | -6.660 | -0.0495 | -6.611 | 43.699 | 20.25
63 | 1986 | 3 | 0.33 | 0.456 | -0.0363 | 0.493 | 0.243 | 12.25
64 | 1986 | 4 | 1.91 | -2.887 | -0.0231 | -2.864 | 8.200 | 6.25
65 | 1986 | 5 | 12.36 | 2.500 | -0.0099 | 2.510 | 6.300 | 2.25
66 | 1986 | 6 | 16.1 | 2.393 | 0.0033 | 2.390 | 5.713 | 0.25
67 | 1986 | 7 | 16.47 | 1.164 | 0.0165 | 1.147 | 1.316 | 0.25
68 | 1986 | 8 | 12.77 | -1.460 | 0.0297 | -1.490 | 2.220 | 2.25
69 | 1986 | 9 | 7.55 | -3.217 | 0.0429 | -3.260 | 10.625 | 6.25
70 | 1986 | 10 | 5.92 | 0.076 | 0.0561 | 0.020 | 0.000 | 12.25
71 | 1986 | 11 | 3.93 | 3.149 | 0.0693 | 3.080 | 9.487 | 20.25
72 | 1986 | 12 | -2.86 | 0.206 | 0.0825 | 0.124 | 0.015 | 30.25
73 | 1987 | 1 | -13.88 | -9.214 | 0.0957 | -9.309 | 86.665 | 42.25
74 | 1987 | 2 | -5.73 | -2.140 | 0.1089 | -2.249 | 5.058 | 56.25
75 | 1987 | 3 | -5.32 | -5.194 | 0.1221 | -5.316 | 28.258 | 72.25
76 | 1987 | 4 | 4.32 | -0.477 | 0.1353 | -0.612 | 0.374 | 90.25
77 | 1987 | 5 | 8.2 | -1.660 | 0.1485 | -1.808 | 3.270 | 110.25
78 | 1987 | 6 | 11.85 | -1.857 | 0.1617 | -2.018 | 4.074 | 132.25
79 | 1987 | 7 | 15.09 | -0.216 | 0.1749 | -0.391 | 0.153 | 156.25
80 | 1987 | 8 | 12.19 | -2.040 | 0.1881 | -2.228 | 4.965 | 182.25
81 | 1987 | 9 | 9.35 | -1.417 | 0.2013 | -1.618 | 2.618 | 210.25
82 | 1987 | 10 | 7.29 | 1.446 | 0.2145 | 1.232 | 1.517 | 240.25
83 | 1987 | 11 | 1.1 | 0.319 | 0.2277 | 0.092 | 0.008 | 272.25
84 | 1987 | 12 | -2.89 | 0.176 | 0.2409 | -0.065 | 0.004 | 306.25
85 | 1988 | 1 | -0.31 | 4.356 | 0.2541 | 4.102 | 16.828 | 342.25
86 | 1988 | 2 | -1.54 | 2.050 | 0.2673 | 1.783 | 3.178 | 380.25
87 | 1988 | 3 | -2.59 | -2.464 | 0.2805 | -2.744 | 7.531 | 420.25
88 | 1988 | 4 | 2.99 | -1.807 | 0.2937 | -2.100 | 4.411 | 462.25
89 | 1988 | 5 | 12 | 2.140 | 0.3069 | 1.833 | 3.361 | 506.25
90 | 1988 | 6 | 15.78 | 2.073 | 0.3201 | 1.753 | 3.074 | 552.25
91 | 1988 | 7 | 17.41 | 2.104 | 0.3333 | 1.770 | 3.134 | 600.25
92 | 1988 | 8 | 14.59 | 0.360 | 0.3465 | 0.013 | 0.000 | 650.25
93 | 1988 | 9 | 11.65 | 0.883 | 0.3597 | 0.524 | 0.274 | 702.25
94 | 1988 | 10 | 4.31 | -1.534 | 0.3729 | -1.907 | 3.636 | 756.25
95 | 1988 | 11 | -2.83 | -3.611 | 0.3861 | -3.997 | 15.974 | 812.25
96 | 1988 | 12 | -4.71 | -1.644 | 0.3993 | -2.043 | 4.174 | 870.25
97 | 1989 | 1 | 1.98 | 6.646 | 0.4125 | 6.234 | 38.860 | 930.25
98 | 1989 | 2 | 1.79 | 5.380 | 0.4257 | 4.954 | 24.545 | 992.25
99 | 1989 | 3 | 2.57 | 2.696 | 0.4389 | 2.257 | 5.096 | 1056.25
100 | 1989 | 4 | 4.89 | 0.093 | 0.4521 | -0.359 | 0.129 | 1122.25
101 | 1989 | 5 | 11.86 | 2.000 | 0.4653 | 1.535 | 2.356 | 1190.25
102 | 1989 | 6 | 14.7 | 0.993 | 0.4785 | 0.515 | 0.265 | 1260.25
103 | 1989 | 7 | 16.74 | 1.434 | 0.4917 | 0.942 | 0.887 | 1332.25
104 | 1989 | 8 | 14.61 | 0.380 | 0.5049 | -0.125 | 0.016 | 1406.25
105 | 1989 | 9 | 11.67 | 0.903 | 0.5181 | 0.385 | 0.148 | 1482.25
106 | 1989 | 10 | 6.24 | 0.396 | 0.5313 | -0.135 | 0.018 | 1560.25
107 | 1989 | 11 | 0.83 | 0.049 | 0.5445 | -0.495 | 0.245 | 1640.25
108 | 1989 | 12 | -4.38 | -1.314 | 0.5577 | -1.871 | 3.502 | 1722.25
109 | 1990 | 1 | -0.61 | 4.056 | 0.5709 | 3.485 | 12.148 | 1806.25
110 | 1990 | 2 | 2.96 | 6.550 | 0.5841 | 5.966 | 35.592 | 1892.25
111 | 1990 | 3 | 3.6 | 3.726 | 0.5973 | 3.129 | 9.791 | 1980.25
112 | 1990 | 4 | 6.33 | 1.533 | 0.6105 | 0.923 | 0.852 | 2070.25
113 | 1990 | 5 | 11.79 | 1.930 | 0.6237 | 1.306 | 1.707 | 2162.25
114 | 1990 | 6 | 14.63 | 0.923 | 0.6369 | 0.286 | 0.082 | 2256.25
115 | 1990 | 7 | 15.35 | 0.044 | 0.6501 | -0.606 | 0.368 | 2352.25
116 | 1990 | 8 | 15.9 | 1.670 | 0.6633 | 1.006 | 1.013 | 2450.25
117 | 1990 | 9 | 9.38 | -1.387 | 0.6765 | -2.063 | 4.257 | 2550.25
118 | 1990 | 10 | 5.89 | 0.046 | 0.6897 | -0.644 | 0.414 | 2652.25
119 | 1990 | 11 | -0.18 | -0.961 | 0.7029 | -1.664 | 2.767 | 2756.25
120 | 1990 | 12 | -0.98 | 2.086 | 0.7161 | 1.370 | 1.878 | 2862.25
121 | 1991 | 1 | -2.12 | 2.546 | 0.7293 | 1.817 | 3.302 | 2970.25
122 | 1991 | 2 | -4.25 | -0.660 | 0.7425 | -1.403 | 1.967 | 3080.25
123 | 1991 | 3 | 1.46 | 1.586 | 0.7557 | 0.831 | 0.690 | 3192.25
124 | 1991 | 4 | 5.03 | 0.233 | 0.7689 | -0.536 | 0.287 | 3306.25
125 | 1991 | 5 | 8.69 | -1.170 | 0.7821 | -1.952 | 3.810 | 3422.25
126 | 1991 | 6 | 11.66 | -2.047 | 0.7953 | -2.842 | 8.076 | 3540.25
127 | 1991 | 7 | 17.66 | 2.354 | 0.8085 | 1.545 | 2.388 | 3660.25
128 | 1991 | 8 | 16.44 | 2.210 | 0.8217 | 1.388 | 1.927 | 3782.25
129 | 1991 | 9 | 10.63 | -0.137 | 0.8349 | -0.972 | 0.944 | 3906.25
130 | 1991 | 10 | 6.18 | 0.336 | 0.8481 | -0.512 | 0.262 | 4032.25
131 | 1991 | 11 | 2.33 | 1.549 | 0.8613 | 0.688 | 0.473 | 4160.25
132 | 1991 | 12 | -0.68 | 2.386 | 0.8745 | 1.512 | 2.286 | 4290.25

Sums:  Σet² = 803.733;  Σ(Ti - T̄)² = 191,653
s² = Σet²/(n - 2) = 6.183;  s_b² = s²/Σ(Ti - T̄)² = 3.22591E-05;  s_b = 0.005679713

Step 3: Is the residual series resulting from Step 2 a white noise series? Is it a Gaussian time
series?
Tests for Gaussianity of the time series
Definition: {Xt} is a Gaussian time series if all of its joint distributions are multivariate normal,
that is, if for any collection of integers i1, …, in, the random vector (Xi1, …, Xin)T has a multivariate
normal distribution. For multivariate normal joint distributions, the first- and second-order
moments completely determine the distributions. Hence, for a Gaussian time series, the second-order
properties of the series give its complete characterisation (model). To test whether the residual series is a
Gaussian series, we carry out tests to establish whether the time series is normally distributed. In
the next sections, histogram plots, the Chi-square test and the Kolmogorov-Smirnov test are
used to determine whether the series fits a normal distribution.
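The moment statistics reported below (mean, variance, skewness, kurtosis) can be reproduced with a short sketch (Python; these are simple population-moment estimators, which may differ slightly from the spreadsheet's sample-corrected versions):

```python
def moments(x):
    """Return (mean, variance, skewness, excess kurtosis) from simple moment estimators."""
    n = len(x)
    m = sum(x) / n
    m2 = sum((v - m) ** 2 for v in x) / n
    m3 = sum((v - m) ** 3 for v in x) / n
    m4 = sum((v - m) ** 4 for v in x) / n
    skew = m3 / m2 ** 1.5
    kurt = m4 / m2 ** 2 - 3  # 0 for a normal distribution
    return m, m2, skew, kurt

# Toy usage: a symmetric sample has zero skewness.
print(moments([-2.0, -1.0, 0.0, 1.0, 2.0]))
```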

Descriptive statistics for the intervals:

Lower bound | Upper bound | Frequency | Relative frequency | Density (Data) | Density (Distribution)
-10.000 | -8.235 | 2 | 0.015 | 0.009 | 0.001
-8.235 | -6.471 | 1 | 0.008 | 0.004 | 0.005
-6.471 | -4.706 | 4 | 0.030 | 0.017 | 0.026
-4.706 | -2.941 | 5 | 0.038 | 0.021 | 0.091
-2.941 | -1.177 | 21 | 0.159 | 0.090 | 0.198
-1.177 | 0.588 | 47 | 0.356 | 0.202 | 0.271
0.588 | 2.352 | 36 | 0.273 | 0.155 | 0.232
2.352 | 4.117 | 11 | 0.083 | 0.047 | 0.125
4.117 | 5.882 | 3 | 0.023 | 0.013 | 0.042
5.882 | 7.646 | 2 | 0.015 | 0.009 | 0.009

Testing whether the series is Gaussian: The Histogram plot test

[Figure: Histogram of the residual series, frequency per bin, bin centres from -9.98 to 5.13.]

[Figure: Observed and theoretical class frequencies for the fitted normal distribution.]

[Figure: Density histogram of the residual series (T - Tbar, stationary series) against the fitted Normal(0.007, 2.528) density.]

[Figure: Cumulative relative frequency of the residual series against the Normal(0.007, 2.528) cumulative distribution.]

The plots above show that the residual time series appears to approximate a normal (Gaussian)
distribution, though with more positive excess kurtosis (leptokurtic) than the normal distribution.
However, these plots alone do not provide enough evidence for us to accept the hypothesis that the
data are normally distributed. We shall therefore carry out the Chi-square and Kolmogorov-Smirnov
tests, which provide more rigorous procedures for testing the goodness of fit of the given data to an
assumed distribution.
Summary Statistics
Estimated parameters:

Parameter | Value
mu        | 0.007
sigma     | 2.528

Statistics estimated on the input data and computed using the estimated parameters of the Normal distribution:

Statistic          | Data   | Parameters
Mean               | 0.007  | 0.007
Variance           | 6.388  | 6.388
Skewness (Pearson) | -0.851 | 0.000
Kurtosis (Pearson) | 2.664  | 0.000

Kolmogorov-Smirnov test:

D       | 0.091
p-value | 0.213
Alpha   | 0.05
Test interpretation:
H0: The sample follows a Normal distribution
Ha: The sample does not follow a Normal distribution
As the computed p-value is greater than the significance level alpha=0.05, one cannot reject the
null hypothesis H0.
The risk to reject the null hypothesis H0 while it is true is 21.34%.
Chi-square test:

Chi-square (Observed value) | 68.026
Chi-square (Critical value) | 14.067
DF                          | 7
p-value                     | < 0.0001
Alpha                       | 0.05

Test interpretation:
H0: The sample follows a Normal distribution
Ha: The sample does not follow a Normal distribution
As the computed p-value is lower than the significance level alpha=0.05, one should reject the null
hypothesis H0, and accept the alternative hypothesis Ha.
The risk to reject the null hypothesis H0 while it is true is lower than 0.01%.

Comparison between the observed and theoretical frequencies:

Class | Lower bound | Upper bound | Frequency (Data) | Frequency (Distribution) | Chi-square
1 | -10.000 | -8.235 | 2 | 0.068 | 54.606
2 | -8.235 | -6.471 | 1 | 0.612 | 0.246
3 | -6.471 | -4.706 | 4 | 3.423 | 0.097
4 | -4.706 | -2.941 | 5 | 11.960 | 4.050
5 | -2.941 | -1.177 | 21 | 26.147 | 1.013
6 | -1.177 | 0.588 | 47 | 35.788 | 3.513
7 | 0.588 | 2.352 | 36 | 30.676 | 0.924
8 | 2.352 | 4.117 | 11 | 16.465 | 1.814
9 | 4.117 | 5.882 | 3 | 5.530 | 1.158
10 | 5.882 | 7.646 | 2 | 1.161 | 0.606
Total chi-square: 68.026
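The chi-square column above is the familiar sum of (O - E)²/E over the classes; a minimal sketch (Python):

```python
def chi_square(observed, expected):
    """Pearson chi-square statistic over matching class frequencies."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Toy usage with two classes:
print(chi_square([10, 20], [15, 15]))
```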

Testing whether the residuals constitute a White Noise Series

A sequence {Zt} of uncorrelated random variables, each with zero mean and variance σ², is called
white noise. It is denoted by:

{Zt} ~ WN(0, σ²)

The name "white" comes from the analogy with white light and indicates that all possible periodic
oscillations are present with equal strength. That is, a white noise series is a serially uncorrelated,
zero-mean series with constant and finite variance.

The criteria for white noise are:

E(Xt) = 0 for all t
Var(Xt) = σ² for all t, with σ² < ∞
Cov(Xt, Xs) = 0 for all t ≠ s

Analysis of the residual series shows that, for the entire residual series, the sample mean (0.007) is
approximately zero and the sample variance (6.388) is finite. This could mislead us
into considering the residuals to approximate a white noise series. However, consideration of the residual
series at different time lags requires computation of the autocorrelation function and testing whether
the time series meets the requirement for white noise. In the next section, we derive the
autocorrelation function and use it to test for white noise in our series at a time lag of 1.

Estimating the Autocorrelation Function

The principle used to derive the autocorrelation function is known as the analog principle. The
analog principle, as used in statistical theory, is to estimate population moments by the analogous
sample moments, i.e., to replace expected values with the analogous sample averages.
For instance, since the xt are assumed to be drawn from a distribution with the same mean, μ, the
analog principle directs us to use the sample mean to estimate the population mean:

x̄ = (1/T) Σ_{t=1}^{T} xt

Similarly, to estimate

σ² = Var(xt) = E[(xt - μ)²]

we apply the analog principle and replace the expected value with the sample average:

σ̂² = (1/T) Σ_{t=1}^{T} (xt - x̄)²

The autocorrelation function at displacement τ is

ρ(τ) = E[(xt - μ)(xt-τ - μ)] / E[(xt - μ)²]

The analog principle directs us to estimate ρ(τ) by using its sample analog:

ρ̂(τ) = Σ_{t=τ+1}^{T} (xt - x̄)(xt-τ - x̄) / Σ_{t=1}^{T} (xt - x̄)²

ρ̂(τ), τ = 0, 1, 2, …, is called the sample autocorrelation function, or the correlogram, of the
series.
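The sample autocorrelation function just defined can be sketched as (Python):

```python
def sample_acf(x, lag):
    """rho_hat(lag) = sum_t (x[t]-xbar)(x[t-lag]-xbar) / sum_t (x[t]-xbar)^2."""
    n = len(x)
    xbar = sum(x) / n
    denom = sum((v - xbar) ** 2 for v in x)
    num = sum((x[t] - xbar) * (x[t - lag] - xbar) for t in range(lag, n))
    return num / denom

# rho_hat(0) is 1 by construction.
print(sample_acf([1.0, 3.0, 2.0, 4.0], 0))
```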
Computing the Sample Autocorrelation Function for a White Noise Process
Suppose that Xt is a white noise process (i.e., Xt is a zero-mean, constant-variance, serially
uncorrelated process).
We know that the population autocorrelation function, ρ(τ), will be zero for all nonzero τ.
What will the sample autocorrelation function look like?
For large samples,

ρ̂(τ) ~ N(0, 1/T)

or

√T·ρ̂(τ) ~ N(0, 1)

This result means that if Xt is a white noise process, then for 95% of the realizations of this time
series ρ̂(τ) should lie in the interval [-2/√T, 2/√T] for any given τ.

That is, for a white noise process, 95% of the time ρ̂(τ) will lie within the two-standard-error
band around 0, [-2/√T, 2/√T].
The 2 comes in because it is approximately the 97.5th percentile of the N(0, 1) distribution; the
square root of T comes in because 1/√T is the standard deviation of ρ̂(τ).
This result allows us to check whether a particular displacement has a statistically significant
sample autocorrelation.
For example, if |ρ̂(1)| > 2/√T, then we may conclude that the evidence of first-order
autocorrelation appears to be too strong for the series to be a white noise series.
From the monthly temperature residuals, it is established that:

ρ̂(1) = 0.3798 > 2/√132 = 0.174

Thus we may conclude that there is evidence that the first-order autocorrelation is too strong
for the residual time series to be considered a white noise series.
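This check can be sketched as follows (Python), using T = 132 and the lag-1 autocorrelation ρ̂(1) = 0.3798 quoted above:

```python
import math

T = 132
rho1 = 0.3798            # lag-1 sample autocorrelation from the text
band = 2 / math.sqrt(T)  # approximate 95% white-noise band, about 0.174

# The lag-1 autocorrelation falls well outside the band, so white noise is rejected.
print(round(band, 3), abs(rho1) > band)
```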
Bartlett's Kolmogorov-Smirnov and Fisher's Kappa tests:
An analysis of white noise was also undertaken using advanced Excel features to establish whether
the series is white noise or not. The results are summarized below.
Hypothesis:
Ho: The time series is white noise
Ha: The time series is not white noise
Decision criterion:
If the computed test statistic based on the sample data is greater than the critical value for the
respective test (equivalently, if its p-value is below alpha), then the null hypothesis is rejected.
White noise tests on the residual series:

Statistic                     | Value | p-value
Fisher's kappa                | 5.072 | 0.320
Bartlett's Kolmogorov-Smirnov | 0.275 | < 0.0001

Bartlett's Kolmogorov-Smirnov test is significant (p < 0.0001 < 0.05) and therefore rejects the
hypothesis of white noise; Fisher's kappa (p = 0.320) does not. Taken together with the significant
first-order autocorrelation found above, we reject the hypothesis that the series is a white noise series.
