
ON SMOOTH TRANSITION AUTOREGRESSIVE (STAR)

MODELS AND THEIR APPLICATIONS: AN OVERVIEW


MIR ASIF IQUEBAL
Ph.D. (Agricultural Statistics), Roll No. 9068
I.A.S.R.I., Library Avenue, New Delhi-110012

Chairperson: Dr. Prajneshu

Abstract: One of the most important families of nonlinear time-series models capable of
exhibiting limit cycle behaviour is the family of Self-exciting threshold autoregressive (SETAR)
models. However, one limitation of this family is that transitions between the various regimes
take place in a discontinuous and sudden manner. For more realistic modelling, these
transitions should be smooth. Here, a flexible family of nonlinear time-series models, viz.
the Smooth transition autoregressive (STAR) family, is thoroughly studied. Various models of
this family, viz. the Exponential autoregressive (EAR) model, the Exponential smooth transition
autoregressive (ESTAR) model and the Logistic smooth transition autoregressive (LSTAR) model,
along with their estimation procedures, are described in detail. Finally, two examples of
fitting these models to real data are discussed.

Key words: Smooth Transition Autoregressive Models, Nonlinearity, Univariate Time-Series Modeling, Regimes, Linearity Testing.

1. Introduction
Over the last decade or so, interest in applications of nonlinear time-series models has
been steadily increasing. Models which allow for state-dependent or regime-switching
behaviour have been most popular. For example, changes in government policy may
instigate a change in regime. With a view to modelling this type of time-series data, the
family of Smooth transition autoregressive (STAR) models was proposed by
Terasvirta (1994). A survey of recent developments related to one of these regime-
switching families, viz. the STAR family of nonlinear time-series models, is presented here.
The data-generating process to be modelled is viewed as a linear process that switches between
a number of regimes according to some rule. It is assumed that there is a continuum
of switches, that is, there is a smooth transition from one extreme regime to the other. The
model-building procedure consists of specification, estimation and evaluation stages and is
thus similar to the modelling cycle for Box-Jenkins linear models. Some models of the STAR
family are briefly considered below.

2. Smooth Transition Autoregressive (STAR) Model


The smooth transition autoregressive (STAR) model is defined as follows:

y_t = φ′w_t + (θ′w_t) G(γ, c; y_{t−d}) + ε_t,   …(2.1)

where {ε_t} is a sequence of independent N(0, σ²) errors, φ = (φ_0, φ_1, ..., φ_p)′ and
θ = (θ_0, θ_1, ..., θ_p)′ are (p+1)×1 parameter vectors, w_t = (1, y_{t−1}, ..., y_{t−p})′ is the
vector consisting of an intercept and the first p lags of y_t, and G(γ, c; y_{t−d}) is known as
the transition function. Depending upon the form of the transition function, different forms
of STAR models are defined.

2.1 Exponential Autoregressive (EAR) Model


If the transition function is of the following form, the model is known as the exponential
autoregressive (EAR) model:

G(γ, c; y_{t−d}) = exp{−γ y_{t−1}²},  γ > 0.   …(2.2)

The function G is symmetric around zero, where it attains the value one, and
G(γ, c; y_{t−d}) → 0 as |y_{t−1}| → ∞. In the case of the EAR model, the parameter vectors do not
contain intercept terms, i.e. φ_0 = θ_0 = 0. Equation (2.1) indicates that the model can be
interpreted as a linear autoregressive model with stochastically time-varying coefficients
φ + θ G(γ, c; y_{t−d}). When γ → 0 or γ → ∞, the model becomes linear. In the latter case,
G(γ, c; y_{t−d}) = 0, except when y_{t−1} = 0. A heartening feature of this model is that it is
capable of generating limit cycles.
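
To make the recursion in (2.1) with the exponential transition (2.2) concrete, the following minimal Python sketch simulates an EAR(2) process; all parameter values and the series length are illustrative assumptions, not estimates from any fitted model.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative EAR(2) parameters (no intercepts, since phi_0 = theta_0 = 0 for EAR)
    phi   = np.array([1.2, -0.5])    # linear-part coefficients phi_1, phi_2
    theta = np.array([-0.9, 0.4])    # nonlinear-part coefficients theta_1, theta_2
    gamma = 1.0                      # transition parameter, gamma > 0
    sigma = 0.1                      # standard deviation of the errors

    T = 300
    y = np.zeros(T)
    for t in range(2, T):
        G = np.exp(-gamma * y[t - 1] ** 2)      # transition function (2.2)
        a = phi + theta * G                     # time-varying AR coefficients
        y[t] = a[0] * y[t - 1] + a[1] * y[t - 2] + sigma * rng.standard_normal()

    # Near y_{t-1} = 0 the process behaves like an AR(2) with coefficients phi + theta,
    # and for large |y_{t-1}| like an AR(2) with coefficients phi.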

2.2 Exponential Smooth Transition Autoregressive (ESTAR) Model


The EAR model may be generalized by allowing an intercept, φ_0 ≠ 0 or θ_0 ≠ 0 or both.
Another generalization is to drop the requirement of symmetry of the transition function
(2.2) around zero by adding a location parameter c and allowing the delay d ≥ 1. The
purpose of the generalization is to make the EAR model location invariant. Thus, the
function G becomes

G(γ, c; y_{t−d}) = 1 − exp{−γ (y_{t−d} − c)²},  γ > 0.   …(2.3)

Terasvirta (1994) called this model the Exponential smooth transition autoregressive
(ESTAR) model and discussed the procedure for estimating its parameters. It has the property
that the minimum value of the transition function equals zero (Figure 2.1). It has been
successfully used to model macroeconomic series, such as strongly fluctuating inflation
series (Baharumshah and Liew, 2006).
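
The exponential transition function (2.3) is easy to inspect numerically. A minimal sketch follows; the grid of y-values and the location c = 0 are assumptions for illustration, while the γ values are those used in Figure 2.1.

    import numpy as np

    def estar_transition(y, gamma, c):
        # Exponential transition function (2.3): equals 0 at y = c, approaches 1 away from c
        return 1.0 - np.exp(-gamma * (y - c) ** 2)

    y_grid = np.linspace(-2.0, 2.0, 9)
    for gamma in (0.01, 3, 20, 50):
        G = estar_transition(y_grid, gamma, c=0.0)
        print(f"gamma = {gamma:5.2f}:", np.round(G, 3))

    # A small gamma keeps G close to zero everywhere (an almost linear model);
    # a large gamma makes G rise from 0 to 1 very quickly on either side of c.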

Figure 2.1: Graph of exponential transition function (2.3) for γ = 0.01, 3, 20, and 50. The graph corresponding to the lowest value of γ lies closest to the horizontal axis G = 0.

2.3 Logistic Smooth Transition Autoregressive (LSTAR) Model


This model is defined by Equation (2.1), where the transition function is the logistic
function

G_k^L(γ, c; y_{t−d}) = [1 + exp{−γ ∏_{i=1}^{k} (y_{t−d} − c_i)}]^{−1},  γ > 0.   …(2.4)

In Equation (2.4), γ is the slope parameter and c = (c_1, c_2, ..., c_k)′ is the vector of
location parameters, with c_1 ≤ c_2 ≤ ... ≤ c_k. These restrictions, as well as restricting γ to be
positive, are needed to identify the model. The transition function is a bounded function of
y_{t−d}, continuous everywhere in the parameter space for any value of y_{t−d}. The most common
choices for k in Equation (2.4) are k = 1 and k = 2. Setting k = 1 yields the standard logistic
function. The former case, illustrated in Figure 2.2, shows that the transition function
increases monotonically from zero to unity with y_{t−d}. The LSTAR model may thus be
applied, for example, to modelling asymmetric business cycles, because the dynamics of
the model are different in expansions than in recessions. In this case, the parameters
φ + θ G_k^L(γ, c; y_{t−d}) change monotonically as a function of y_{t−d} from φ to φ + θ. For k = 2,
they change symmetrically around the mid-point (c_1 + c_2)/2, where the logistic function
attains its minimum value (Figure 2.3). The minimum lies between zero and 1/2. It reaches
zero when γ → ∞ and equals 1/2 when c_1 = c_2 and γ < ∞. When γ = 0, the transition
function G_k^L(γ, c; y_{t−d}) ≡ 1/2, in which case the LSTAR model becomes a linear model.
This is clearly seen from Figure 2.3. The dynamics of the LSTAR model with k = 2 are
therefore similar for both high and low values of the transition variable and different in the
middle. When k = 1 and γ → ∞, the LSTAR model approaches a SETAR model with
σ_1 = σ_2. When k = 2, c_1 ≠ c_2 and γ → ∞, the LSTAR model approaches a three-regime
SETAR model in which the outer regimes are identical and the mid-regime differs from
the other two (Terasvirta et al., 2005).
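
The logistic transition function (2.4) can be examined in the same way. The sketch below uses illustrative slope and location values to contrast the monotonic k = 1 case with the symmetric k = 2 case.

    import numpy as np

    def lstar_transition(y, gamma, c):
        # Logistic transition function (2.4); c is the sequence of location parameters
        prod = np.ones_like(y, dtype=float)
        for ci in c:
            prod *= (y - ci)
        return 1.0 / (1.0 + np.exp(-gamma * prod))

    y_grid = np.linspace(-3.0, 3.0, 13)
    G1 = lstar_transition(y_grid, gamma=3.0, c=[0.0])         # k = 1: rises from 0 to 1
    G2 = lstar_transition(y_grid, gamma=3.0, c=[-1.0, 1.0])   # k = 2: symmetric about (c1+c2)/2
    print(np.round(G1, 3))
    print(np.round(G2, 3))

    # As gamma grows with k = 1, G1 approaches a step function at c, so the LSTAR
    # model approaches a two-regime SETAR model, as noted above.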

Figure 2.2: Graphs of the logistic function (2.4) with k = 1 for γ = 0.01, 3, 20, and 50. The graph corresponding to the lowest value of γ lies closest to the line G_k^L = 1/2.


Figure 2.3: Graphs of the logistic function (2.4) with k = 2 for γ = 0.01, 3, 20, and 50. The graph corresponding to the lowest value of γ lies closest to the line G_k^L = 1/2.

3. Testing of Linearity and Model Selection


The strategy for building a STAR model involves three steps. First, carry out the complete
specification of a linear AR(p) model. The maximum value of the lag p has to be
determined from the data. Second, test linearity for different values of the delay parameter
d. If linearity is rejected for more than one value of d, choose the one for which the p-value
of the test is the lowest. Testing the null hypothesis H_0: γ = 0 in (2.1) with either (2.2) or
(2.3), assuming that y_t is stationary and ergodic under H_0, is a non-standard testing
problem, since (2.1) is only identified under the alternative H_1: γ ≠ 0. To solve the problem,
Terasvirta (1994) followed, firstly, Davies' procedure, in which an auxiliary regression with
the unidentified parameters kept fixed is used to derive a Lagrange multiplier-type test that has
an asymptotic χ²-distribution, and, secondly, Luukkonen et al.'s approach (see Arango and
Gonzalez, 2001), in which the transition function in (2.1) is replaced by its third-order Taylor
approximation. The problem is then solved by estimating the artificial regression

y_t = π_00 + Σ_{j=1}^{p} (π_0j y_{t−j} + π_1j y_{t−j} y_{t−d} + π_2j y_{t−j} y²_{t−d} + π_3j y_{t−j} y³_{t−d}) + ε_t   …(3.1)

and then testing the null hypothesis H_0: π_1j = π_2j = π_3j = 0, (j = 1, ..., p), against the
alternative that H_0 is not valid. In practice, the Lagrange multiplier-type test of linearity is
replaced by an F-test in order to improve the size and power of the test. Third, consider the
value of d as given and use a sequence of tests nested in (3.1) to choose between ESTAR
and LSTAR models. Such a sequence is:
H_03: π_3j = 0, j = 1, ..., p.   …(3.2)
H_02: π_2j = 0 | π_3j = 0, j = 1, ..., p.   …(3.3)
H_01: π_1j = 0 | π_2j = π_3j = 0, j = 1, ..., p.   …(3.4)


It is based on the relationship between the parameters in (3.1) and (2.1) with either (2.3) or
(2.4). For the ESTAR model, π_3j = 0, j = 1, ..., p, but π_2j ≠ 0 for at least one j if θ_j ≠ 0. For
the LSTAR model, π_1j ≠ 0 for at least one j if θ_j ≠ 0. If H_03 is rejected, an LSTAR model is
selected. If H_03 is accepted and H_02 is rejected, then an ESTAR model is selected. If H_03
and H_02 are accepted but H_01 is rejected, an LSTAR model is selected. No clear-cut
conclusion is obtained when H_02 and H_01 are both rejected. In this case we test:

H′_02: π_2j = 0 | π_1j = π_3j = 0, j = 1, ..., p.

However, if H_02 is rejected, then H′_02 should be rejected even more strongly. In any case,
the decision is based on whether H_03, H_02 or H_01 is rejected most strongly. Terasvirta
(1994) found that the selection procedure works very well when the true model is LSTAR
or ESTAR; in the latter case the observations do not have to be symmetrically distributed
around c. The procedure finds it difficult to distinguish between the two types of models
when only a small number of observations are located in one of the tails of the transition
function.
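
The linearity test based on the auxiliary regression (3.1) and the sequence of F-tests (3.2)-(3.4) can be sketched as follows. This is a minimal illustration assuming a stationary series y and given values of p and d; the mapping of the tests of H_03, H_02 and H_01 to the labels p(F4), p(F3) and p(F2) used in Tables 5.1 and 5.3 is inferred from the selection rule stated there. Ordinary least squares and standard F-statistics are used (scipy only supplies the F-distribution).

    import numpy as np
    from scipy import stats

    def lagmat(y, p, d):
        # Build y_t and the regressors of the auxiliary regression (3.1)
        y = np.asarray(y, dtype=float)
        start = max(p, d)
        yt = y[start:]
        lags = np.column_stack([y[start - j:len(y) - j] for j in range(1, p + 1)])
        ytd = y[start - d:len(y) - d]                      # transition variable y_{t-d}
        X0 = np.column_stack([np.ones_like(yt), lags])     # linear AR part
        X1 = lags * ytd[:, None]                           # pi_1j terms
        X2 = lags * ytd[:, None] ** 2                      # pi_2j terms
        X3 = lags * ytd[:, None] ** 3                      # pi_3j terms
        return yt, X0, X1, X2, X3

    def ssr(yt, X):
        beta, *_ = np.linalg.lstsq(X, yt, rcond=None)
        resid = yt - X @ beta
        return resid @ resid

    def f_test(yt, X_restricted, X_full, q):
        # F-test of the restriction that drops q columns from the full regression
        ssr0, ssr1 = ssr(yt, X_restricted), ssr(yt, X_full)
        n, k = len(yt), X_full.shape[1]
        F = ((ssr0 - ssr1) / q) / (ssr1 / (n - k))
        return F, stats.f.sf(F, q, n - k)

    # Example with simulated data; replace y with the observed series.
    rng = np.random.default_rng(1)
    y = rng.standard_normal(500)
    p, d = 4, 1
    yt, X0, X1, X2, X3 = lagmat(y, p, d)

    # Overall linearity test: H0: pi_1j = pi_2j = pi_3j = 0 for all j
    _, p_L = f_test(yt, X0, np.column_stack([X0, X1, X2, X3]), 3 * p)
    # Sequence (3.2)-(3.4), here labelled as in Tables 5.1 and 5.3
    _, p_F4 = f_test(yt, np.column_stack([X0, X1, X2]), np.column_stack([X0, X1, X2, X3]), p)  # H_03
    _, p_F3 = f_test(yt, np.column_stack([X0, X1]),     np.column_stack([X0, X1, X2]),     p)  # H_02 | pi_3j = 0
    _, p_F2 = f_test(yt, X0,                            np.column_stack([X0, X1]),         p)  # H_01 | pi_2j = pi_3j = 0
    print(p_L, p_F4, p_F3, p_F2)   # choose ESTAR if p_F3 is the smallest of the last three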

4. Estimation of STAR Models


After specifying the model, the parameters can be estimated. If p is known, the coefficients
can be estimated by nonlinear least squares (NLS) or maximum likelihood (ML) together with
some optimization procedure (Fan and Yao, 2003). In the case where ε_t ~ N(0, σ²), the two
methods are equivalent. Leybourne et al. (1998) pointed out that estimation can be made
more efficient by making use of the fact that, when γ and c are fixed, the model is linear
in the parameters. In this case, the parameters φ and θ can be estimated by the method of least
squares. Conditioning on these estimates, estimates of γ and c can then be obtained. Hence
the parameter vector ψ of model (2.1), with the logistic transition function (2.4), is estimated as

ψ̂ = arg min_ψ Q_T(ψ) = arg min_ψ Σ_{t=1}^{T} [y_t − φ′w_t − (θ′w_t) G_k^L(γ, c; y_{t−d})]².

Under some regularity conditions the estimates are consistent and asymptotically normal,
that is, √T (ψ̂ − ψ*) → N(0, C), where ψ* is the true parameter vector and C is the
covariance matrix of the estimates. The parameters of the STAR model will hence be
estimated by nonlinear least squares, and to do that, a suitable iterative optimization
algorithm is needed. The optimization problem in nonlinear least squares is conditional on
the starting values y_0, y_{−1}, ..., y_{−p+1} and consists of finding the minimum of the criterion

q(θ) = (y − f(y; θ))′ (y − f(y; θ)),   …(4.1)

with respect to the parameter vector θ. To find a starting-point for the iteration procedure,
we approximate Equation (4.1) by a second-order Taylor expansion about θ_n, which
yields


q(θ) ≅ q(θ_n) + h′_n (θ − θ_n) + ½ (θ − θ_n)′ H_n (θ − θ_n),   …(4.2)

where h_n = h(θ_n) = ∂q(θ_n)/∂θ is the gradient evaluated at θ_n and
H_n = H(θ_n) = ∂²q(θ_n)/∂θ∂θ′ is the Hessian evaluated at θ_n. The first-order condition for a
minimum, i.e. ∂q(θ)/∂θ = 0, is obtained by differentiating Equation (4.2) with respect to θ,
and the basis for the iteration is:

h_n + H_n (θ − θ_n) = 0.   …(4.3)

If we have fixed θ_n and know how to compute H_n and h_n, Equation (4.3) yields the
search direction for the next value of θ, namely θ_{n+1} = θ_n − k_n H_n^{−1} h_n,
where k_n is the step length. This value forms the starting-point of the next iteration. This is
the Newton-Raphson method. In general, the optimization algorithm is very sensitive to the
choice of the starting values of the parameters. The implemented program estimates all
the models using the Levenberg-Marquardt algorithm with cubic-interpolation line
search. Concerning the selection of the starting values, the following algorithm is used
(a code sketch of it is given below):

Rewrite model (2.1) as y = Wλ + e, where

y′ = [y_1, y_2, ..., y_T],  e′ = [e_1, e_2, ..., e_T],  λ′ = [φ′, θ′],  w̃_t = G_k^L(γ, c; y_{t−d}) w_t,

and W is the T × 2(p+1) matrix whose t-th row is (w′_t, w̃′_t), t = 1, ..., T.

Once γ, d and c have been determined, the parameter vector λ can be estimated by
λ̂ = (W′W)⁻¹ W′y.
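
A minimal sketch of this starting-value algorithm for an LSTAR model with k = 1: for each (γ, c) pair on a grid and a given delay d, λ = (φ′, θ′)′ is obtained by ordinary least squares from λ̂ = (W′W)⁻¹W′y, and the pair with the smallest residual sum of squares is retained. The grid bounds, the simulated series and the suggestion to refine the result with scipy.optimize.least_squares are illustrative assumptions rather than part of the original algorithm.

    import numpy as np

    def design_matrices(y, p, d):
        # w_t = (1, y_{t-1}, ..., y_{t-p})' for each t, plus the transition variable y_{t-d}
        start = max(p, d)
        yt = y[start:]
        W0 = np.column_stack([np.ones_like(yt)] +
                             [y[start - j:len(y) - j] for j in range(1, p + 1)])
        s = y[start - d:len(y) - d]
        return yt, W0, s

    def grid_start_values(y, p, d, gammas, cs):
        # Concentrated least squares over a (gamma, c) grid for an LSTAR model with k = 1
        y = np.asarray(y, dtype=float)
        yt, W0, s = design_matrices(y, p, d)
        best = None
        for gamma in gammas:
            for c in cs:
                G = 1.0 / (1.0 + np.exp(-gamma * (s - c)))    # logistic transition, k = 1
                W = np.column_stack([W0, W0 * G[:, None]])    # rows of W are (w_t', w~_t')
                lam, *_ = np.linalg.lstsq(W, yt, rcond=None)  # lambda-hat = (W'W)^(-1) W'y
                resid = yt - W @ lam
                ssr = resid @ resid
                if best is None or ssr < best[0]:
                    best = (ssr, gamma, c, lam)
        return best   # (SSR, gamma, c, lambda-hat with lambda' = (phi', theta'))

    # Illustrative usage on a simulated series; the winning (gamma, c, lambda) could then be
    # refined by a nonlinear least-squares routine, e.g. scipy.optimize.least_squares.
    rng = np.random.default_rng(2)
    y = rng.standard_normal(400)
    gammas = np.linspace(0.5, 50.0, 25)
    cs = np.quantile(y, np.linspace(0.1, 0.9, 17))
    print(grid_start_values(y, p=2, d=1, gammas=gammas, cs=cs)[:3])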

5. Some Applications
(a) Example 5.1: Fitting of ESTAR Model
In order to explain "Swedish business cycles", Skalin and Terasvirta (1999) applied
ESTAR models to yearly time-series data on nine macroeconomic variables, such as
Employment, Industrial production, Consumption, Export, and Import, for Sweden during
the period 1870 to 1988. However, here we shall confine ourselves to discussing the
results for "Employment" (hours worked in manufacturing and mining). The most distinct
features of this time-series are a large drop in employment in the years following the First
World War and another one, followed by a rapid recovery, in the 1930s.


Testing linearity against ESTAR constitutes the first step of the model specification stage.
In order to do that, first a linear autoregressive model with no autocorrelation in the
residuals is selected for the time-series. This is done by applying an appropriate model
selection criterion such as the Akaike information criterion (AIC). Here, all linearity tests are
carried out conditionally on d and the results are used to select the delay. The value of d
associated with the test with the smallest p-value is selected (Table 5.1). If none of the
p-values is sufficiently small, linearity is not rejected.
Table 5.1: Results of linearity tests

Variable     p   min[p(FL)]   d   p(F4)    p(F3)       p(F2)      STAR
Employment   4   3.2×10⁻¹⁰    1   0.0050   6.26×10⁻⁶   2.8×10⁻⁵   E

Here p is the number of lags in the linear AR model; min[p(FL)] is the minimum p-value of
the linearity test over the delays d = 1, 2, ..., p; d is the corresponding delay; p(F4), p(F3),
p(F2) are the p-values of the tests in the model selection sequence for the selected d; and the
last column gives the selected model family (E = ESTAR, L = LSTAR). The ESTAR model is
selected if p(F3) is the smallest p-value of the trio p(F4), p(F3), p(F2); otherwise the choice
is LSTAR.

The rejection of linearity against ESTAR is strongest when d = 1, as p(FL) is minimized at
d = 1. Because p(F3) is the smallest p-value of the trio p(F4), p(F3), p(F2), the model
specification tests point to the ESTAR family. The estimated ESTAR model is:
y_t = −0.98 − 3.53 y_{t−1} − 9.65 y_{t−2} + 6.16 y_{t−3} + 3.52 y_{t−4}
     (0.73)  (2.25)        (5.55)        (5.26)        (2.16)
    + [1.01 + 3.53 y_{t−1} + 9.65 y_{t−2} − 6.16 y_{t−3} − 3.52 y_{t−4}]
      (0.73)  (2.25)        (5.55)        (5.26)        (2.16)
    × [1 − exp{−0.32 (y_{t−1} + 0.17)² / 0.0014}]   …(5.1)
              (0.17)         (0.07)
The AIC is computed as −6.49, implying that the fit is quite good. Further, the roots of the
characteristic polynomial for various regimes are given in Table 5.2.
Table 5.2: Roots of characteristic polynomials for values of the transition function G

Regime                Root             Modulus   Period
Mid-regime (G = 0)    −1.98 ± 2.78 i   3.41      2.9
Regime (G = 0.7)      −0.72 ± 1.79 i   1.93      3.2
Regime (G = 0.95)     −0.22 ± 0.88 i   0.90      3.5
The characteristic polynomial for G = 0 contains a pair of complex roots with a very large
modulus. This pair remains explosive even for considerably large values of G. Thus the
recovery of the growth rate from a deep trough has always been quick. Graphs of the fitted
ESTAR and AR models, along with the data, are given in Figure 5.1.
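
The roots reported in Table 5.2 come from the AR polynomial whose coefficients are φ + θG at the chosen value of G (the intercepts play no role here). A minimal Python sketch of that computation, using the lag coefficients of model (5.1):

    import numpy as np

    def characteristic_roots(phi, theta, G):
        # Roots of z^p - a_1 z^(p-1) - ... - a_p, where a = phi + G * theta (lag coefficients only)
        a = np.asarray(phi) + G * np.asarray(theta)
        roots = np.roots(np.concatenate(([1.0], -a)))
        modulus = np.abs(roots)
        ang = np.abs(np.angle(roots))
        with np.errstate(divide="ignore"):
            period = np.where(ang > 0, 2 * np.pi / ang, np.inf)   # period of a cyclical root
        return roots, modulus, period

    phi   = [-3.53, -9.65, 6.16, 3.52]    # lag coefficients of the linear part of (5.1)
    theta = [ 3.53,  9.65, -6.16, -3.52]  # lag coefficients of the nonlinear part of (5.1)
    for G in (0.0, 0.7, 0.95):
        r, m, per = characteristic_roots(phi, theta, G)
        print(G, np.round(r, 2), np.round(m, 2), np.round(per, 1))

    # For G = 0 the dominant pair is approximately -1.98 +/- 2.78i with modulus 3.41,
    # matching the mid-regime entry of Table 5.2.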

Figure 5.1: Graphs of fitted ESTAR and AR models along with Employment data
(first difference of logarithm)

The ESTAR model makes two major contributions. First, it explains the big decrease in
employment at the beginning of the 1920s better than the linear AR model. Second, it tracks
the data well from the mid-1960s onwards, where the linear model fails. Whereas the series up
to the 1960s displays little or no cyclical movement, roughly the last quarter of the
century is characterized by a prominent asymmetric cycle with a rather short period.

(b) Example 5.2: Fitting of LSTAR Model


El Nino Southern Oscillation (ENSO) is a disruption of the ocean-atmosphere system in
the tropical Pacific Ocean that has important consequences for global weather conditions.
The common-day usage of the term El Nino refers to the extensive warming of the central
and eastern Pacific Ocean. La Nina is the other phase, when the sea surface temperatures in
the central and eastern Pacific Ocean are unusually low. Together, these two natural
processes form the ENSO. It has been observed that ENSO events occur, on average, every 4.5
years, with a range of 2-10 years. The Southern Oscillation Index (SOI), an index commonly
used as a measure of El Nino events, is employed here to describe the dynamic behaviour of
the ENSO during turbulent periods. The SOI is calculated from monthly or seasonal
fluctuations in the air pressure difference between Tahiti and Darwin and is defined here as
follows:

SOI = 10 × (PDIFF − PDIFFAVE) / SD(PDIFF),

where PDIFF = Mean Sea Level Pressure (MSLP) difference between Tahiti and Darwin,
PDIFFAVE = long-term average of PDIFF, and SD(PDIFF) = standard deviation of PDIFF. This
definition of the SOI is the one given by the Australian Commonwealth Bureau of Meteorology.
The monthly data from January 1876 to May 1998 are used here for modelling purposes.


A casual inspection of the data indicates possible turbulent periods and asymmetric cyclical
variations, in the sense that downturns appear to occur more rapidly than recoveries;
therefore, STAR models are particularly appropriate for describing such data (Figure 5.2).
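
For completeness, a minimal sketch of the SOI standardization in Python, assuming the monthly Tahiti and Darwin MSLP series are available as arrays; the variable names are placeholders, and the mean and standard deviation are taken over the full sample here.

    import numpy as np

    def soi(tahiti_mslp, darwin_mslp):
        # SOI = 10 * (PDIFF - PDIFFAVE) / SD(PDIFF), with PDIFF = Tahiti MSLP - Darwin MSLP
        pdiff = np.asarray(tahiti_mslp, dtype=float) - np.asarray(darwin_mslp, dtype=float)
        return 10.0 * (pdiff - pdiff.mean()) / pdiff.std(ddof=1)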

Figure 5.2: The Southern Oscillation Index (SOI), monthly values from January 1876 to May 1998 (vertical axis: value of SOI; horizontal axis: year)

The results of Hall et al. (2001), who fitted an LSTAR model to the above data, are now
discussed. Using AIC, a lag length of 13 is chosen for the linear autoregressive model. A
summary of the results of the test for nonlinearity is presented in Table 5.3, which contains
the lag length p of the autoregressive model chosen by AIC and the p-value of the test
corresponding to values of the delay parameter d from 1 to 5. Linearity is rejected against
STAR models, particularly for delay lengths of 1, 3 and 5. For these delay values, the results
of the model selection test sequence for choosing between ESTAR and LSTAR are also shown
in Table 5.3, and the LSTAR model is the preferred model. A number of LSTAR models with
a variety of delay variables are estimated. The selected model for this time-series is an
LSTAR(13) model with d = 1. The intercept in the linear part of the selected model is
constrained to be equal in magnitude, but opposite in sign, to that in the nonlinear part of the
equation. This imposes the restriction that in normal times the estimated mean of the SOI
will be zero.

Table 5.3: Results of linearity tests and model selection

Variable   p    d   p(FL)   p(F4)   p(F3)   p(F2)   STAR
SOI        13   1   0.033   0.076   0.315   0.074   L
SOI        13   2   0.108
SOI        13   3   0.004   0.008   0.292   0.051   L
SOI        13   4   0.331
SOI        13   5   0.058   0.106   0.443   0.071   L


Here p is the number of lags in the linear autoregressive model, d is the delay parameter,
p(FL) is the p-value of the linearity test, and p(F2), p(F3), p(F4) are the p-values of the tests
in the model selection sequence; the last column gives the selected model family (E = ESTAR,
L = LSTAR). E is selected if p(F3) is the smallest p-value of the sequence; otherwise L is
selected (Terasvirta, 1994).

The estimated LSTAR model is:

y_t = −108.479 + 1.880 y_{t−1} + 1.073 y_{t−2} + 3.175 y_{t−3} − 2.313 y_{t−4} − 1.431 y_{t−5} + 2.860 y_{t−6}
      (2.06)     (1.38)         (0.77)         (1.45)         (−1.07)        (−0.82)        (1.46)
    + 1.420 y_{t−7} − 1.098 y_{t−8} − 0.434 y_{t−9} − 0.641 y_{t−10} + 1.424 y_{t−11} − 3.698 y_{t−12} + 2.731 y_{t−13}
      (1.27)         (−0.74)        (−0.31)        (−0.50)         (1.05)          (−1.94)         (1.44)
    + [108.479 − 1.392 y_{t−1} − 0.953 y_{t−2} − 3.133 y_{t−3} + 2.379 y_{t−4} + 1.477 y_{t−5} − 2.800 y_{t−6}
      (2.06)     (−1.06)        (−0.57)        (−1.43)        (1.12)         (0.84)         (−1.42)
      − 1.419 y_{t−7} + 1.081 y_{t−8} + 0.500 y_{t−9} + 0.570 y_{t−10} − 1.431 y_{t−11} + 3.699 y_{t−12} − 2.795 y_{t−13}]
      (−1.29)        (0.73)         (0.31)         (0.44)          (−1.15)         (1.94)          (−1.48)
    × [1 + exp{−9.858 (y_{t−1} + 27.294) / 7.769}]⁻¹
              (−2.39)          (6.71)

T = 1469, AIC = 4.120, s_nonlinear / s_linear = 0.993,


where T is the sample size, and the ratio s_nonlinear / s_linear gives an idea of the relative gain in
fit from applying an LSTAR model instead of a linear autoregressive model. The ratio
s_nonlinear / s_linear is quite close to unity, which indicates that the overall gain in fit is modest
and that the nonlinearity is mainly needed to characterize exceptional periods in the series.
The nonlinear model describes the most turbulent periods in the data better than the linear AR
model. The estimated equation contains the parameter restriction φ_0 = −θ_0, so that under the
normal regime (G = 1) the local mean of the process equals zero. The intercept only
contributes to the process when G < 1, which is a rare event. The dynamic behaviour of the
model is characterized in two ways. First, by interpreting individual parameter estimates or
the delay, which does not give much useful information (γ̂ is the only exception). Second, by
computing the roots of the characteristic polynomial at given values of the transition function,
which is more informative. The extreme values G = 0 and G = 1 are particularly interesting.
The roots of the characteristic polynomial for G = 0 and G = 1 are given in Table 5.4.

Table 5.4: Roots of characteristic polynomials for values of the transition function G

               Regime G = 0                            Regime G = 1
Root             Modulus   Period        Root              Modulus   Period
2.62             2.62                    0.90 ± 0.13 i     0.91      42.4
−0.56 ± 1.24 i   1.36      3.15          −0.45 ± 0.71 i    0.84      2.94
−1.01 ± 0.31 i   1.05      2.21          0.61 ± 0.53 i     0.81      8.78
−0.57 ± 0.84 i   1.02      2.89          −0.73 ± 0.33 i    0.80      2.31
0.75 ± 0.67 i    1.01      8.62          0.30 ± 0.73 i     0.79      5.32
0.23 ± 0.82 i    0.85      4.82          −0.02 ± 0.77 i    0.77      3.93
0.80 ± 0.17 i    0.82      29.2          −0.73             0.73


For G = 1, all roots are stationary, but for G = 0 there exists a real root greater than unity,
as well as a number of unstable complex roots. This implies that the LSTAR model lacks
stability and so has limited usefulness for forecasting purposes, although it could still be
used for predicting a few months ahead. Probably no univariate model can correctly
predict an outbreak of El Nino, because the initial shock or shocks triggering such an event
are exogenous to the univariate system. More information, in the form of other, predictable,
variables would be needed to improve this performance. Once El Nino is under way,
however, the model may help predict the strength of the disturbance a few months ahead.
Thus the main gain from this modelling exercise seems to be the improved understanding
of the nonlinear dynamic behaviour of the ENSO.
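
Since the conditional mean of a STAR model is nonlinear, multi-step forecasts such as those mentioned above are usually obtained by simulation rather than by iterating the fitted equation deterministically. A minimal sketch for an LSTAR model with k = 1, using placeholder parameter values rather than the estimates reported in this section:

    import numpy as np

    def lstar_forecast(history, phi, theta, gamma, c, d, horizon, sigma,
                       n_paths=5000, seed=0):
        # Monte Carlo h-step forecasts from an LSTAR(p) model with k = 1
        rng = np.random.default_rng(seed)
        p = len(phi) - 1                                  # phi = (phi_0, phi_1, ..., phi_p)
        paths = np.tile(np.asarray(history, dtype=float), (n_paths, 1))
        for _ in range(horizon):
            w = np.column_stack([np.ones(n_paths)] + [paths[:, -j] for j in range(1, p + 1)])
            G = 1.0 / (1.0 + np.exp(-gamma * (paths[:, -d] - c)))
            mean = w @ phi + (w @ theta) * G              # skeleton of model (2.1)
            paths = np.column_stack([paths, mean + sigma * rng.standard_normal(n_paths)])
        return paths[:, -horizon:].mean(axis=0)           # point forecasts for h = 1, ..., horizon

    # Illustrative call with made-up parameter values (not a fitted model):
    phi   = np.array([0.0, 0.6, -0.2])
    theta = np.array([0.0, 0.3,  0.1])
    recent = np.array([0.5, -0.3, 0.8])   # most recent observations, oldest first
    print(lstar_forecast(recent, phi, theta, gamma=5.0, c=0.0, d=1, horizon=3, sigma=1.0))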

6. Conclusions
The STAR family of nonlinear time-series models has been thoroughly studied. These types of
models are of particular importance for describing data sets that exhibit cyclical variations
along with turbulent periods. Application of these models to cyclical data in fisheries, lac
production/export, etc. is required. There is also a need to apply, in due course of time,
several extensions of the above models, viz. Time-varying STAR models and STAR models
for vector time-series.

References
Arango, L. E. and Gonzalez, A. (2001). Some evidence of smooth transition nonlinearity
in Colombian inflation. Applied Economics, 33, 155-162.
Baharumshah, A. Z. and Liew, V. K. (2006). Forecasting performance of exponential
smooth transition autoregressive exchange rate models. Open Economies Review,
17, 235-251.
Fan, J. and Yao, Q. (2003). Nonlinear Time Series: Nonparametric and Parametric
Methods. Springer, New York.
Hall, A. D., Skalin, J. and Terasvirta, T. (2001). A nonlinear time series model of El Nino.
Environmental Modelling and Software, 16, 139-146.
Leybourne, S., Newbold, P. and Vougas, D. (1998). Unit roots and smooth transitions.
Journal of Time Series Analysis, 19, 83-97.
Skalin, J. and Terasvirta, T. (1999). Another look at Swedish business cycles, 1861-1988.
Journal of Applied Econometrics, 14, 359-378.
Terasvirta, T. (1994). Specification, estimation, and evaluation of smooth transition
autoregressive models. Journal of the American Statistical Association, 89, 208-218.
Terasvirta, T., van Dijk, D. and Medeiros, M. C. (2005). Linear models, smooth transition
autoregressions, and neural networks for forecasting macroeconomic time series: A
re-examination. International Journal of Forecasting, 21, 755-774.
