Module 4: Time Series Analysis

The document provides an overview of business forecasting techniques, focusing on smoothing and extrapolation of time series data. It discusses methods such as moving averages, exponential smoothing, and seasonal adjustment techniques like X-12-ARIMA and STL. It also covers properties of stochastic time series, linear time series models, and ARIMA models for non-stationary processes.

Business Forecasting - 4th Sem MBA Notes

Prof. Chitranka K, ch.E, chart PR, ChartMktr

Smoothing and Extrapolation of Time Series

Smoothing

Smoothing techniques aim to reduce noise in time series data, making it easier to
identify underlying trends and patterns.

Common Smoothing Techniques:

● Moving Averages:
○ Simple Moving Average (SMA): This technique calculates the average of a
specified number of past observations. For example, with monthly sales
data, a 3-month SMA for January is the average of sales in November,
December, and January.
■ Example: If sales for November, December, and January are 100,
120, and 140, respectively, the 3-month SMA for January is
(100 + 120 + 140) / 3 = 120.

● Exponential Moving Average (EMA): This method gives more weight to recent
observations, making it more responsive to changes. The update rule is
EMA_t = α × X_t + (1 − α) × EMA_(t−1).
○ Example: Using a smoothing factor (α) of 0.2, if the previous EMA was
120 and the current value is 140, the new EMA is
0.2 × 140 + 0.8 × 120 = 124.
● Exponential Smoothing: This method allows for forecasting future values based
on past observations, adjusting for trends and seasonality.
○ Example: If you have a sales series with a recent trend, you can apply
Holt’s linear trend method to forecast future sales by considering both the
level and the trend.
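
As a minimal sketch of these calculations (assuming pandas is available; the sales figures are the ones from the examples above):

```python
import pandas as pd

# Monthly sales from the examples above
sales = pd.Series([100, 120, 140], index=["Nov", "Dec", "Jan"])

# 3-month Simple Moving Average for January: (100 + 120 + 140) / 3
sma_jan = sales.rolling(window=3).mean().iloc[-1]   # 120.0

# One EMA update with smoothing factor alpha = 0.2:
# EMA_t = alpha * X_t + (1 - alpha) * EMA_(t-1)
alpha = 0.2
prev_ema, current = 120.0, 140.0
ema = alpha * current + (1 - alpha) * prev_ema      # 124.0
```

For Holt's linear trend method, statsmodels provides an implementation in `statsmodels.tsa.holtwinters`.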

Extrapolation

Extrapolation involves predicting future values based on historical data.

Linear Extrapolation: This method fits a straight line to historical data and extends it to
predict future values.

Polynomial Extrapolation: When the data shows a non-linear trend, polynomial
regression can be used to fit a curve and extend it forward.
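
A brief sketch of both approaches with NumPy, using made-up data that follows a linear trend:

```python
import numpy as np

t = np.arange(1, 7)                       # periods 1..6
y = np.array([10.0, 12, 14, 16, 18, 20])  # linear trend, slope 2

# Linear extrapolation: fit a straight line and extend it to period 7
slope, intercept = np.polyfit(t, y, deg=1)
forecast_linear = slope * 7 + intercept   # 22.0

# Polynomial extrapolation: a degree-2 fit for curved trends
coeffs = np.polyfit(t, y, deg=2)
forecast_quad = np.polyval(coeffs, 7)
```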

Seasonal Adjustment

Seasonal adjustment is the process of removing seasonal effects from time series data,
allowing for a clearer view of underlying trends and cycles.

Common Methods:
● X-12-ARIMA: This method adjusts for seasonality by decomposing the time
series into seasonal, trend, and irregular components. It is widely used in
economic data.
○ Example: Monthly retail sales data may show a consistent increase in
December due to holiday shopping. X-12-ARIMA adjusts for this seasonal
effect to reveal the underlying trend.
● STL (Seasonal and Trend decomposition using Loess): This method decomposes
a time series into seasonal, trend, and remainder components using local
regression.
○ Example: If you have monthly temperature data, STL can help separate the
seasonal variations (e.g., higher temperatures in summer) from the overall
trend (e.g., increasing average temperatures over decades).

Properties of Stochastic Time Series

Understanding the properties of stochastic time series is essential for modeling and
forecasting.

Autocorrelation Function (ACF)

● The ACF measures the correlation between observations at different lags. It
helps identify patterns and the presence of seasonality.
○ Example: If you have daily temperature data, the ACF can show whether
today’s temperature is correlated with temperatures from previous days
(e.g., a strong correlation with yesterday’s temperature).

Stationarity

● A stationary time series has a constant mean, constant variance, and
autocovariances that depend only on the lag, not on time. Non-stationary
series often need to be transformed (e.g., by differencing) to achieve
stationarity.
○ Example: A time series of stock prices is often non-stationary due to
trends. Differencing the data (subtracting the previous observation from
the current one) can help achieve stationarity.

Random Walk
● A random walk is a non-stationary process in which the current value equals
the previous value plus a random step. Because the steps are unpredictable,
the best forecast of the next value is simply the current value.
○ Example: Stock prices often follow a random walk, where today's price is
influenced by yesterday's price plus a random shock.
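
A quick NumPy simulation of a random walk, illustrating why the naive "today's value" forecast is the best one can do:

```python
import numpy as np

rng = np.random.default_rng(7)

# Random walk: x_t = x_(t-1) + shock_t, starting from 100
shocks = rng.normal(0, 1, 1000)
walk = 100 + np.cumsum(shocks)

# The best forecast of tomorrow's value is simply today's value,
# because the expected shock is zero
naive_forecast = walk[-1]

# The one-step changes themselves are unpredictable white noise
changes = np.diff(walk)
```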

Cointegrated Time Series

● When two or more non-stationary series move together over time, they are said to
be cointegrated, indicating a long-term equilibrium relationship.
○ Example: The prices of two related stocks may drift apart in the short term
but tend to revert to a long-term relationship, indicating cointegration.

Linear Time Series Models

Linear time series models are foundational for analyzing time series data.

Moving Average (MA) Models

● These models express the current value as a function of past error terms.
○ Example: An MA(1) model uses the previous period's error to predict the
current value: Y_t = μ + ε_t + θ1·ε_(t−1).

Autoregressive (AR) Models

● These models express the current value as a function of its past values.
○ Example: An AR(1) model predicts the current value based on its previous
value: Y_t = c + φ1·Y_(t−1) + ε_t.

Mixed Autoregressive and Moving Average (ARMA) Models

● ARMA models combine AR and MA components to model time series data.


○ Example: An ARMA(1,1) model combines one lag of the dependent
variable and one lag of the error term:
Y_t = c + φ1·Y_(t−1) + ε_t + θ1·ε_(t−1).

Homogeneous Non-Stationary Processes

Non-stationary processes require special modeling approaches.

ARIMA Models (AutoRegressive Integrated Moving Average)

● ARIMA models are used for non-stationary time series that can be made
stationary through differencing. The model is characterized by three parameters:
(p, d, q).
○ Example: An ARIMA(1,1,1) model indicates one autoregressive term, one
differencing operation, and one moving average term.

Box-Jenkins Methodology

● A systematic approach for identifying, estimating, and diagnosing ARIMA
models. It involves:
i. Identification: Analyzing ACF and PACF plots to determine p and q.
ii. Estimation: Fitting the model to the data.
iii. Diagnosis: Checking residuals for randomness.

Specification of ARIMA Models

● This involves determining the appropriate values of p, d, and q based on the data
characteristics.
○ Example: If the PACF cuts off after lag 2, you might choose p = 2; a
cutoff in the ACF suggests the value of q. If the series is
non-stationary, d might be 1.

SARIMA (Seasonal ARIMA)

● Extends ARIMA to capture seasonality in time series data. It adds seasonal
parameters (P, D, Q) and a seasonal period to the model.
○ Example: A SARIMA(1,1,1)(1,1,1)[12] model indicates one non-seasonal
AR term, one non-seasonal differencing, one non-seasonal MA term, and
one seasonal AR, seasonal differencing, and seasonal MA term with a
period of 12 (e.g., monthly data).

ARMAX (Autoregressive Moving Average with Exogenous Inputs)

● Incorporates external variables into the ARMA framework, allowing for the
modeling of relationships with other time series.
○ Example: If you are modeling sales based on past sales data and
advertising spend, the ARMAX model would include both the AR and MA
components along with the advertising variable.
