Time Series Mid Term-1

Time Series

A time series is a sequence of data points collected or recorded at regular intervals over time. The key
feature of time series data is that the order of the data points matters, as it reflects changes in the
observed variable over time.

Examples of Time Series:

1. Stock Prices: Daily closing prices of a stock on the stock market.

2. Weather Data: Daily temperatures recorded over a year.

3. Sales Data: Monthly sales revenue of a retail store.

4. Economic Indicators: Quarterly GDP growth or unemployment rates over several years.

5. Electricity Consumption: Hourly electricity usage in a household or business.

6. Traffic Data: Number of cars passing a toll booth every minute.

7. Website Traffic: Number of visitors to a website measured every day or every hour.

8. Exchange Rates: Daily exchange rates between two currencies, like USD to EUR.

9. Heart Rate Data: A person's heart rate measured at regular intervals, such as every second.

10. Birth Rates: Annual birth rates in a specific country over several decades.
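Any of the examples above can be represented in code as an ordered sequence indexed by time. A minimal sketch using pandas (the price values are hypothetical, invented only for illustration):

```python
import pandas as pd

# Hypothetical daily closing prices indexed by date -- the defining
# feature of a time series is the ordered time index.
dates = pd.date_range("2024-01-01", periods=5, freq="D")
prices = pd.Series([101.2, 102.5, 101.8, 103.0, 104.1], index=dates)

# Order matters: the index is monotonically increasing in time.
print(prices.index.is_monotonic_increasing)
```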

Why do we need a separate analysis for time series?
We need a separate analysis for time series data because the data points are dependent on the order in
which they are collected over time. Unlike regular (cross-sectional) data where observations are
independent, time series data has specific characteristics, such as trends, seasonality, cycles, and time-based dependencies, which require specialized analytical techniques.

Reasons for Separate Time Series Analysis:

1. Temporal Dependency: In time series data, the current value is often dependent on previous
values. For example, in stock prices, today's price might depend on the price of the previous day.

o Example: Predicting tomorrow's stock price requires considering the trend and past
prices, which a regular regression model wouldn’t account for properly.

2. Trend Detection: Time series analysis helps to identify long-term trends in the data, which might
be increasing, decreasing, or stable over time.

o Example: Analyzing global temperatures over decades helps to identify climate change
trends.

3. Seasonality: Many time series have recurring patterns or cycles that repeat over fixed intervals,
such as daily, monthly, or yearly.
o Example: Retail stores experience a spike in sales during the holiday season every year.
Time series models can capture this seasonal pattern.

4. Stationarity: Time series data often needs to be stationary (constant mean and variance over
time) for proper modeling. Special transformations or differencing are applied to make the data
stationary, unlike regular data analysis.

o Example: Analyzing inflation rates requires converting non-stationary data into a stationary form before accurate predictions can be made.

5. Autocorrelation: Time series analysis accounts for the autocorrelation (the correlation of a
variable with its own past values), which regular analysis methods ignore.

o Example: In electricity demand forecasting, today's usage may be highly correlated with
the previous day's usage, and ignoring this autocorrelation can lead to inaccurate
predictions.

6. Forecasting: Time series analysis is particularly suited for forecasting future values based on
historical data, as it considers the temporal structure of the data.

o Example: Forecasting next year's sales for a company based on past sales data requires
time series models like ARIMA, which specifically handle time-based patterns.

7. Noise Filtering: Time series analysis techniques can help filter out random noise to reveal the
true underlying pattern.

o Example: In heart rate monitoring, random spikes in the data can be smoothed to
understand the actual trend in a patient’s health over time.
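The temporal dependency and autocorrelation points above can be checked numerically. A minimal sketch, using a synthetic random walk (a series where each value builds on the previous one, so adjacent values are strongly correlated):

```python
import numpy as np
import pandas as pd

# Synthetic series with strong dependence on past values: a random walk,
# where each point is the previous point plus a random step.
rng = np.random.default_rng(0)
walk = pd.Series(np.cumsum(rng.normal(0, 1, 200)))

# Lag-1 autocorrelation: correlation of the series with itself shifted
# by one step. For a random walk this is close to 1.
lag1 = walk.autocorr(lag=1)
print(round(lag1, 3))
```

A regular regression model that treats these 200 points as independent observations would ignore exactly this dependence.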

Why Regular Analysis Doesn’t Work:

• Regular regression or classification models assume that the observations are independent of
each other, which isn’t the case in time series.

• Standard models ignore patterns like trend, seasonality, and autocorrelation, which are essential
for accurate analysis and forecasting in time series data.

Components of Time Series with Examples

A time series is a collection of data points measured at regular intervals over time. Its analysis helps in
understanding trends and making forecasts. The key components of a time series are:

1. Trend

• Definition: The overall, long-term movement in the time series data, either increasing,
decreasing, or flat.

• Example: The steady increase in global average temperature over the last century.
• Linear Trends: When the increase or decrease happens at a constant rate.

• Non-Linear Trends: Growth or decline at a varying rate over time.

• Global Examples: Population growth, stock market index growth, GDP growth.

• Local Examples: Increasing sales revenue for a growing business or decreasing traffic in a
shrinking town.

• Types of Trends: Upward, downward, and constant trends, observed over years or decades.

• Smoothing Methods: Moving averages and exponential smoothing techniques can be used to
observe the underlying trend in noisy data.
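The smoothing-methods bullet can be sketched with a centered moving average on synthetic noisy data (trend slope and noise level are arbitrary illustration values):

```python
import numpy as np
import pandas as pd

# Synthetic data: a linear trend (slope 0.5) buried in Gaussian noise.
rng = np.random.default_rng(1)
t = np.arange(120)
y = pd.Series(0.5 * t + rng.normal(0, 5, 120))

# A 12-point centered moving average smooths out the noise,
# exposing the underlying trend.
smooth = y.rolling(window=12, center=True).mean()

# Deviations from the true trend shrink after smoothing.
resid_raw = (y - 0.5 * t).std()
resid_smooth = (smooth - 0.5 * t).std()
print(resid_smooth < resid_raw)
```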

2. Seasonality

• Definition: Patterns or fluctuations that repeat over a specific period (e.g., daily, monthly,
yearly).

• Example: Increased hotel bookings during summer vacation or holiday shopping season.

• Fixed Intervals: Seasonal patterns occur at fixed, known intervals, such as weekly, monthly, or
annually.

• Economic Seasonality: Tax filing spikes during tax season, or increased toy sales during
Christmas.

• Cultural Seasonality: Changes in tourism related to holidays like New Year's or religious festivals.

• Environmental Seasonality: Agricultural harvest patterns or seasonal demand for winter clothes.

• Models: SARIMA models and seasonal decomposition techniques handle seasonality in data.

• Adjusting for Seasonality: By removing seasonal effects, analysts can focus on the trend and
irregular components.

3. Cyclicality

• Definition: Long-term fluctuations that do not follow a fixed, regular interval but repeat over
time.

• Example: Economic cycles of expansion and contraction over periods of 3-10 years.

• Difference from Seasonality: Cycles don’t have a fixed duration and can span many years, unlike
seasonality, which has a set period.

• Economic Cycles: Boom and bust cycles, such as the dot-com bubble or housing market crash.

• Business Cycles: Companies may have periods of high and low profitability, often tied to broader
market trends.

• Stock Market Cycles: Bull and bear markets that reflect overall investor sentiment and
macroeconomic conditions.
• Factors Influencing Cycles: Inflation, interest rates, government policy, and technological
innovation.

• Cyclic Analysis: Methods like the Hodrick-Prescott filter help in isolating cycles from trend
components.

4. Irregular (Random) Component

• Definition: Unpredictable, random fluctuations in time series data that cannot be explained by
trend, seasonality, or cyclicality.

• Example: Sudden spikes in stock prices due to an unexpected event, like a natural disaster or a
global pandemic.

• Short-Term Fluctuations: Temporary changes in data, such as a sudden increase in demand for a
product due to media coverage.

• Noise in Data: These variations are often considered noise in time series analysis and are not
part of the overall pattern.

• Exogenous Events: Events outside of normal cycles, such as political events, new regulations, or
unforeseen company-specific news.

• Financial Irregularity: Stock price fluctuations due to rumors or market speculation.

• Impact on Forecasting: Irregular components make forecasting more difficult, as they introduce
random variability.

• Residual Analysis: After modeling trend, seasonality, and cycles, the remaining data (residuals)
represent the irregular component.

Advantages of Time Series Analysis

1. Captures Temporal Patterns: It identifies trends, seasonality, and cycles in data over time,
providing insights into patterns that occur at regular intervals.

o Example: Recognizing seasonal sales increases during the holiday season.

2. Improved Forecasting: Time series models, such as ARIMA, enable accurate forecasting by
utilizing historical data to predict future values.

o Example: Forecasting next year's retail sales based on past performance.

3. Handles Seasonality: Time series analysis can adjust for seasonal effects, improving the accuracy
of forecasts.

o Example: Modeling quarterly sales by accounting for holiday season spikes.

4. Identifies Long-Term Trends: Helps to separate short-term fluctuations from long-term trends,
which can be crucial for strategic planning.

o Example: Identifying a steady increase in global temperatures over decades.


5. Data Smoothing: Techniques like moving averages smooth out short-term fluctuations, making it
easier to observe overall trends.

o Example: Smoothing stock price data to reveal the underlying market trend.

Disadvantages of Time Series Analysis

1. Requires Large Datasets: Time series analysis often needs a substantial amount of historical data
to identify reliable patterns.

o Example: A short time series of sales data may not be sufficient to identify long-term
trends.

2. Sensitive to Missing Data: Time series models can be affected by gaps in data, requiring
imputation or special handling of missing values.

o Example: Missing days in a weather dataset can impact forecast accuracy.

3. Assumes Continuity of Past Patterns: The assumption that past patterns will continue into the
future may not hold true in dynamic environments.

o Example: Predicting sales growth using pre-pandemic data could be inaccurate post-pandemic.

4. Complex Model Selection: Choosing and tuning the right model, such as ARIMA or SARIMA, can
be complex and may require specialized knowledge.

o Example: Determining the appropriate parameters for an ARIMA model (p, d, q) can be
difficult.

5. Vulnerable to Sudden Changes: Time series models may struggle to accommodate sudden,
unpredictable events or outliers.

o Example: The 2008 financial crisis led to a dramatic change in economic trends, making
forecasts based on prior data inaccurate.

Classification of Time Series

Time series can be classified into several categories based on various characteristics. Here’s a detailed
classification focusing on discrete, continuous, deterministic, stochastic, and other relevant types:

1. Based on Nature of Time Points

Discrete Time Series

• Definition: Data points are recorded at distinct, separate intervals (e.g., daily, monthly,
quarterly).

• Example: Monthly unemployment rates or daily stock prices.


Continuous Time Series

• Definition: Data points are recorded continuously over time, capturing every moment without
interruption.

• Example: Real-time temperature readings or stock prices monitored every second.

2. Based on Predictability

Deterministic Time Series

• Definition: Future values can be predicted with certainty based on the known past values. These
series follow a specific pattern or function.

• Example: A time series that follows a mathematical equation, such as a linear trend (e.g., y = mx + b).

Stochastic Time Series

• Definition: Future values are influenced by random variables, making them inherently
unpredictable. These series contain an element of randomness.

• Example: Daily stock prices that fluctuate due to market conditions and investor behavior.

3. Based on Stationarity

Stationary Time Series

• Definition: The statistical properties (mean, variance, autocorrelation) do not change over time.
Stationary series are easier to model and analyze.

• Example: A series of fluctuations around a constant mean (like white noise).

Non-Stationary Time Series

• Definition: The statistical properties change over time, often exhibiting trends or seasonality.
Non-stationary data may require transformation to achieve stationarity.

• Example: Economic indicators, such as GDP growth, which often show long-term upward or
downward trends.
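The stationary/non-stationary distinction can be illustrated by comparing summary statistics over different time windows (synthetic data; a formal test such as the augmented Dickey-Fuller test would be used in practice):

```python
import numpy as np

rng = np.random.default_rng(42)

# Stationary: white noise fluctuating around a constant mean.
stationary = rng.normal(0, 1, 500)
# Non-stationary: the same noise plus a linear trend, so the mean drifts.
trending = stationary + np.linspace(0, 10, 500)

# Compare the mean of the first half against the second half.
def half_means(x):
    return x[:250].mean(), x[250:].mean()

m1, m2 = half_means(stationary)
n1, n2 = half_means(trending)
print(abs(m1 - m2) < 0.5)   # stationary: mean roughly constant
print(abs(n1 - n2) > 3)     # trending: mean shifts noticeably
```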

4. Based on Trends and Patterns

Trend Time Series

• Definition: Exhibits a long-term upward or downward movement over time.

• Example: The gradual increase in global temperatures over decades.

Seasonal Time Series

• Definition: Displays regular fluctuations that repeat at specific intervals (daily, monthly, or
yearly).

• Example: Retail sales that increase during holiday seasons each year.
Cyclical Time Series

• Definition: Shows fluctuations that occur at irregular intervals, often related to economic cycles
and not fixed periods.

• Example: Business cycles that show periods of economic expansion and contraction over several
years.

Irregular Time Series

• Definition: Contains random variations or noise that cannot be attributed to trend or seasonality.

• Example: Stock prices affected by sudden news or market events.

5. Based on Length of Observation

Short-Term Time Series

• Definition: Data collected over a short period, often used for immediate forecasting and
decision-making.

• Example: Daily sales data for a retail store over a month.

Long-Term Time Series

• Definition: Data collected over an extended period, used for long-term forecasting and trend
analysis.

• Example: Annual population growth data over several decades.

6. Based on Dimensionality

Univariate Time Series

• Definition: Consists of a single variable observed over time.

• Example: Daily closing prices of a single stock.

Multivariate Time Series

• Definition: Involves multiple variables observed simultaneously over time, allowing for the
analysis of relationships between different time series.

• Example: Monthly sales data for different products alongside advertising expenditure.

Non-Stationary Time Series

A non-stationary time series is a type of time series data whose statistical properties, such as mean,
variance, and autocorrelation, change over time. In other words, the behavior of the time series data
varies with time, making it more challenging to analyze and model. Non-stationarity can arise from
various factors, including trends, seasonality, and structural changes in the data.
Characteristics of Non-Stationary Time Series

1. Changing Mean: The average value of the series varies over time.

o Example: Monthly sales data that shows increasing trends over several years.

2. Changing Variance: The variability in the data changes over time, often exhibiting patterns like
volatility clustering.

o Example: Stock prices may show increasing volatility during certain periods.

3. Seasonal Effects: Seasonal patterns may be present, causing fluctuations that repeat over fixed
intervals.

o Example: Retail sales often peak during holiday seasons.

4. Trends: Long-term upward or downward movements may exist in the data.

o Example: Global temperature data showing a consistent rise over decades.
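A standard remedy for a changing mean is first differencing. A minimal sketch on a synthetic random walk (a classic non-stationary series): subtracting each value from the next recovers the stationary random increments.

```python
import numpy as np
import pandas as pd

# A random walk: non-stationary, since its mean wanders over time.
rng = np.random.default_rng(7)
walk = pd.Series(np.cumsum(rng.normal(0, 1, 300)))

# First differencing (y_t - y_{t-1}) recovers the underlying
# stationary increments, which have mean ~0 and constant variance.
diffed = walk.diff().dropna()
print(round(diffed.mean(), 2), round(diffed.std(), 2))
```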

Types of Non-Stationarity

1. Trend Non-Stationarity: The mean of the series changes over time, often due to a long-term
upward or downward trend.

o Example: A time series representing the annual GDP of a country that consistently
grows.

2. Seasonal Non-Stationarity: The series exhibits seasonal patterns that repeat over time, affecting
the mean and variance.

o Example: Monthly electricity consumption showing higher values during summer months.

Kinetic Models

Kinetic models focus on the rates of change of a system and how these rates affect the system's state.
They are often used in contexts where the time-dependent behavior of particles, molecules, or agents is
of interest.

Characteristics of Kinetic Models:

1. Rate-Based: Kinetic models describe how quickly a system evolves over time.

2. Dynamic Interactions: They often involve interactions between different components or agents
in a system.

3. Mathematical Formulation: Typically involve differential equations that express the rates of
change.
Dynamic Models

Dynamic models focus on how a system changes over time and can incorporate both state and time-dependent variables. They are used in various disciplines to simulate the behavior of complex systems.

Characteristics of Dynamic Models:

1. Time-Dependent: They consider how the state of the system evolves over time.

2. State Variables: Dynamic models often include multiple state variables that represent the
system's characteristics.

3. Equations of Motion: These models often use differential equations to describe changes in state
variables.

Classical Decomposition Model

The classical decomposition model separates a time series into its fundamental components:

• Trend (T): The long-term direction in the data.

• Seasonal (S): Regular fluctuations that occur at specific intervals (e.g., yearly, quarterly).

• Cyclical (C): Irregular fluctuations related to business cycles, which are not fixed in duration.

• Irregular (I): Random noise or residuals that cannot be attributed to trend, seasonal, or cyclical
components.

Mathematical Representation:

The time series can be expressed as:

• Additive Model:

Y_t = T_t + S_t + C_t + I_t

• Multiplicative Model:

Y_t = T_t × S_t × C_t × I_t
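A quick numeric illustration of the two models at a single time point t (all component values below are hypothetical, chosen only to show the arithmetic):

```python
# Additive model: components are added in the series' own units.
T, S, C, I = 100.0, 20.0, 5.0, -2.0
Y_additive = T + S + C + I          # 100 + 20 + 5 - 2 = 123.0

# Multiplicative model: components act as scaling factors on the trend.
Tm, Sm, Cm, Im = 100.0, 1.20, 1.05, 0.98
Y_multiplicative = Tm * Sm * Cm * Im  # 100 * 1.20 * 1.05 * 0.98 = 123.48

print(Y_additive, round(Y_multiplicative, 2))
```

The additive form suits series whose seasonal swings stay roughly constant in size; the multiplicative form suits series whose swings grow with the level of the trend.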

Stationary time series

A stationary time series is one whose statistical properties do not change over time. In other words, the
mean, variance, and autocovariance of the series remain constant, regardless of the time period in which
the data is observed. Stationarity is a critical assumption in many time series analysis techniques,
including ARIMA and other modeling approaches.

Characteristics of Stationary Time Series

1. Constant Mean: The average value of the time series remains constant over time.

o Example: A time series of daily temperature readings that fluctuate around a stable
mean.
2. Constant Variance: The variability or dispersion of the data points around the mean is consistent
over time.

o Example: Monthly sales data that shows similar fluctuations around the average sales
figure throughout the year.

3. Autocovariance: The covariance of the time series with its lagged values is invariant to time
shifts.

o Example: The relationship between sales in one month and sales in the previous month
remains the same regardless of the specific months being analyzed.

Strict Stationarity

Strict Stationarity is a concept in time series analysis where the statistical properties of the series remain
completely unchanged over time. In simpler terms, a strictly stationary time series behaves exactly the
same, no matter when you observe it. The joint distribution (probability structure) of any subset of the
time series values does not vary, regardless of when that subset is observed.

Key Ideas of Strict Stationarity:

1. Unchanging Behavior Over Time:

o If you shift the time series forward or backward in time, its statistical properties (like
mean, variance, and distribution) remain exactly the same. This means you cannot tell
whether you're looking at data from yesterday, last year, or any other point in time just
by examining the data.

2. All Moments Stay the Same:

o Not only are the mean and variance constant over time, but also higher-order moments,
like skewness and kurtosis (measures of the shape of the data distribution), must also
stay constant.

3. Focus on the Entire Distribution:

o Strict stationarity deals with the entire joint probability distribution of the data, which is
a stronger requirement than just focusing on the mean and variance.

Example:

• White Noise: A classic example of a strictly stationary series is white noise, where each data
point is independent and identically distributed, usually with a constant mean (e.g., 0) and
variance. The randomness is uniform over time.

Simplified Comparison:

• Strict Stationarity means the series looks identical, no matter when you observe it. This applies
to every aspect of the data.
• In contrast, Weak Stationarity only requires that the mean, variance, and autocovariance
(relationship between current and past values) remain constant over time. It doesn’t demand
the entire probability distribution to remain unchanged, which is often a more practical
assumption in real-world data.

Why It’s Important:

Strict stationarity provides a very strong form of consistency in data, making it easier to model and
predict time series behaviors, but real-world data often does not meet this strict condition due to trends,
seasonality, or changing variance. Hence, many models rely on weaker forms of stationarity.

Normal white noise

Normal white noise is a type of time series where each data point is an independent and identically
distributed (i.i.d.) random variable drawn from a normal (Gaussian) distribution with a mean of zero and
a constant variance. In this context, "white" refers to the idea that the data points are uncorrelated with
each other (no predictable pattern), and "noise" refers to the random variation inherent in the series.
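Normal white noise is straightforward to simulate and to verify against its definition: mean near zero, constant variance, and no correlation between successive values. A minimal sketch with numpy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Normal white noise: i.i.d. draws from N(0, 1) -- zero mean, unit
# variance, and (in expectation) zero correlation between lags.
noise = rng.normal(loc=0.0, scale=1.0, size=10_000)

# Lag-1 correlation should be near zero: the points are uncorrelated.
lag1_corr = np.corrcoef(noise[:-1], noise[1:])[0, 1]
print(round(noise.mean(), 2), round(noise.std(), 2), round(lag1_corr, 2))
```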
