Time Series Part 2

A multiple-choice question bank covering Linear Stationary Processes, Moving Average (MA), Weighted Moving Average (WMA), Autoregressive (AR), ARMA, ARIMA, and SARIMA models. Each question tests definitions, characteristics, or conditions for these models, with answers provided. Key concepts such as stationarity, invertibility, and the role of seasonal components run throughout.


Linear Stationary Processes and White Noise

1. What is a purely random process also known as?


A. Brownian motion
B. Autocorrelation
C. White noise
D. Random walk
Answer: C

2. In a purely random process, the mean is:


A. Non-zero
B. Increasing over time
C. Zero
D. Undefined
Answer: C

3. A purely random process assumes which distribution?


A. Poisson
B. Normal
C. Binomial
D. Exponential
Answer: B

4. A linear stationary process can be expressed as:


A. A deterministic sum
B. A weighted sum of past white noises
C. A sine wave
D. A polynomial equation
Answer: B

5. For stationarity in a linear process, weights must:


A. Be random
B. Decrease rapidly
C. Be constant
D. Follow a normal distribution
Answer: B
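
The ideas behind questions 1-5 can be made concrete with a short simulation. This is a minimal sketch, assuming NumPy; the geometric weights are arbitrary illustrative values, not prescribed by the questions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Purely random process (white noise, Q1): i.i.d. N(0, 1) draws,
# so the mean is zero (Q2) and the distribution is normal (Q3).
Z = rng.normal(size=500)

# Linear stationary process (Q4): a weighted sum of past white noise.
# Geometrically decaying weights "decrease rapidly", which is what
# stationarity requires of the weight sequence (Q5).
weights = 0.7 ** np.arange(20)           # psi_j = 0.7**j, j = 0..19
X = np.convolve(Z, weights)[: len(Z)]    # X_t ~ sum_j psi_j * Z_(t-j)

print(X.mean(), X.var())  # mean near 0, variance near sum(psi_j**2)
```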

Moving Average (MA) Models

6. Moving average (MA) processes are frequently used in:


A. Chemistry
B. Biology
C. Econometrics
D. Geography
Answer: C

7. MA(q) stands for:


A. Multi-approach model
B. Moving average of order q
C. Maximum average
D. None
Answer: B

8. The white noise term in MA(q) is scaled so that:


A. β₀ = 2
B. β₀ = 0
C. β₀ = 1
D. β₀ = -1
Answer: C

9. The autocorrelation function of MA(q) becomes zero when:


A. lag < q
B. lag > q
C. lag = q
D. q = 0
Answer: B

10. The invertibility condition for MA(1) is:


A. |β₁| > 1
B. |β₁| < 1
C. β₁ = 0
D. |β₁| = 1
Answer: B
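
A quick way to see the MA(q) facts in questions 8-10 is to simulate an invertible MA(2) and look at its sample autocorrelations. A minimal sketch, assuming statsmodels; the coefficients 0.6 and 0.3 are arbitrary values chosen to satisfy invertibility.

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.stattools import acf

np.random.seed(1)

# MA(2) with the usual scaling beta_0 = 1 (Q8); beta_1 = 0.6 and
# beta_2 = 0.3 keep the process invertible (Q10).
ma = np.array([1.0, 0.6, 0.3])   # [beta_0, beta_1, beta_2]
ar = np.array([1.0])             # no AR part

x = ArmaProcess(ar, ma).generate_sample(nsample=5000)

# Sample ACF: clearly non-zero at lags 1 and 2, then essentially
# zero for every lag > q = 2 (Q9, Q33).
print(np.round(acf(x, nlags=5), 3))
```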

Weighted Moving Average (WMA)

11. In WMA, recent data points are given:


A. Less weight
B. Equal weight
C. No weight
D. Greater weight
Answer: D

12. In simple moving average, weights are:


A. Increasing
B. Random
C. Equal
D. Zero
Answer: C

13. WMA is most useful when:


A. All data is outdated
B. Future data is known
C. Recent changes are more relevant
D. No variance exists
Answer: C
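
Questions 11-13 contrast equal and increasing weights. The sketch below, assuming pandas and NumPy with arbitrary toy values, computes both averages side by side.

```python
import numpy as np
import pandas as pd

# Toy series; the numbers are arbitrary illustrative values.
s = pd.Series([10.0, 12.0, 11.0, 15.0, 18.0, 17.0, 21.0])

window = 3
w = np.arange(1, window + 1, dtype=float)   # weights 1, 2, 3

# WMA: the most recent point in each window gets the largest
# weight (Q11); the simple moving average weights all points
# in the window equally (Q12).
wma = s.rolling(window).apply(lambda x: np.dot(x, w) / w.sum(), raw=True)
sma = s.rolling(window).mean()

print(pd.DataFrame({"series": s, "SMA": sma, "WMA": wma}))
```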

Autoregressive (AR) Models


14. AR(p) process depends on:


A. Random noise only
B. Past p observations
C. Future values
D. Trends only
Answer: B

15. AR(1) model has how many lag terms?


A. 1
B. 0
C. 2
D. Infinite
Answer: A

16. For AR(p) stationarity, the roots of the characteristic equation must be:


A. Inside unit circle
B. On unit circle
C. Outside unit circle
D. Complex
Answer: C

17. AR(2) process includes:


A. 1 lag term
B. 2 lag terms
C. White noise only
D. No memory
Answer: B

18. Large negative α₁ in AR(1) leads to:


A. Smooth trend
B. Oscillatory zig-zag
C. Constant series
D. No effect
Answer: B
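
The contrast in question 18 (and question 36 later) is easy to reproduce. A minimal AR(1) simulation, assuming NumPy; the values 0.9 and -0.9 are the examples suggested by the questions.

```python
import numpy as np

def simulate_ar1(alpha, n=200, seed=0):
    """Simulate X_t = alpha * X_(t-1) + Z_t with N(0, 1) white noise."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = alpha * x[t - 1] + rng.normal()
    return x

smooth = simulate_ar1(0.9)    # strong positive lag-1 correlation: smooth path (Q36)
zigzag = simulate_ar1(-0.9)   # successive values tend to flip sign: zig-zag (Q18)

# Lag-1 sample autocorrelations make the contrast explicit.
print(np.corrcoef(smooth[:-1], smooth[1:])[0, 1])   # close to +0.9
print(np.corrcoef(zigzag[:-1], zigzag[1:])[0, 1])   # close to -0.9
```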

ARMA Models

19. ARMA combines:


A. Only AR
B. Only MA
C. AR and MA
D. Trend and seasonality
Answer: C

20. ARMA is preferred due to:


A. Complexity
B. Parsimony
C. Inaccuracy
D. Irregularity
Answer: B

21. ARMA(p, q) has:


A. p moving average terms
B. q autoregressive terms
C. p AR and q MA terms
D. None of the above
Answer: C

22. ARMA models are stationary and invertible when:


A. All roots lie inside unit circle
B. Roots lie outside unit circle
C. No roots exist
D. Roots are imaginary
Answer: B
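
The root conditions in question 22 can be checked numerically. A sketch assuming statsmodels, with arbitrary ARMA(1,1) coefficients; note that statsmodels writes both lag polynomials with the zero-lag coefficient first and the AR signs negated.

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

ar = np.array([1.0, -0.5])   # AR polynomial 1 - 0.5B  (alpha_1 = 0.5)
ma = np.array([1.0, 0.4])    # MA polynomial 1 + 0.4B  (beta_1 = 0.4)

process = ArmaProcess(ar, ma)

# Stationarity and invertibility both require the polynomial roots
# to lie outside the unit circle (Q22).
print(process.isstationary)      # True
print(process.isinvertible)      # True
print(np.abs(process.arroots))   # [2.0] -> outside the unit circle
print(np.abs(process.maroots))   # [2.5] -> outside the unit circle
```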

ARIMA Models

23. What does “I” in ARIMA stand for?


A. Integration
B. Independence
C. Invertibility
D. Intensity
Answer: A

24. Differencing helps to:


A. Make data non-linear
B. Remove seasonality
C. Stabilize variance
D. Make series stationary
Answer: D

25. ARIMA(p,d,q) becomes ARMA if:


A. d = 1
B. d = 2
C. d = 0
D. d > 0
Answer: C

26. ARIMA(0,1,0) represents:


A. Pure noise
B. Random walk
C. Seasonal trend
D. Invertible model
Answer: B

27. To achieve stationarity in ARIMA:


A. Add noise
B. Take moving average
C. Apply differencing
D. Increase variance
Answer: C
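
Questions 24-27 all revolve around differencing. A minimal sketch, assuming NumPy and statsmodels: an ARIMA(0,1,0) random walk is built as a cumulative sum of white noise, and one difference (d = 1) restores stationarity, which an augmented Dickey-Fuller test confirms.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(2)

# ARIMA(0,1,0) is a random walk (Q26): a cumulative sum of white noise.
walk = np.cumsum(rng.normal(size=1000))

# One first difference (d = 1, Q45) recovers the stationary increments (Q24, Q27).
diffed = np.diff(walk)

# ADF test p-values: large for the walk (non-stationary),
# tiny after differencing (stationary).
print(adfuller(walk)[1])
print(adfuller(diffed)[1])
```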

SARIMA Models

28. SARIMA adds which component to ARIMA?


A. Autocovariance
B. Spectral density
C. Seasonality
D. Residuals
Answer: C

29. Seasonal differencing involves:


A. Removing white noise
B. Differencing at seasonal lags
C. Integration of ARMA
D. Removing trend
Answer: B

30. In SARIMA(p,d,q)(P,D,Q)s, “s” stands for:


A. Stationarity
B. Season
C. Shift
D. Standard deviation
Answer: B

31. SARIMA models are ideal for:


A. Non-repetitive trends
B. Non-linear patterns
C. Data with seasonal cycles
D. Gaussian noise
Answer: C

32. SARIMA handles both:


A. Trend and noise
B. Linear and non-linear components
C. Seasonal and non-seasonal behavior
D. MA and AR terms only
Answer: C
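
Seasonal differencing (questions 29-30) just means differencing at the seasonal lag. A sketch with synthetic monthly data, assuming pandas and NumPy; the trend and cycle parameters are arbitrary.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)

# Synthetic monthly series: linear trend + 12-month cycle + noise.
n = 120
t = np.arange(n)
y = pd.Series(0.1 * t + 5 * np.sin(2 * np.pi * t / 12) + rng.normal(size=n))

# Seasonal differencing (the D and s in SARIMA(p,d,q)(P,D,Q)s, Q29-Q30):
# subtract the value from the same month one year earlier.
seasonal_diff = y.diff(12)

# An ordinary difference then removes the remaining trend, handling
# seasonal and non-seasonal behaviour together (Q32).
stationary = seasonal_diff.diff(1)

print(stationary.dropna().std())
```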

Applications & Behavior

33. Autocorrelations in MA(q) drop to zero after:


A. Lag 1
B. Lag q
C. Lag < q
D. Never
Answer: B

34. MA models allow:


A. Dependence on future values
B. Infinite lag
C. Past shock effects
D. Trend prediction
Answer: C

35. AR models allow:


A. Forecasting with no data
B. Only white noise modeling
C. Regressing on previous values
D. Perfect predictions
Answer: C

36. If α₁ = 0.9 in AR(1), the series is:


A. Very volatile
B. Zig-zag
C. Smooth with slow change
D. Random
Answer: C

37. If β₁ = 0.8 in MA(1), the invertibility condition:


A. Is violated
B. Is satisfied
C. Does not apply
D. Is unknown
Answer: B

38. In AR(2), ρ₁ and ρ₂ are calculated from:


A. White noise
B. Variance
C. Coefficients α₁ and α₂
D. Trends
Answer: C
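
For question 38, the Yule-Walker equations give ρ₁ and ρ₂ directly from α₁ and α₂. A minimal sketch, assuming NumPy and statsmodels, with arbitrary coefficients from the stationary region; the theoretical ACF from statsmodels confirms the hand computation.

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

alpha1, alpha2 = 0.5, 0.3   # arbitrary coefficients in the stationary region

# Yule-Walker for AR(2) (Q38):
rho1 = alpha1 / (1 - alpha2)
rho2 = alpha2 + alpha1 * rho1
print(rho1, rho2)   # 0.7142..., 0.6571...

# Cross-check against the theoretical ACF at lags 0, 1, 2.
theory = ArmaProcess(np.array([1.0, -alpha1, -alpha2]), np.array([1.0])).acf(lags=3)
print(theory)       # [1.0, rho1, rho2]
```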

Conceptual

39. Parsimony in modeling means:


A. Overfitting
B. Using minimal parameters
C. Ignoring outliers
D. Removing trends
Answer: B

40. The backward shift operator is used in:


A. Regression
B. ARIMA models
C. MA models
D. WMA
Answer: B

41. ARIMA assumes:


A. No structure
B. Nonlinear dependence
C. Linearity
D. Seasonality
Answer: C

42. A non-invertible process can’t:


A. Predict future
B. Be stationary
C. Be graphed
D. Use residuals
Answer: A

43. Random walk is a special case of:


A. ARIMA
B. MA
C. WMA
D. ARMA
Answer: A

44. Which model best handles monthly sales data with trend and seasonality?


A. ARIMA
B. SARIMA
C. MA
D. AR
Answer: B

45. One difference applied to ARIMA means:


A. d = 1
B. q = 1
C. p = 1
D. Differencing fails
Answer: A

46. ARIMA model is best for:


A. Short-term seasonal forecast
B. Long-term periodic forecast
C. Short-term stationary series
D. Forecasting exact values
Answer: C
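
The backward shift operator in question 40 is worth seeing in code: B maps X_t to X_{t-1}, so first differencing is (1 - B)X_t = X_t - X_{t-1} (question 45). A minimal NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.cumsum(rng.normal(size=10))   # a short random walk

# B X_t is just the series shifted back one step; (1 - B)X_t is the
# first difference used by ARIMA (Q40, Q45).
Bx = x[:-1]                    # B X_t, aligned with X_t = x[1:]
diff_by_hand = x[1:] - Bx
print(np.allclose(diff_by_hand, np.diff(x)))   # True
```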

Miscellaneous / Higher Order Thinking

47. What happens if roots lie inside unit circle in AR process?


A. Model is invertible
B. Model is stationary
C. Model is non-stationary
D. All are true
Answer: C

48. In MA(2), how many autocorrelations are non-zero?


A. Infinite
B. 0
C. 2
D. Depends on β
Answer: C

49. In AR(1), which of the following affects the smoothness of the series?


A. Mean
B. Variance
C. α₁
D. β₀
Answer: C

50. Which of these statements is true?


A. WMA uses only equal weights
B. ARIMA can handle non-seasonal data only
C. SARIMA is a generalized ARIMA
D. AR(1) = MA(1)
Answer: C
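
Finally, question 47's scenario can be simulated directly: when the root of the AR characteristic equation falls inside the unit circle, the series explodes. A minimal sketch, assuming NumPy; alpha = 1.05 is an arbitrary explosive example.

```python
import numpy as np

def simulate_ar1(alpha, n=200, seed=5):
    """Simulate X_t = alpha * X_(t-1) + Z_t with N(0, 1) white noise."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = alpha * x[t - 1] + rng.normal()
    return x

# 1 - 1.05B = 0 has its root at B ~ 0.95, inside the unit circle,
# so the process is explosive and non-stationary (Q47).
explosive = simulate_ar1(1.05)
stable = simulate_ar1(0.5)     # root at B = 2.0: stationary

print(abs(explosive[-1]))   # grows without bound as n increases
print(abs(stable[-1]))      # stays moderate
```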
