Last modified: September 16, 2024


Autoregressive (AR) Models in Time Series Analysis

Autoregressive (AR) models are fundamental tools in time series analysis, used to describe and forecast time-dependent data. An AR model predicts future values based on a linear combination of past observations. The order of an AR model, denoted $p$, indicates how many lagged values are used.

Mathematical Definition of AR Models

An autoregressive model of order $p$, denoted AR($p$), is defined by the equation:

$$X_t = c + \sum_{i=1}^{p} \phi_i X_{t-i} + \epsilon_t$$

where:

- $X_t$ is the value of the series at time $t$,
- $c$ is a constant (intercept) term,
- $\phi_1, \dots, \phi_p$ are the autoregressive coefficients,
- $\epsilon_t$ is white noise with mean zero and variance $\sigma^2$.
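To make the recursion concrete, an AR($p$) series can be simulated directly from this equation. A minimal sketch in plain Python (the helper function and coefficient values are illustrative, not from the original article):

```python
import random

def simulate_ar(phis, c, n, seed=42):
    """Simulate X_t = c + sum_i phi_i * X_{t-i} + eps_t with Gaussian noise."""
    random.seed(seed)
    p = len(phis)
    x = [0.0] * p  # crude burn-in: start the recursion from zeros
    for _ in range(n):
        eps = random.gauss(0.0, 1.0)
        x.append(c + sum(phi * x[-i - 1] for i, phi in enumerate(phis)) + eps)
    return x[p:]  # drop the artificial starting values

series = simulate_ar(phis=[0.6, -0.2], c=3.0, n=500)
# for a stationary AR(p), the long-run mean is c / (1 - sum(phis)) = 5 here
print(len(series), sum(series) / len(series))
```

Dropping the first $p$ artificial zeros is only a crude substitute for a proper burn-in; real simulations usually discard a few hundred initial draws.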

Properties of AR Models

- Stationarity: an AR($p$) process is stationary when the roots of $1 - \phi_1 z - \dots - \phi_p z^p = 0$ lie outside the unit circle.
- The autocorrelation function (ACF) decays gradually, either exponentially or in damped oscillations.
- The partial autocorrelation function (PACF) cuts off after lag $p$, which helps identify the order.

Estimation of Parameters

Parameters $c$ and $\phi_i$ can be estimated using methods such as:

- ordinary least squares (OLS), regressing $X_t$ on its lags,
- the Yule-Walker equations (method of moments),
- maximum likelihood estimation (MLE).
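The least-squares approach amounts to regressing $X_t$ on an intercept and its first $p$ lags. A sketch, assuming NumPy is available (the helper and the synthetic series are illustrative):

```python
import numpy as np

def fit_ar_ols(x, p):
    """OLS estimates of c and phi_1..phi_p: regress X_t on [1, X_{t-1}, ..., X_{t-p}]."""
    x = np.asarray(x, dtype=float)
    y = x[p:]
    lags = [x[p - i:-i] for i in range(1, p + 1)]  # column i holds X_{t-i}
    X = np.column_stack([np.ones(len(y))] + lags)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[0], coef[1:]  # intercept c, array of phi_i

# sanity check on a synthetic AR(2) series with known parameters
rng = np.random.default_rng(0)
x = np.zeros(5000)
for t in range(2, 5000):
    x[t] = 3.0 + 0.6 * x[t - 1] - 0.2 * x[t - 2] + rng.standard_normal()
c_hat, phi_hat = fit_ar_ols(x, p=2)
print(c_hat, phi_hat)  # estimates should be close to 3, [0.6, -0.2]
```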

Model Selection

Selecting the appropriate order $p$ is crucial. Common criteria include the Akaike information criterion (AIC) and the Bayesian information criterion (BIC):

$$\text{AIC} = -2 \ln(L) + 2k$$

$$\text{BIC} = -2 \ln(L) + k \ln(n)$$

where:

- $L$ is the maximized likelihood of the model,
- $k$ is the number of estimated parameters,
- $n$ is the number of observations.

Lower values of AIC or BIC indicate a better model, balancing goodness of fit and model complexity.
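Both criteria are simple to evaluate once a model's maximized log-likelihood is known. A small illustration (the log-likelihood values are made up for the example):

```python
import math

def aic(log_lik, k):
    """Akaike information criterion: -2 ln(L) + 2k."""
    return -2.0 * log_lik + 2.0 * k

def bic(log_lik, k, n):
    """Bayesian information criterion: -2 ln(L) + k ln(n)."""
    return -2.0 * log_lik + k * math.log(n)

# two hypothetical models fitted to n = 200 observations
print(aic(-150.0, k=3))         # 306.0
print(aic(-149.5, k=4))         # 307.0 -> the extra parameter is not worth it
print(bic(-150.0, k=3, n=200))  # ~315.9 (BIC penalizes parameters more heavily)
```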


Example: AR(2) Model in Time Series Forecasting

Consider an AR model of order 2, denoted as AR(2). This model assumes that the value at a particular time point is a linear function of the two preceding values, plus some error term.

AR(2) Model Equation

The AR(2) model is given by:

$$X_t = c + \phi_1 X_{t-1} + \phi_2 X_{t-2} + \epsilon_t$$

Suppose we have:

- $c = 3$,
- $\phi_1 = 0.6$,
- $\phi_2 = -0.2$,
- $X_{t-1} = 10$ and $X_{t-2} = 5$.

Calculation

Plugging the values into the AR(2) equation:

$$
\begin{aligned}
X_t &= 3 + 0.6 \times 10 - 0.2 \times 5 + \epsilon_t \\
    &= 3 + 6 - 1 + \epsilon_t \\
    &= 8 + \epsilon_t
\end{aligned}
$$
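The same arithmetic, checked in Python with the values from the example:

```python
c, phi1, phi2 = 3.0, 0.6, -0.2
x_lag1, x_lag2 = 10.0, 5.0  # X_{t-1} and X_{t-2}

# conditional expectation of X_t given the two previous observations
x_pred = c + phi1 * x_lag1 + phi2 * x_lag2
print(x_pred)  # 8.0
```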

Interpretation

The expected value of $X_t$, given the two previous observations, is $8$; the noise term $\epsilon_t$ captures the random deviation of the actual observation from this prediction.

Visualization

*Figure: ar2_model (an AR(2) model fit overlaid on the time series).*

This visualization illustrates how the AR(2) model captures the underlying pattern of the time series. The model uses the two most recent values to make predictions, adjusting to the trends and fluctuations in the data.

Autoregressive Moving Average (ARMA) Models

ARMA models combine autoregressive (AR) and moving average (MA) components, capturing serial dependence on both past observations and past shocks (forecast errors).

Mathematical Definition of ARMA Models

An ARMA($p, q$) model is defined by:

$$X_t = c + \sum_{i=1}^{p} \phi_i X_{t-i} + \epsilon_t + \sum_{j=1}^{q} \theta_j \epsilon_{t-j}$$

or, equivalently, using the backshift operator $B$:

$$\phi(B) X_t = c + \theta(B) \epsilon_t$$

where:

- $B$ is the backshift operator, $B X_t = X_{t-1}$,
- $\phi(B) = 1 - \phi_1 B - \dots - \phi_p B^p$ is the AR polynomial,
- $\theta(B) = 1 + \theta_1 B + \dots + \theta_q B^q$ is the MA polynomial,
- $\epsilon_t$ is white noise.

Key Concepts

Stationarity of AR Processes

An AR($p$) process is stationary if all the roots of the characteristic polynomial $\phi(z) = 1 - \phi_1 z - \dots - \phi_p z^p = 0$ lie outside the unit circle in the complex plane. This condition ensures that the time series has a constant mean and variance over time.
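This condition is easy to verify numerically. For the AR(2) example with $\phi_1 = 0.6$ and $\phi_2 = -0.2$, the characteristic polynomial is $1 - 0.6z + 0.2z^2$; a sketch assuming NumPy:

```python
import numpy as np

# phi(z) = 1 - 0.6 z + 0.2 z^2; numpy.roots expects the highest degree first
roots = np.roots([0.2, -0.6, 1.0])
print(np.abs(roots))  # both moduli equal sqrt(5) ~ 2.236
print(bool(np.all(np.abs(roots) > 1)))  # True -> the AR(2) process is stationary
```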

Invertibility of MA Processes

An MA($q$) process is invertible if all the roots of $\theta(z) = 0$ lie outside the unit circle. Invertibility allows the MA process to be expressed as an infinite AR process, ensuring a unique representation and facilitating parameter estimation.

Infinite Order Representations

An MA process can be expressed as an infinite-order AR process:

$$X_t = \sum_{k=1}^{\infty} \pi_k X_{t-k} + \epsilon_t$$

An AR process can be expressed as an infinite-order MA process:

$$X_t = \sum_{k=0}^{\infty} \psi_k \epsilon_{t-k}$$

Example: ARMA(1,1) Process

Consider the ARMA(1,1) model:

$$X_t = \phi X_{t-1} + \epsilon_t + \theta \epsilon_{t-1}$$

Let $\phi = 0.7$, $\theta = 0.2$, and let $\epsilon_t$ be white noise.

Simulation

To analyze this process, we simulate a large number of observations using statistical software (e.g., R or Python) to approximate its properties.

```r
set.seed(500)
data <- arima.sim(n = 1e6, list(ar = 0.7, ma = 0.2))
```
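A rough Python equivalent, written directly from the ARMA(1,1) recursion rather than with a packaged simulator (assuming NumPy; the sample size is kept smaller than in the R call for speed):

```python
import numpy as np

rng = np.random.default_rng(500)
n = 100_000
eps = rng.standard_normal(n + 1)
x = np.zeros(n + 1)
for t in range(1, n + 1):
    # X_t = 0.7 X_{t-1} + eps_t + 0.2 eps_{t-1}
    x[t] = 0.7 * x[t - 1] + eps[t] + 0.2 * eps[t - 1]
x = x[1:]

def sample_acf(series, k):
    """Lag-k sample autocorrelation."""
    d = series - series.mean()
    return float(np.dot(d[:-k], d[k:]) / np.dot(d, d))

print([round(sample_acf(x, k), 2) for k in (1, 2, 3)])  # near 0.78, 0.54, 0.38
```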

Converting ARMA to Infinite Order Processes

$$
\begin{aligned}
(1 - \phi B) X_t &= (1 + \theta B) \epsilon_t \\
X_t &= (1 - \phi B)^{-1} (1 + \theta B) \epsilon_t \\
X_t &= \left[ 1 + \phi B + \phi^2 B^2 + \dots \right] (1 + \theta B) \epsilon_t
\end{aligned}
$$

Multiplying the series:

$$X_t = \left[ 1 + (\phi + \theta) B + (\phi^2 + \phi \theta) B^2 + \dots \right] \epsilon_t$$

$$X_t = \frac{1 + \theta B}{1 - \phi B} \epsilon_t = \left[ 1 + \psi_1 B + \psi_2 B^2 + \dots \right] \epsilon_t$$

Matching coefficients of $B^k$ gives the $\psi$ weights (with $\psi_0 = 1$):

$$\psi_k = \phi^k + \theta \phi^{k-1} = \phi^{k-1}(\phi + \theta), \qquad k \geq 1$$
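A quick numerical check of the first few weights for $\phi = 0.7$, $\theta = 0.2$:

```python
phi, theta = 0.7, 0.2

def psi(k):
    """MA(infinity) weight of an ARMA(1,1): psi_k = phi^k + theta * phi^(k-1)."""
    return phi ** k + theta * phi ** (k - 1)

print([round(psi(k), 3) for k in (1, 2, 3)])  # [0.9, 0.63, 0.441]
```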

Theoretical Autocorrelations

The autocorrelation function (ACF) of an ARMA(1,1) process is:

$$\rho_1 = \frac{(\phi + \theta)(1 + \phi \theta)}{1 + 2 \phi \theta + \theta^2}, \qquad \rho_k = \phi \, \rho_{k-1} \quad \text{for } k \geq 2$$

Calculations:

$$
\begin{aligned}
\rho_1 &= \frac{(0.7 + 0.2)(1 + 0.7 \times 0.2)}{1 + 2 \times 0.7 \times 0.2 + 0.2^2} = \frac{0.9 \times 1.14}{1.32} \approx 0.777 \\
\rho_2 &= 0.7 \times \rho_1 \approx 0.544 \\
\rho_3 &= 0.7 \times \rho_2 \approx 0.381
\end{aligned}
$$
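The same calculation in Python, using $\rho_1 = (\phi + \theta)(1 + \phi\theta)/(1 + 2\phi\theta + \theta^2)$ and $\rho_k = \phi\,\rho_{k-1}$:

```python
phi, theta = 0.7, 0.2

rho1 = (phi + theta) * (1 + phi * theta) / (1 + 2 * phi * theta + theta ** 2)
rho2 = phi * rho1
rho3 = phi * rho2
print(round(rho1, 3), round(rho2, 3), round(rho3, 3))  # 0.777 0.544 0.381
```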

Results and Interpretation

For a long simulated realization, the sample autocorrelations at lags 1, 2, and 3 come out close to the theoretical values 0.777, 0.544, and 0.381. Beyond lag 1 the ACF decays geometrically at rate $\phi = 0.7$, which is the characteristic signature of an ARMA(1,1) process.


Autoregressive Integrated Moving Average (ARIMA) Models

ARIMA models generalize ARMA models to include differencing, allowing them to model non-stationary time series data.

Mathematical Definition of ARIMA Models

An ARIMA($p, d, q$) model is defined by:

$$\phi(B) (1 - B)^d X_t = c + \theta(B) \epsilon_t$$

where:

- $d$ is the order of differencing,
- $(1 - B)^d$ applies $d$-fold differencing to $X_t$,
- $\phi(B)$ and $\theta(B)$ are the AR and MA polynomials, as in the ARMA model.

Determining Differencing Order

- Inspect the series for trends; a sample ACF that decays very slowly suggests non-stationarity.
- Apply unit-root tests such as the augmented Dickey-Fuller (ADF) or KPSS test.
- Difference the series and re-test; in practice $d$ is usually 0, 1, or 2.

Fitting ARIMA Models: Numerical Example

Suppose we have a time series $X_t$ exhibiting an upward trend.

Step 1: Differencing

First-order differencing is applied to achieve stationarity:

$$Y_t = (1 - B) X_t = X_t - X_{t-1}$$
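For instance, differencing a short toy series with a roughly linear upward trend (values invented for illustration) leaves a series without the trend:

```python
# first-order differencing: Y_t = X_t - X_{t-1}
x = [2.0, 5.0, 9.0, 14.0, 20.0]  # trending upward
y = [x[t] - x[t - 1] for t in range(1, len(x))]
print(y)  # [3.0, 4.0, 5.0, 6.0]
```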

Step 2: Model Identification

Analyzing the differenced series $Y_t$, examine its sample ACF and PACF.

Assume the ACF suggests an MA(1) component and the PACF suggests an AR(1) component, pointing to an ARMA(1,1) structure for $Y_t$.

Step 3: Parameter Estimation

Fit an ARIMA(1,1,1) model:

$$(1 - \phi B)(1 - B) X_t = c + (1 + \theta B) \epsilon_t$$

Estimate $\phi$, $\theta$, and $c$ using maximum likelihood estimation (MLE).

Step 4: Model Diagnostics

- Check that the residuals behave like white noise (no significant autocorrelation in the residual ACF).
- Apply a portmanteau test such as the Ljung-Box test to the residuals.
- If the diagnostics fail, revisit the choice of $p$, $d$, and $q$.

Step 5: Forecasting

Use the fitted model to forecast future values:

$$\hat{X}_{t+h} = c + \phi \hat{X}_{t+h-1} + \theta \hat{\epsilon}_{t+h-1}$$

where future innovations are replaced by their conditional expectation of zero, so the MA term contributes only at horizon $h = 1$.
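Iterating this recursion is straightforward. A sketch with illustrative fitted values $\hat{\phi} = 0.5$, $\hat{\theta} = 0.3$, $c = 0.2$ (not estimated from real data):

```python
def arma_forecast(c, phi, theta, x_last, eps_last, h):
    """h-step-ahead forecasts from an ARMA(1,1) fit."""
    preds = []
    x, e = x_last, eps_last
    for _ in range(h):
        x = c + phi * x + theta * e
        e = 0.0  # future innovations are forecast as zero
        preds.append(x)
    return preds

print(arma_forecast(c=0.2, phi=0.5, theta=0.3, x_last=1.0, eps_last=0.4, h=3))
```

At long horizons the forecasts converge geometrically to the unconditional mean $c / (1 - \phi)$.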

Limitations of AR and ARIMA Models

- They assume linear relationships between past and future values, so nonlinear dynamics are not captured.
- AR and ARMA models require stationarity, and ARIMA handles only the non-stationarity that differencing can remove.
- They do not model time-varying variance (volatility); GARCH-type models address this.
- Forecast accuracy degrades at long horizons, with predictions converging to the series mean.
- They include no exogenous explanatory variables, although extensions such as ARIMAX do.
