# Technical Note: Autoregressive Model

We originally composed these technical notes after sitting in on a time series analysis class. Over the years, we’ve maintained them and added new insights, empirical observations and intuitions acquired along the way. We often return to these notes to resolve development issues and/or to properly address product support matters.

In this paper, we’ll go over another simple, yet fundamental, econometric model: the auto-regressive model. Make sure you have looked over our prior paper on the moving average model, as we build on many of the concepts presented in that paper.

This model serves as a cornerstone for any serious application of ARMA/ARIMA models.

## Background

The auto-regressive model of order p (i.e. AR(p)) is defined as follows:

$$x_t = \phi_0 + \phi_1 x_{t-1} + \phi_2 x_{t-2} + \cdots + \phi_p x_{t-p} + a_t$$

Where

- $a_t$ is the innovation (shock) term for our process, with $a_t \sim \text{i.i.d.}\; N(0, \sigma^2)$
- $\sigma$ is the conditional standard deviation (aka volatility)

Essentially, the AR(p) is merely a multiple linear regression model where the independent (explanatory) variables are the lagged values of the output (i.e. $x_{t-1}, x_{t-2}, \ldots, x_{t-p}$). Keep in mind that these lagged values may be highly correlated with each other.
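The definition above translates directly into a simulation. The sketch below assumes Gaussian innovations; the parameters `phi0=0.5`, `phi=[0.6, -0.2]` are hypothetical values chosen only so that the process is stationary:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_ar(phi0, phi, sigma, n, burn=500):
    """Simulate x_t = phi0 + phi[0]*x_{t-1} + ... + phi[p-1]*x_{t-p} + a_t."""
    p = len(phi)
    x = np.zeros(n + burn)
    a = rng.normal(0.0, sigma, n + burn)          # innovations a_t ~ N(0, sigma^2)
    for t in range(p, n + burn):
        x[t] = phi0 + sum(phi[j] * x[t - 1 - j] for j in range(p)) + a[t]
    return x[burn:]                               # drop the burn-in transient

x = simulate_ar(phi0=0.5, phi=[0.6, -0.2], sigma=1.0, n=20_000)
```

The sample mean of `x` should settle near the long-run mean `phi0 / (1 - sum(phi))`, i.e. roughly 0.83 for these made-up parameters.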

## Why do we need another model?

First, we can think of an AR model as a special (i.e. restricted) representation of an MA(∞) process. Let’s consider the following stationary AR(1) process:

$$x_t = \phi_0 + \phi_1 x_{t-1} + a_t, \qquad |\phi_1| < 1$$

Now, by subtracting the long-run mean $\mu = \phi_0/(1-\phi_1)$ from the response variable (i.e. $z_t = x_t - \mu$), the process now has zero long-run (unconditional/marginal) mean:

$$z_t = \phi_1 z_{t-1} + a_t$$

Next, the process can be further simplified by substituting recursively for the lagged terms:

$$z_t = a_t + \phi_1 a_{t-1} + \phi_1^2 a_{t-2} + \cdots = \sum_{k=0}^{\infty} \phi_1^k a_{t-k}$$

For a stationary process, $|\phi_1| < 1$, so the MA coefficients decay geometrically to zero.

In sum, using the AR(1) model, we are able to represent this infinite-order MA process with a much smaller storage requirement: two coefficients instead of an infinite sequence of weights.
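The equivalence is easy to verify numerically: feeding the same shock sequence through the recursive AR(1) form and through a truncated version of the MA(∞) weights produces the same path. The coefficient `phi1 = 0.7` below is a hypothetical choice:

```python
import numpy as np

phi1 = 0.7                         # hypothetical AR(1) coefficient, |phi1| < 1
psi = phi1 ** np.arange(200)       # MA(inf) weights psi_k = phi1^k, truncated

rng = np.random.default_rng(0)
a = rng.normal(size=5_000)         # one shared shock sequence

# Recursive AR(1) form, started from a zero initial condition
z_ar = np.zeros_like(a)
z_ar[0] = a[0]
for t in range(1, len(a)):
    z_ar[t] = phi1 * z_ar[t - 1] + a[t]

# Truncated MA(inf) form: convolve the shocks with the geometric weights
z_ma = np.convolve(a, psi)[: len(a)]

print(np.allclose(z_ar, z_ma))     # prints True
```

Note the storage argument in action: the AR(1) form carries one coefficient, while the truncated MA form needs 200 weights to reach the same numerical accuracy.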

We can generalize the procedure to a stationary AR(p) model and, assuming an MA(∞) representation exists, the MA coefficients’ values are solely determined by the AR coefficient values:

$$z_t = \phi_1 z_{t-1} + \cdots + \phi_p z_{t-p} + a_t \quad\Longrightarrow\quad z_t = \sum_{k=0}^{\infty} \psi_k a_{t-k}$$

Once again, by design, the long-run mean of the revised (mean-subtracted) model is zero.

Hence, using the lag operator $L$, the process can be represented as follows:

$$\left(1 - \phi_1 L - \phi_2 L^2 - \cdots - \phi_p L^p\right) z_t = a_t$$

By factoring the characteristic polynomial into its roots $\lambda_i$ (i.e. $\phi(L) = \prod_{i=1}^{p}(1 - L/\lambda_i)$), we can use the partial-fraction decomposition and the geometric series representation; we then construct the algebraic equivalent of the MA(∞) representation.

Hint: By now, this formulation should look familiar from the MA technical note, where we inverted a finite-order MA process into an equivalent AR(∞) representation.

The key point is being able to convert a stationary, finite-order AR process into an algebraically equivalent MA(∞) representation. This property is referred to as **causality.**

## Causality

Definition: A linear process $\{x_t\}$ is causal (strictly, a causal function of $\{a_t\}$) if there is an equivalent MA(∞) representation:

$$x_t = \sum_{k=0}^{\infty} \psi_k a_{t-k}$$

Where

$$\sum_{k=0}^{\infty} |\psi_k| < \infty$$

Causality is a property of both $\{x_t\}$ and $\{a_t\}$.

In plain words, the value of $x_t$ is solely dependent on the current and past values of $a_t$.

**IMPORTANT:** An AR(p) process is **causal** (with respect to $\{a_t\}$) if and only if the characteristic roots (i.e. $\lambda_i$) fall outside the unit circle (i.e. $|\lambda_i| > 1$).
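This condition is simple to check numerically: form the characteristic polynomial $1 - \phi_1 z - \cdots - \phi_p z^p$ and test that every root lies outside the unit circle. A minimal sketch (the `is_causal` helper and the example coefficients are our own illustration):

```python
import numpy as np

def is_causal(phi):
    """True if all roots of 1 - phi[0]*z - ... - phi[p-1]*z^p lie outside
    the unit circle (the causality condition for an AR(p))."""
    poly = np.r_[[-c for c in phi[::-1]], 1.0]   # highest-degree coefficient first
    return bool(np.all(np.abs(np.roots(poly)) > 1.0))

print(is_causal([0.6, -0.2]))   # True: both roots lie outside the unit circle
print(is_causal([2.0]))         # False: the root z = 0.5 lies inside it
```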

Let’s consider the following example: an AR(1) whose coefficient exceeds one in magnitude (i.e. $|\phi_1| > 1$, so its characteristic root $1/\phi_1$ lies inside the unit circle):

$$x_t = \phi_1 x_{t-1} + a_t$$

Now, let’s re-organize the terms in this model, solving for the earlier value in terms of the later one:

$$x_t = \phi_1^{-1} x_{t+1} - \phi_1^{-1} a_{t+1}$$

Convert the new AR process into an MA(∞) by iterating the substitution forward in time:

$$x_t = -\sum_{k=1}^{\infty} \phi_1^{-k} a_{t+k}$$

The process above is non-causal, as its values depend on future values of the innovations. However, it is also stationary, since $|\phi_1^{-1}| < 1$ makes the MA coefficients absolutely summable.

Going forward, for an AR (and ARMA) process, stationarity is not sufficient by itself; the process must be causal as well. For all our future discussions and applications, we shall only consider stationary causal processes.

## Stability

Similar to what we did in the moving average model paper, we will now examine the long-run marginal (unconditional) mean and variance.

(1) Let’s assume the long-run mean ($\mu$) exists, and:

$$x_t = \phi_0 + \phi_1 x_{t-1} + \cdots + \phi_p x_{t-p} + a_t$$

Now, subtract the long-run mean from all output variables (i.e. $z_t = x_t - \mu$):

$$z_t = \phi_1 z_{t-1} + \cdots + \phi_p z_{t-p} + a_t$$

Take the expectation of both sides of the original equation:

$$\mu = \phi_0 + \mu \sum_{i=1}^{p} \phi_i \quad\Longrightarrow\quad \mu = \frac{\phi_0}{1 - \sum_{i=1}^{p} \phi_i}$$

In sum, for the long-run mean to exist, the sum of the AR coefficient values can’t be equal to one.
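The closed form for the long-run mean reduces to a one-line fixed-point check; the coefficients below are hypothetical:

```python
phi0, phi = 0.5, [0.6, -0.2]   # hypothetical AR(2) parameters

mu = phi0 / (1 - sum(phi))     # long-run mean, valid only when sum(phi) != 1

# mu must satisfy the expectation equation mu = phi0 + mu * sum(phi)
assert abs(mu - (phi0 + mu * sum(phi))) < 1e-12
print(round(mu, 4))            # prints 0.8333
```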

(2) To examine the long-run variance of an AR process, we’ll use the equivalent MA(∞) representation and examine its long-run variance.

Using partial-fraction decomposition (assuming distinct characteristic roots):

$$\frac{1}{\phi(L)} = \sum_{i=1}^{p} \frac{c_i}{1 - L/\lambda_i}$$

For a stable MA representation, all characteristic roots (i.e. $\lambda_i$) must fall outside the unit circle (i.e. $|\lambda_i| > 1$):

$$\frac{1}{1 - L/\lambda_i} = \sum_{k=0}^{\infty} \lambda_i^{-k} L^k$$

**Note:** It can be easily shown that the MA coefficients are:

$$\psi_k = \sum_{i=1}^{p} c_i \lambda_i^{-k}$$

Next, let’s examine the convergence property of the MA representation:

$$\sum_{k=0}^{\infty} \psi_k^2 < \infty \quad\Longrightarrow\quad \operatorname{Var}(x_t) = \sigma^2 \sum_{k=0}^{\infty} \psi_k^2 < \infty$$

Finally, the long-run variance of an infinite-order MA process exists if the sum of its squared coefficients is finite.

Furthermore, for the AR(p) process to be causal, the sum of the absolute coefficient values (i.e. $\sum_{k=0}^{\infty} |\psi_k|$) is finite as well.

### Example: AR(1)

Assuming all characteristic roots ($\lambda_i$) fall outside the unit circle, the AR(p) process can be viewed as a weighted sum of p stable MA processes (one geometric series per root), so a finite long-run variance must exist. For the AR(1) case, this reduces to a single geometric series:

$$\operatorname{Var}(x_t) = \sigma^2 \sum_{k=0}^{\infty} \phi_1^{2k} = \frac{\sigma^2}{1 - \phi_1^2}$$
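The AR(1) long-run variance can be checked by summing the squared MA weights directly; `phi1` and `sigma` below are hypothetical:

```python
import numpy as np

phi1, sigma = 0.7, 1.5                       # hypothetical AR(1) parameters

psi = phi1 ** np.arange(500)                 # truncated MA(inf) weights phi1^k
var_ma = sigma ** 2 * np.sum(psi ** 2)       # sigma^2 * sum_k psi_k^2

var_closed = sigma ** 2 / (1 - phi1 ** 2)    # geometric-series closed form

print(np.isclose(var_ma, var_closed))        # prints True
```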

## Impulse Response Function

Earlier, we used the AR(p) characteristic roots and partial-fraction decomposition to derive the equivalent infinite-order moving average representation. Alternatively, we can compute the impulse response function (IRF) and find the MA coefficients’ values.

The impulse response function describes the model output triggered by a single unit shock at time t (i.e. $a_t = 1$ and all other shocks zero). For the AR(p) process, the IRF values follow the recursion:

$$\psi_0 = 1, \qquad \psi_k = \sum_{j=1}^{\min(k,\,p)} \phi_j\, \psi_{k-j}$$

The procedure above is relatively simple (computationally) to perform, and can be carried on to any arbitrary order (i.e. k).
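The recursion translates directly into code; a minimal sketch with hypothetical AR(2) coefficients:

```python
def irf(phi, k_max):
    """IRF of an AR(p): psi_0 = 1, psi_k = sum_j phi[j] * psi_{k-1-j}."""
    psi = [1.0]
    for k in range(1, k_max + 1):
        psi.append(sum(phi[j] * psi[k - 1 - j] for j in range(min(k, len(phi)))))
    return psi

psi = irf([0.6, -0.2], 4)   # psi_0 .. psi_4 for a hypothetical AR(2)
```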

**Note:** Recall the partial-fraction decomposition we did earlier:

$$\frac{1}{\phi(L)} = \sum_{i=1}^{p} \frac{c_i}{1 - L/\lambda_i}$$

We derived the values for the MA coefficients as follows:

$$\psi_k = \sum_{i=1}^{p} c_i \lambda_i^{-k}$$

In principle, the IRF values must match the MA coefficient values. So we can conclude:

- The sum of the partial-fraction weights (i.e. $c_i$) equals one (i.e. $\sum_{i=1}^{p} c_i = \psi_0 = 1$).
- The weighted sum of the reciprocal characteristic roots equals $\phi_1$ (i.e. $\sum_{i=1}^{p} c_i \lambda_i^{-1} = \psi_1 = \phi_1$).
- The weighted sum of the squared reciprocal characteristic roots equals $\psi_2$ (i.e. $\sum_{i=1}^{p} c_i \lambda_i^{-2} = \psi_2 = \phi_1^2 + \phi_2$).
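These identities can be confirmed numerically: compute the characteristic roots and partial-fraction weights, build the coefficients from the closed form, and compare against the direct IRF recursion. The AR(2) coefficients are hypothetical, and the weight formula assumes distinct roots:

```python
import numpy as np

phi = [1.1, -0.3]                    # hypothetical AR(2) coefficients
p = len(phi)

# Roots of the characteristic polynomial 1 - phi1*z - phi2*z^2
lam = np.roots([-phi[1], -phi[0], 1.0])          # here: 2.0 and 5/3

# Partial-fraction weights: c_i = prod_{j != i} 1 / (1 - lam_i / lam_j)
c = np.array([np.prod([1.0 / (1.0 - lam[i] / lam[j])
                       for j in range(p) if j != i]) for i in range(p)])

# Closed form psi_k = sum_i c_i * lam_i^(-k) ...
psi_closed = [float(np.real(np.sum(c * lam ** (-k)))) for k in range(6)]

# ... versus the direct IRF recursion psi_k = sum_j phi[j] * psi_{k-1-j}
psi_rec = [1.0]
for k in range(1, 6):
    psi_rec.append(sum(phi[j] * psi_rec[k - 1 - j] for j in range(min(k, p))))

print(np.allclose(psi_closed, psi_rec))      # prints True
print(np.isclose(np.sum(np.real(c)), 1.0))   # sum of weights equals psi_0 = 1
```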

## Forecasting

Given an input data sample $\{x_1, x_2, \ldots, x_T\}$, we can calculate the values of the autoregressive process for future (i.e. out-of-sample) values as follows:

$$\hat{x}_{T+1} = \phi_0 + \phi_1 x_T + \phi_2 x_{T-1} + \cdots + \phi_p x_{T-p+1}$$

For further steps, we substitute earlier forecasts for the as-yet unobserved values:

$$\hat{x}_{T+m} = \phi_0 + \sum_{j=1}^{p} \phi_j \hat{x}_{T+m-j}, \qquad \text{with } \hat{x}_{T+m-j} = x_{T+m-j} \text{ whenever } m \le j$$

We can carry this calculation to any number of steps we wish.

Next, for the forecast error: the m-step-ahead error is driven by the intervening shocks, so its variance grows with the MA coefficients:

$$\operatorname{Var}\!\left(x_{T+m} - \hat{x}_{T+m}\right) = \sigma^2 \sum_{k=0}^{m-1} \psi_k^2$$
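A minimal multi-step forecasting sketch; the sample tail and parameters below are made-up illustrative values:

```python
def forecast_ar(x, phi0, phi, steps):
    """Multi-step AR(p) forecast: substitute forecasts for unobserved values."""
    hist = list(x)                    # observed sample, most recent value last
    out = []
    for _ in range(steps):
        nxt = phi0 + sum(phi[j] * hist[-1 - j] for j in range(len(phi)))
        out.append(nxt)
        hist.append(nxt)              # each forecast feeds the next step
    return out

fc = forecast_ar([1.2, 0.9, 1.1], phi0=0.5, phi=[0.6, -0.2], steps=3)
```

With these numbers the first step is 0.5 + 0.6·1.1 − 0.2·0.9 = 0.98, and subsequent steps reuse earlier forecasts in place of data.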

## References

- Hamilton, J. D.; *Time Series Analysis*, Princeton University Press (1994), ISBN 0-691-04289-6
- Pollock, D. S. G.; *Handbook of Time Series Analysis, Signal Processing, and Dynamics*, Academic Press (1999), ISBN 0-12-560990-6
- Box, G. E. P., Jenkins, G. M. and Reinsel, G. C.; *Time Series Analysis: Forecasting and Control*, 4th edition, John Wiley & Sons (2008), ISBN 0-470-27284-8

Attachment | Size
---|---
TN 2 - AR.pdf | 263.22 KB