We start with some important results on stationarity

Weak stationarity is important because weakly stationary processes are much easier to estimate and compute with

Theorem: Let $X_t$ be weakly stationary with mean $\mu_X$ and autocovariance function $\gamma_X(h)$. If $\sum_{j=0}^{\infty} |\psi_j| < \infty$, then $Y_t = \sum_{j=0}^{\infty} \psi_j X_{t-j}$ is also weakly stationary with

  • $E[Y_t] = \mu_X \sum_{j=0}^{\infty} \psi_j$, and
  • $\gamma_Y(h) = \sum_{j=0}^{\infty} \sum_{k=0}^{\infty} \psi_j \psi_k \gamma_X(h + j - k)$.

This is a powerful tool that we will use
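
For example, if $X_t = \varepsilon_t$ is white noise with variance $\sigma^2$ (so $\gamma_X(0) = \sigma^2$ and $\gamma_X(h) = 0$ for $h \neq 0$), the theorem gives $E[Y_t] = 0$ and $\gamma_Y(h) = \sigma^2 \sum_{j=0}^{\infty} \psi_j \psi_{j+|h|}$, which is exactly the calculation behind the MA(q) and AR(p) results below.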

MA(q)

Definition: The MA(q) model is defined as $X_t = \mu + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \cdots + \theta_q \varepsilon_{t-q}$, where $\varepsilon_t$ is white noise with variance $\sigma^2$

In lag polynomial notation, we write $X_t = \mu + \theta(L) \varepsilon_t$, where $\theta(L) = 1 + \theta_1 L + \cdots + \theta_q L^q$

We can see that $X_t$ is weakly stationary, with $E[X_t] = \mu$ and

$\gamma(h) = \sigma^2 \sum_{j=0}^{q-|h|} \theta_j \theta_{j+|h|}$ (where $\theta_0 = 1$) if $|h| \le q$, and $\gamma(h) = 0$ otherwise

$\rho(h) = 0$ for $|h| > q$, i.e. the ACF is zero after $q$ lags
We can use this to determine the order $q$ from a graph of the ACF
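
As a quick numerical illustration of this cutoff, here is a sketch using numpy (the MA(2) coefficients are made up for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative MA(2): X_t = eps_t + 0.6 * eps_{t-1} + 0.3 * eps_{t-2}
theta = np.array([1.0, 0.6, 0.3])  # theta_0 = 1
n = 10_000
eps = rng.standard_normal(n + len(theta) - 1)
X = np.convolve(eps, theta, mode="valid")  # moving average of the white noise

def sample_acf(x, max_lag):
    """Sample autocorrelations rho_hat(0), ..., rho_hat(max_lag)."""
    x = x - x.mean()
    acov = np.array([np.dot(x[: len(x) - h], x[h:]) / len(x) for h in range(max_lag + 1)])
    return acov / acov[0]

print(np.round(sample_acf(X, 5), 3))
# Lags 1 and 2 are clearly nonzero; lags 3 and beyond are approximately zero,
# matching the MA(q) cutoff property (here q = 2)
```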

AR(p)

Definition: The AR(p) model is defined as $X_t = c + \phi_1 X_{t-1} + \cdots + \phi_p X_{t-p} + \varepsilon_t$, where $\varepsilon_t$ is white noise

Using lag polynomial notation, $\phi(L) X_t = c + \varepsilon_t$, where $\phi(L) = 1 - \phi_1 L - \cdots - \phi_p L^p$

Recall that the stability condition for AR(1) is $|\phi_1| < 1$
Our condition is a bit more complicated for general AR(p)

Definition: Let $X_t$ follow an AR(p) process. $X_t$ is called stable if all roots $z$ of $\phi(z) = 0$ satisfy $|z| > 1$

$\phi(z)$ has $p$ (possibly complex) roots by the Fundamental Theorem of Algebra

Theorem: A stable AR(p) process is weakly stationary

We can factorize $\phi(L)$ into $(1 - \lambda_1 L)(1 - \lambda_2 L) \cdots (1 - \lambda_p L)$ for suitable values $\lambda_i$ (the reciprocals of the roots of $\phi(z)$), which implies the stability condition holds when $|\lambda_i| < 1$ for all $i$
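
This condition is easy to check numerically; here is a minimal sketch using numpy (the AR(2) coefficients are illustrative):

```python
import numpy as np

def is_stable(phi):
    """Check stability of an AR(p) model X_t = c + phi_1 X_{t-1} + ... + phi_p X_{t-p} + eps_t.

    Stability: all roots z of phi(z) = 1 - phi_1 z - ... - phi_p z^p lie outside
    the unit circle (equivalently, all lambda_i = 1 / z_i lie inside it).
    """
    phi = np.asarray(phi, dtype=float)
    # Coefficients of phi(z) in descending powers: -phi_p, ..., -phi_1, 1
    poly = np.concatenate([-phi[::-1], [1.0]])
    roots = np.roots(poly)
    return bool(np.all(np.abs(roots) > 1.0))

print(is_stable([0.5, 0.3]))  # True: both roots lie outside the unit circle
print(is_stable([1.2, 0.1]))  # False: explosive AR(2)
```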

We get that the inverse lag polynomial is $\phi(L)^{-1} = \prod_{i=1}^{p} (1 - \lambda_i L)^{-1}$, where $(1 - \lambda_i L)^{-1} = \sum_{j=0}^{\infty} \lambda_i^j L^j$

We use our theorem from earlier, which applies to each factor since $\sum_{j=0}^{\infty} |\lambda_i|^j = \frac{1}{1 - |\lambda_i|} < \infty$ if $|\lambda_i| < 1$

We apply $\phi(L)^{-1}$ to both sides of $\phi(L) X_t = c + \varepsilon_t$ to get $X_t = \phi(L)^{-1} c + (1 - \lambda_1 L)^{-1} \cdots (1 - \lambda_p L)^{-1} \varepsilon_t$

Each application of $(1 - \lambda_i L)^{-1}$ on a weakly stationary process results in a weakly stationary process (starting from the white noise $\varepsilon_t$), so we conclude that $X_t$ is weakly stationary, proving our theorem
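
As a concrete worked special case, take the AR(1) model $X_t = c + \phi_1 X_{t-1} + \varepsilon_t$ with $|\phi_1| < 1$: then $\phi(L) = 1 - \phi_1 L$ and $X_t = (1 - \phi_1 L)^{-1}(c + \varepsilon_t) = \frac{c}{1 - \phi_1} + \sum_{j=0}^{\infty} \phi_1^j \varepsilon_{t-j}$, an MA($\infty$) representation with absolutely summable coefficients.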

It can be shown that we can decompose $X_t$ into $X_t = \phi(1)^{-1} c + \sum_{j=0}^{\infty} \psi_j \varepsilon_{t-j}$, where the $\psi_j$ come from expanding $\phi(L)^{-1}$

This implies $E[X_t] = \frac{c}{1 - \phi_1 - \cdots - \phi_p}$ for all $t$ and $\gamma(h) = \sigma^2 \sum_{j=0}^{\infty} \psi_j \psi_{j+|h|}$ for all $h$

We can derive recursive definitions for the auto-covariances, $\gamma(h) = \phi_1 \gamma(h-1) + \phi_2 \gamma(h-2) + \cdots + \phi_p \gamma(h-p)$
for $h \ge 1$ (using $\gamma(-h) = \gamma(h)$)

However this still requires solving the initial system for the first auto-covariances $\gamma(0), \ldots, \gamma(p)$, which we call the Yule-Walker equations
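
For example, in the AR(1) case this system is just $\gamma(0) = \phi_1 \gamma(1) + \sigma^2$ and $\gamma(1) = \phi_1 \gamma(0)$, giving $\gamma(0) = \frac{\sigma^2}{1 - \phi_1^2}$, and the recursion then yields $\rho(h) = \phi_1^{|h|}$.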

We can calculate estimates for $\phi_1, \ldots, \phi_p$ if we have sample autocorrelations $\hat{\rho}(h)$ for $h = 1, \ldots, p$

Or we can solve for the auto-correlations $\rho(1), \ldots, \rho(p)$ given the process parameters $\phi_1, \ldots, \phi_p$
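
A minimal sketch of the estimation direction (solving the Yule-Walker system built from sample autocorrelations), using numpy; the AR(2) used for the check is illustrative:

```python
import numpy as np

def yule_walker(x, p):
    """Estimate AR(p) coefficients and the innovation variance from a series x
    by solving the Yule-Walker equations R phi = r, where R[i, j] = rho_hat(|i - j|)
    and r[i] = rho_hat(i + 1)."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    acov = np.array([np.dot(x[: len(x) - h], x[h:]) / len(x) for h in range(p + 1)])
    rho = acov / acov[0]
    R = np.array([[rho[abs(i - j)] for j in range(p)] for i in range(p)])
    phi_hat = np.linalg.solve(R, rho[1:])
    sigma2_hat = acov[0] * (1.0 - phi_hat @ rho[1:])
    return phi_hat, sigma2_hat

# Simulate an illustrative stable AR(2) and recover its parameters
rng = np.random.default_rng(1)
phi_true = np.array([0.5, 0.3])
n = 20_000
x = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(2, n):
    x[t] = phi_true @ x[t - 2:t][::-1] + eps[t]

phi_hat, sigma2_hat = yule_walker(x, p=2)
print(phi_hat, sigma2_hat)  # should be close to [0.5, 0.3] and 1.0
```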

ARMA(p, q)

Theorem: Let $X_t$ be weakly stationary, then we can write $X_t = \mu_t + \sum_{j=0}^{\infty} \psi_j \varepsilon_{t-j}$ where $\psi_0 = 1$ and $\sum_{j=0}^{\infty} \psi_j^2 < \infty$, $\varepsilon_t$ is white-noise, $\varepsilon_t$ is uncorrelated with $\mu_s$ for all $t, s$, and $\mu_t$ is perfectly predictable from its past values (for many models $\mu_t = \mu$ is constant). This is called the Wold decomposition.

We already did this for the AR(p) process: $X_t = \phi(1)^{-1} c + \sum_{j=0}^{\infty} \psi_j \varepsilon_{t-j}$

Notice that the Wold representation is essentially an MA($\infty$) model ($q = \infty$)
The condition $\sum_{j=0}^{\infty} \psi_j^2 < \infty$ means $\psi_j \to 0$, meaning a weakly stationary process can be approximated by an MA(q) process (for $q$ large enough)!

The ARMA model is motivated as a way to reduce the order $q$ necessary for a good approximation

Definition: The autoregressive moving average model of order $(p, q)$ is defined as $X_t = c + \phi_1 X_{t-1} + \cdots + \phi_p X_{t-p} + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \cdots + \theta_q \varepsilon_{t-q}$ with $\varepsilon_t$ white noise

Using lag polynomials (with $\theta_0 = 1$), $\phi(L) X_t = c + \theta(L) \varepsilon_t$, with $\phi(L) = 1 - \phi_1 L - \cdots - \phi_p L^p$ and $\theta(L) = 1 + \theta_1 L + \cdots + \theta_q L^q$

This model is weakly stationary if the AR portion $\phi(L)$ is stable, in which case $X_t = \phi(L)^{-1} c + \phi(L)^{-1} \theta(L) \varepsilon_t$

The Wold Decomposition is $X_t = \mu + \sum_{j=0}^{\infty} \psi_j \varepsilon_{t-j}$ with $\mu = \phi(1)^{-1} c$ and $\psi(L) = \phi(L)^{-1} \theta(L)$

Getting the actual coefficients $\psi_j$ is a bit more involved (but we’d start by matching coefficients in $\phi(L) \psi(L) = \theta(L)$)
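
For instance, for the ARMA(1, 1) model with $|\phi_1| < 1$ (a worked special case): $\psi(L) = \frac{1 + \theta_1 L}{1 - \phi_1 L} = (1 + \theta_1 L) \sum_{j=0}^{\infty} \phi_1^j L^j$, so matching coefficients gives $\psi_0 = 1$ and $\psi_j = \phi_1^{j-1} (\phi_1 + \theta_1)$ for $j \ge 1$.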

In other words, a stable ARMA(p, q) model (also known as causal) can be approximated by an MA($q'$) model for $q'$ large enough
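
The same coefficient matching gives a simple recursion, $\psi_j = \theta_j + \sum_{i=1}^{\min(j, p)} \phi_i \psi_{j-i}$ with $\theta_j = 0$ for $j > q$, which is easy to implement; a minimal sketch (the ARMA(1, 1) parameters are illustrative):

```python
import numpy as np

def arma_to_ma(phi, theta, n_terms=10):
    """Compute psi_0, ..., psi_{n_terms - 1} of the MA(infinity) representation
    of a stable ARMA(p, q) model, using phi(L) * psi(L) = theta(L)."""
    phi = np.asarray(phi, dtype=float)      # phi_1, ..., phi_p
    theta = np.asarray(theta, dtype=float)  # theta_1, ..., theta_q
    psi = np.zeros(n_terms)
    psi[0] = 1.0  # psi_0 = theta_0 = 1
    for j in range(1, n_terms):
        theta_j = theta[j - 1] if j - 1 < len(theta) else 0.0
        psi[j] = theta_j + sum(phi[i - 1] * psi[j - i] for i in range(1, min(j, len(phi)) + 1))
    return psi

# ARMA(1, 1) with phi_1 = 0.5, theta_1 = 0.4
print(arma_to_ma([0.5], [0.4], n_terms=6))
# psi_0 = 1 and, for j >= 1, psi_j follows the closed form 0.9 * 0.5**(j - 1)
```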

Can we also approximate it with an AR($p'$) model? Let’s examine the MA(1) case, $X_t = \varepsilon_t + \theta_1 \varepsilon_{t-1} = (1 + \theta_1 L) \varepsilon_t$

If $|\theta_1| < 1$, then we call this process invertible (by the same logic we’ve used previously), and $(1 + \theta_1 L)^{-1} = \sum_{j=0}^{\infty} (-\theta_1)^j L^j$ gives $\varepsilon_t = \sum_{j=0}^{\infty} (-\theta_1)^j X_{t-j}$, i.e. $X_t = -\sum_{j=1}^{\infty} (-\theta_1)^j X_{t-j} + \varepsilon_t$, an AR($\infty$) representation


Yes!

Extending to the general case, the MA polynomial $\theta(L)$ is invertible if all roots $z$ of $\theta(z) = 0$ satisfy $|z| > 1$

So invertibility implies that the ARMA(p, q) process can be approximated by an AR($p'$) model

Invertibility also implies we can determine the shocks $\varepsilon_t$ that made up the time series at time $t$ from its present and past values $X_t, X_{t-1}, \ldots$
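
A minimal sketch of this shock recovery for an invertible MA(1), using numpy (the recursion $\hat{\varepsilon}_t = X_t - \theta_1 \hat{\varepsilon}_{t-1}$ is initialized with $\hat{\varepsilon}_{-1} = 0$, an assumption whose effect dies out geometrically):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate an invertible MA(1): X_t = eps_t + theta_1 * eps_{t-1}, |theta_1| < 1
theta1 = 0.4
n = 500
eps = rng.standard_normal(n + 1)  # eps[0] plays the role of the unobserved pre-sample shock
X = eps[1:] + theta1 * eps[:-1]   # X[t] is driven by eps[t + 1] and eps[t]

# Invert the model: eps_hat_t = X_t - theta_1 * eps_hat_{t-1}, starting from eps_hat_{-1} = 0
eps_hat = np.zeros(n)
prev = 0.0
for t in range(n):
    prev = X[t] - theta1 * prev
    eps_hat[t] = prev

# The initialization error decays like (-theta_1)^t, so later shocks are recovered accurately
print(abs(eps_hat[0] - eps[1]))                    # noticeable, depends on the unknown eps[0]
print(np.max(np.abs(eps_hat[100:] - eps[101:])))   # essentially zero
```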