# 1.2 A first crash course

Let us introduce some of the most important concepts of time series analysis by considering an example where we look for simple models for predicting the monthly prices of wheat.
In the following, let Pt denote the price of wheat at time (month) t. A first naive guess would be to say that the price next month is the same as in this month. Hence, the predictor is P̂t+1|t = Pt. (1.1)
This predictor is called the naive predictor or the persistent predictor. The notation P̂t+1|t is shorthand for a prediction (or estimate) of Pt+1 given the observations Pt, Pt-1, . . ..
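As a small illustration of the naive predictor and its one-step errors (the price numbers below are invented for the example and are not from the text):

```python
# Naive (persistent) predictor: the forecast for month t+1 is the price in month t.
# Illustrative made-up monthly prices.
prices = [100.0, 102.5, 101.0, 104.0, 103.5]

predictions = prices[:-1]   # P̂_{t+1|t} = P_t
actuals = prices[1:]        # P_{t+1}

# Prediction errors (innovations): eps_{t+1} = P_{t+1} - P̂_{t+1|t}
errors = [a - p for a, p in zip(actuals, predictions)]
print(errors)  # [2.5, -1.5, 3.0, -0.5]
```

Note that the errors are exactly the month-to-month price changes, which is what Equation (1.2) states.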
Next month, i.e., at time t + 1, the actual price is Pt+1. This means that the prediction error or innovation may be computed as εt+1 = Pt+1 - P̂t+1|t. (1.2)
By combining Equations (1.1) and (1.2) we obtain the stochastic model for the wheat price Pt = Pt-1 + εt. (1.3)
If {εt} is a sequence of uncorrelated zero mean random variables (white noise), the process (1.3) is called a random walk. The random walk model is very often seen in finance and econometrics. For this model the optimal predictor is the naive predictor (1.1).
The random walk can be rewritten as Pt = εt + εt-1 + · · ·, (1.4) which shows that the random walk is an integration of the noise, and that the variance of Pt is unbounded; therefore, no stationary distribution exists. This is an example of a non-stationary process.
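The unbounded variance is easy to see in a simulation (a sketch with arbitrary parameters, not from the text): writing the random walk as a cumulative sum of the noise as in (1.4), the variance across many simulated paths grows linearly with t, since Var(Pt) = t·σ² for white noise with variance σ².

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0
n_paths, n_steps = 5000, 400

# White noise; a random walk is its cumulative sum, cf. Equation (1.4).
eps = rng.normal(0.0, sigma, size=(n_paths, n_steps))
P = np.cumsum(eps, axis=1)

# Var(P_t) = t * sigma^2: the variance across paths grows without bound.
var_t = P.var(axis=0)
print(var_t[9], var_t[99], var_t[399])  # roughly 10, 100, 400
```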
It is natural, however, to consider the more general model Pt = φPt-1 + εt, (1.5) called the AR(1) model (the first-order autoregressive model). For this process a stationary distribution exists for |φ| < 1. Notice that the random walk is obtained for φ = 1.
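For |φ| < 1 the stationary distribution of the AR(1) process has variance σ²/(1 − φ²), which a short simulation can confirm (the parameter values below are chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
phi, sigma, n = 0.8, 1.0, 200_000

# Simulate P_t = phi * P_{t-1} + eps_t, cf. Equation (1.5).
eps = rng.normal(0.0, sigma, size=n)
P = np.empty(n)
P[0] = eps[0]
for t in range(1, n):
    P[t] = phi * P[t - 1] + eps[t]

# Stationary variance: sigma^2 / (1 - phi^2) = 1 / 0.36, roughly 2.78.
# Discard a burn-in so the process has settled into stationarity.
print(P[1000:].var())
```

For φ = 1 the variance formula breaks down (division by zero), which matches the observation that the random walk has no stationary distribution.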
Another candidate model for wheat prices is Pt = ψPt-12 + εt, (1.6) which assumes that the price this month is explained by the price in the same month last year. This is a reasonable guess for a simple model, since it is well known that wheat prices exhibit a seasonal variation. (Despite the shared notation, the noise processes in (1.5) and (1.6) are of course not the same.)
For wheat prices it is natural to use both the current price and the price in the same month of the previous year in a description of the expected price next month. Such a model is obtained if we assume that the innovation εt in model (1.5) itself shows an annual variation; the combined model is (Pt - φPt-1) - ψ(Pt-12 - φPt-13) = εt. (1.7) Models such as (1.6) and (1.7) are called seasonal models, and they are used very often in econometrics.
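To simulate from the combined model (1.7), one can solve for Pt, which gives Pt = φPt-1 + ψPt-12 − φψPt-13 + εt. A sketch (the parameter values are made up; 240 months = 20 years of synthetic data):

```python
import numpy as np

rng = np.random.default_rng(2)
phi, psi, n = 0.5, 0.7, 240

# Combined seasonal model (1.7), rearranged for simulation:
#   P_t = phi*P_{t-1} + psi*P_{t-12} - phi*psi*P_{t-13} + eps_t
eps = rng.normal(size=n)
P = np.zeros(n)
for t in range(13, n):
    P[t] = phi * P[t - 1] + psi * P[t - 12] - phi * psi * P[t - 13] + eps[t]
```

Setting psi = 0 here reproduces the AR(1) recursion of (1.5), and phi = 0 reproduces the simple seasonal model (1.6).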
Notice that for ψ = 0 we obtain the AR(1) model (1.5), while for φ = 0 the simplest seasonal model (1.6) is obtained.
By introducing the backward shift operator B, defined by BkPt = Pt-k, (1.8) the models can be written in a more compact form. The AR(1) model can be written as (1 - φB)Pt = εt, and the seasonal model (1.7) as (1 - φB)(1 - ψB12)Pt = εt. (1.9) If we furthermore introduce the difference operator ∇ = 1 - B, (1.10) then the random walk can be written very compactly as ∇Pt = εt. In this book these kinds of notations will be widely used in order to obtain compact equations.
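The compact operator notation has a direct computational counterpart: applying the difference operator ∇ = 1 − B to a random walk recovers the driving noise, since ∇Pt = Pt − Pt-1 = εt. A minimal numerical check (illustrative only):

```python
import numpy as np

rng = np.random.default_rng(3)
eps = rng.normal(size=100)

# Random walk as an integration of the noise, cf. Equation (1.4).
P = np.cumsum(eps)

# Difference operator: (nabla P)_t = P_t - P_{t-1} = eps_t.
nabla_P = np.diff(P)
assert np.allclose(nabla_P, eps[1:])
```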
Given a time series of observed monthly wheat prices, P1 , P2 , . . . , PN , the model structure can be identified, and, for a given model, the time series can be used for parameter estimation.
The model identification is most often based on the estimated autocorrelation function since, as will be shown in Chapter 6, the autocorrelation function fulfils the same difference equation as the model. The autocorrelation function shows how the price is correlated with previous prices; more specifically, for stationary processes the autocorrelation at lag k, denoted ρ(k), is simply the correlation between Pt and Pt-k. For monthly values of the wheat price we might expect a dominant annual variation and, hence, that the autocorrelation at lag 12, i.e., ρ(12), is high.
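The sample autocorrelation can be estimated directly from its definition. For a series with a dominant annual (period-12) variation, the estimate at lag 12 is close to 1, while at lag 6, half a cycle away, it is strongly negative. A sketch on a synthetic monthly series (not wheat data; the amplitude and noise level are arbitrary):

```python
import numpy as np

def acf(x, k):
    """Sample autocorrelation at lag k: correlation between x_t and x_{t-k}."""
    x = np.asarray(x, dtype=float)
    xc = x - x.mean()
    return (xc[k:] * xc[:-k]).sum() / (xc ** 2).sum()

# Synthetic monthly series: a dominant annual cycle plus white noise.
rng = np.random.default_rng(4)
t = np.arange(240)
P = 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0.0, 1.0, size=t.size)

print(acf(P, 12), acf(P, 6))  # roughly +0.9 and -0.9
```

A high estimate of ρ(12) is exactly the kind of evidence that would point towards a seasonal model such as (1.6) or (1.7).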
The models above will, of course, be generalized in the book. It is important to notice that these processes all belong to the more general class of linear processes, which in turn is strongly related to the theory of linear systems, as demonstrated in the book.