The Investment Advisor Body of Knowledge + Test Bank – IMCA

CHAPTER 3
Statistics and Methods
Part VII Time Series Models


Time series describe how random variables evolve over time and form the basis of many financial models.

Random Walks

A time series is an equation or set of equations describing how a random variable or variables evolve over time. Probably the most basic time series is the random walk. For a random variable X, with a realization x_t at time t, the following conditions describe a random walk:

(3.99)   x_t = x_{t−1} + ϵ_t
         E[ϵ_t] = 0
         E[ϵ_t²] = σ²
         E[ϵ_s ϵ_t] = 0 for s ≠ t

In other words, X is equal to its value from the previous period, plus a random disturbance, ϵ_t, which is mean zero and has a constant variance, σ². The last condition, combined with the fact that ϵ_t is mean zero, tells us that the ϵ's from different periods will be uncorrelated with each other. In time series analysis, we typically refer to x_{t−1} as the first lagged value of x_t, or just the first lag of x_t. By this convention, x_{t−2} would be the second lag, x_{t−3} the third, and so on.

We can also think in terms of changes in X. Subtracting x_{t−1} from both sides of our initial equation:

(3.100)   Δx_t = x_t − x_{t−1} = ϵ_t

In this basic random walk, Δx_t has all of the properties of our stochastic term, ϵ_t. Both are mean zero. Both have a constant variance, σ². Most importantly, the terms from different periods are uncorrelated with each other. This system is not affected by its past. This is the defining feature of a random walk.

How does the system evolve over time? Note that Equation 3.99 is true for all time periods. All of the following equations are valid:

(3.101)   x_t = x_{t−1} + ϵ_t
          x_{t−1} = x_{t−2} + ϵ_{t−1}
          x_{t−2} = x_{t−3} + ϵ_{t−2}
          ⋮

By repeatedly substituting the equation into itself, we can see how the series evolves over multiple periods:

(3.102)   x_t = x_{t−2} + ϵ_{t−1} + ϵ_t = x_{t−3} + ϵ_{t−2} + ϵ_{t−1} + ϵ_t = … = x_0 + Σ_{i=1}^{t} ϵ_i

At time t, X is simply the sum of its initial value, x_0, plus a series of random steps. Using this formula, it is easy to calculate the conditional mean and variance of x_t:

(3.103)   E[x_t | x_0] = x_0
          Var[x_t | x_0] = tσ²

Because the variance increases in proportion to t, the standard deviation increases with the square root of t. This is our familiar square-root rule for independent and identically distributed (i.i.d.) variables. For a random walk, our best guess for the future value of the variable is simply its current value, but the probability of finding the future value near the current value becomes increasingly small as the horizon grows.
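These conditional moments are easy to check by simulation. The following sketch uses hypothetical values x_0 = 10, σ = 1, and t = 100 (none of which come from the text) to generate many independent random walks and compare the sample mean and variance of x_t with x_0 and tσ²:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical parameters (not from the text).
n_paths, t = 50_000, 100
sigma, x0 = 1.0, 10.0

# Each row of steps is one random walk: x_t = x_0 + sum of t i.i.d. steps.
steps = rng.normal(0.0, sigma, size=(n_paths, t))
x_t = x0 + steps.sum(axis=1)

print(x_t.mean())  # close to E[x_t | x_0] = x_0 = 10
print(x_t.var())   # close to Var[x_t | x_0] = t * sigma^2 = 100
```

Replacing `steps.sum(axis=1)` with `np.cumsum(steps, axis=1)` would produce the full paths, showing the variance growing linearly at every intermediate horizon.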

Though the proof is omitted here, it is not difficult to show that, for a random walk, skewness is proportional to t^(−1/2) and excess kurtosis is proportional to t^(−1). In other words, while the variance and standard deviation increase over longer time spans, skewness and excess kurtosis become smaller, and the distribution of x_t moves closer to normal.
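The decay of the higher moments can also be illustrated numerically, but only if the steps themselves are non-normal: with normal steps, skewness and excess kurtosis are exactly zero at every horizon. The sketch below (horizons and sample size are arbitrary choices, not from the text) uses shifted-exponential steps, which have mean zero, skewness 2, and excess kurtosis 6:

```python
import numpy as np

rng = np.random.default_rng(7)

def sample_moments(t, n_paths=50_000):
    """Sample skewness and excess kurtosis of x_t - x_0 for a random
    walk with shifted-exponential steps (mean 0, skew 2, excess kurt 6)."""
    steps = rng.exponential(1.0, size=(n_paths, t)) - 1.0
    x = steps.sum(axis=1)
    z = (x - x.mean()) / x.std()
    return (z**3).mean(), (z**4).mean() - 3.0

results = {t: sample_moments(t) for t in (25, 100)}
for t, (skew, exkurt) in results.items():
    print(t, skew, exkurt)  # skew ~ 2/sqrt(t), excess kurtosis ~ 6/t
```

Quadrupling the horizon from 25 to 100 roughly halves the sample skewness and quarters the excess kurtosis, matching the t^(−1/2) and t^(−1) rates.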

The simple random walk is not a great model for equities, where we expect prices to increase over time, or for interest rates, which cannot be negative. With some rather trivial modifications, though, we can accommodate both of these requirements: adding a constant drift term allows the series to trend upward, and modeling the logarithm of a price or rate keeps the underlying level positive.

Variance and Autocorrelation

Autocorrelation has a very important impact on variance over longer time horizons. For our random walk, the variance grows in proportion to the length of the period over which we measure it.

Assume returns follow a random walk:

(3.104)   r_t = ϵ_t

where ϵ_t is an i.i.d. disturbance term with mean zero and variance σ². Now define y_{n,t} as an n-period return; that is:

(3.105)   y_{n,t} = r_t + r_{t−1} + … + r_{t−n+1} = Σ_{i=0}^{n−1} r_{t−i}

As stated before, the variance of y_{n,t} is proportional to n:

(3.106)   Var[y_{n,t}] = nσ²

and the standard deviation of yn,t is proportional to the square root of n. In other words, if the daily standard deviation of an equity index is 1 percent and the returns of the index follow a random walk, then the standard deviation of 25-day returns will be 5 percent, and the standard deviation of 100-day returns will be 10 percent.
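The arithmetic in this example is just the square-root rule applied directly; a minimal sketch, using the 1 percent daily volatility from the text:

```python
import math

daily_vol = 0.01  # 1% daily standard deviation, as in the text

# Under a random walk, n-period variance is n times the one-period
# variance, so volatility scales with the square root of n.
for n in (25, 100):
    print(n, daily_vol * math.sqrt(n))  # 5% for n=25, 10% for n=100
```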

When we introduce autocorrelation, this square-root rule no longer holds. Suppose that, instead of a random walk, we start with a first-order autoregressive, AR(1), series:

(3.107)   r_t = λr_{t−1} + ϵ_t,   −1 < λ < 1

Now define a two-period return:

(3.108)   y_{2,t} = r_t + r_{t−1} = λr_{t−1} + ϵ_t + r_{t−1} = (1 + λ)r_{t−1} + ϵ_t

With just two periods, the introduction of autocorrelation has already made the description of our multiperiod return noticeably more complicated. The variance of this series is now:

(3.109)   Var[y_{2,t}] = Var[r_t] + Var[r_{t−1}] + 2Cov[r_t, r_{t−1}] = 2(1 + λ)σ_r²

where σ_r² is the variance of the single-period return and λ is its first-order autocorrelation.

If λ is zero, then our time series is equivalent to a random walk and our new variance formula gives the correct answer: that the variance is still proportional to the length of our multiperiod return. If λ is greater than zero, and serial correlation is positive, then the two-period variance will be more than twice as great as the single-period variance. If λ is less than zero, and the serial correlation is negative, then the two-period variance will be less than twice the single-period variance. This makes sense. For series with negative serial correlation, a large positive return will tend to be followed by a negative return, pulling the series back toward its mean, thereby reducing the multiperiod volatility. The opposite is true for series with positive serial correlation.
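A quick simulation makes the effect of the sign of λ concrete. This sketch (λ values and sample size are hypothetical) generates AR(1) returns and compares the variance of two-period returns with the single-period variance:

```python
import numpy as np

rng = np.random.default_rng(0)

def two_period_variance_ratio(lam, n=200_000):
    """Return Var(two-period return) / Var(one-period return) for a
    simulated AR(1) series r_t = lam * r_{t-1} + eps_t."""
    eps = rng.normal(size=n)
    r = np.empty(n)
    r[0] = eps[0]
    for t in range(1, n):
        r[t] = lam * r[t - 1] + eps[t]
    y2 = r[1:] + r[:-1]  # overlapping two-period returns
    return y2.var() / r.var()

ratios = {lam: two_period_variance_ratio(lam) for lam in (0.0, 0.3, -0.3)}
for lam, ratio in ratios.items():
    print(lam, ratio)  # close to 2 * (1 + lam)
```

With λ = 0.3 the ratio is noticeably above 2, and with λ = −0.3 noticeably below it, consistent with positive serial correlation amplifying, and negative serial correlation dampening, multiperiod risk.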

Time series with slightly positive or negative serial correlation abound in finance. It is a common mistake to assume that variance grows linearly with time when serial correlation is present. Assuming no serial correlation when it does exist can lead to a serious overestimation or underestimation of risk.
