Quantitative Finance For Dummies, by Steve Bell

Part 1
Getting Started with Quantitative Finance
Chapter 2
Understanding Probability and Statistics


IN THIS CHAPTER

Comprehending that events can be random

Gathering data to produce statistics of random variables

Defining some important distributions

If you’ve ever placed a bet on a horse or wondered whether your date for the evening is going to turn up, then you know a bit about probability and statistics. The concepts get more interesting if you have multiple events or events in succession.

For example, if you manage to pick both the first- and second-place horses in a race (an exacta), does that mean you have real skill? This common bet offered by bookies is almost as creative as some of the speculative products offered by bankers.

In this chapter, I start with a few ideas about probability and continue by showing you how they apply to statistical distributions. I examine applications of probability, starting with dice games.

I then look at what happens when you have many random events and a distribution of outcomes.

One distribution of special importance is the Gaussian distribution. It keeps on appearing, and I introduce you to its key properties. I also introduce you to the law of large numbers, which is a more mathematical way of looking at the outcome of a large number of random events.

Probability boils down to a number that refers to a specific situation or event. Statistics, on the other hand, is a way of reasoning from large amounts of data back to some general conclusion – a tool for dealing with data. The later sections of this chapter take you through some widely used results that help in understanding data sets.

The situations I present in this chapter come to look like financial markets, where day-by-day or even millisecond-by-millisecond prices are changing in a highly volatile fashion. So, this chapter gives you a taste of some of the key quantitative tools for understanding how modern financial markets work.

Figuring Probability by Flipping a Coin

Humans have a deep fascination for outcomes that are not certain. That may be because humans learned early that outcomes in many situations are indeed uncertain. Dice games are the most common method used to examine probability, which is the chance of an event taking place or a statement being true. Dice games have engaged the interest of many famous mathematicians, and because the games are played for money, studying them can be considered the birth of quantitative finance.

Archaeological evidence shows that games of chance have been played for at least the past 34 centuries. Later (well, much later in fact, only several hundred years ago) mathematicians tried to understand the results of these games of chance and that is what led to what is now called probability theory, the mathematical study of randomness.

Probability is the mathematician’s way of analysing random events. Defining random isn’t so easy, and that difficulty is part of what makes the study of randomness important. The rising of the sun tomorrow isn’t a random event but what mathematicians (and almost everyone else) define as certain. Every certain event has a probability of one. An impossible event (such as having hot sunshine every day throughout the summer in England) has a probability of zero. However, whether it will rain tomorrow or not is a random event with a probability somewhere between zero and one. That doesn’t mean you have no knowledge of whether it will rain, just that even if you have looked at the most reliable forecasts, you still cannot be certain one way or the other. Equally, the flip of a coin is a random event, and so is the throw of a die. The outcomes cannot be predicted, at least if the coin or die isn’t loaded in some way.

Philosophers and mathematicians (for example, the French mathematician Laplace) have thought deeply about near-certain events such as the rising of the sun tomorrow. There’s no day on record when the sun didn’t rise, and the probability of the sun rising tomorrow is very, very close to 1, but that isn’t proof that it will continue to rise every day. I’m not trying to be apocalyptic; I’m just using facts to come to conclusions.

It’s good to be wary of statements about the certainty of something happening or not happening. That can be especially true in finance where it’s easy to take some things for granted. Governments and banks can go bankrupt and stock markets do crash; the probability is very small but not zero.

Mathematicians tend to evaluate the probability of a symmetrical coin turning up heads using their usual logic. It seems reasonable to assume that the likelihood of the coin turning up heads is the same as that of its turning up tails.

The probability of tossing a head, written as P(H), and the probability of tossing a tail, P(T), satisfy P(H) + P(T) = 1. This is because the only result from tossing a coin is either heads or tails. (If you can flip a coin and make it land on its edge, please ask the publisher of this book for a refund!) Therefore, simple arithmetic shows that the probability of tossing a head is 1/2. However, you can never be sure that the coin is, in fact, symmetric. The experiment of flipping the coin can give you an estimate of the probability. Flip the coin lots and lots of times and count how many times it turns up heads. Divide this by the number of throws, and you get an estimate of the probability of getting heads. Because the number of heads can’t exceed the number of throws, the estimate is a number between zero and one. This number converges to 0.5 as the number of flips keeps increasing. This is an empirical fact!
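If you’d rather not flip a real coin thousands of times, you can let a computer do it. Here’s a minimal Python sketch of the frequency estimate just described (the language, the fixed seed and the flip counts are my own choices for illustration, not the book’s):

```python
import random

random.seed(42)  # fix the seed so the experiment is repeatable

def estimate_heads_probability(flips):
    """Flip a simulated fair coin `flips` times and return the fraction of heads."""
    heads = sum(random.random() < 0.5 for _ in range(flips))
    return heads / flips

# The estimate settles towards 0.5 as the number of flips grows.
for n in (10, 1_000, 100_000):
    print(n, estimate_heads_probability(n))
```

Try running it a few times with different seeds: the estimate with 10 flips jumps around a lot, while the estimate with 100,000 flips barely moves from 0.5.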

To apply mathematics to the real world, you must make a few assumptions, so you need to be clear what these assumptions are. In this case, ignoring the possibility of a coin landing on its edge is a good one.

The sum of the probability of all possible events is one as you can be certain that one of the possibilities will happen.

A similar analysis can be applied to a fair die: There’s a 1/6 probability of throwing any one of the six faces. Here again you have to put on your mathematician’s hat. If the die is fair, the likelihoods of getting 1, 2, 3, 4, 5, or 6 are the same. So, P(i) = 1/6 where i can be any integer from one to six. Now, adding up the probabilities of each way the die can land:

P(1) + P(2) + P(3) + P(4) + P(5) + P(6) = Σ P(i) = 6 × 1/6 = 1.

Following common practice, I use the capital Greek letter sigma, Σ, to indicate a summation. This formula follows from the fact that the sum of the probabilities of all possible outcomes must be one.

You can also calculate the probability of a number of events. For example, if tossing a coin and turning up heads is P(H) = 0.5, then the probability of tossing two heads in succession is P(H)P(H) = 0.5 × 0.5 = 0.25. Because landing a head on the first toss is independent of whether you land a head on the second toss, you must multiply the probability of the individual events to get the probability of the joint event of getting two heads. Independence is an important concept in quantitative finance. You can frequently assume that the return of a financial asset on a given day is independent of the return on the previous day.

The most common definition for the return, r_n, of a financial asset on day n, in terms of the price, p_n, on day n, is:

r_n = (p_n – p_(n–1)) / p_(n–1).
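As a quick sketch of that return definition in code, here it is applied to a short price series (Python is my choice, and the prices are invented purely for illustration):

```python
def daily_returns(prices):
    """Compute r_n = (p_n - p_(n-1)) / p_(n-1) for each consecutive pair of prices."""
    return [(p_n - p_prev) / p_prev for p_prev, p_n in zip(prices, prices[1:])]

prices = [100.0, 102.0, 99.96, 101.0]  # hypothetical closing prices
print(daily_returns(prices))  # first return is (102 - 100) / 100 = 0.02
```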

Likewise, the probability of tossing two tails in succession is P(T)P(T) = 0.5 × 0.5 = 0.25. You need to take care in figuring the probability of tossing a head and a tail. Either the head or the tail can be tossed first, so you have two ways of getting a head and a tail. In the case of tossing a head first P(H)P(T) = 0.5 × 0.5 = 0.25 and a tail first P(T)P(H) = 0.5 × 0.5 = 0.25. Adding up these two probabilities P(T)P(H) + P(H)P(T) = 0.5 gives you the probability of a head and a tail, irrespective of which face came up first.

Applying these ideas to a die, you can calculate the probability of rolling either a three or a four, for example. To do this, you add the probabilities because the events of rolling a three or a four are disjoint, meaning that they’re completely different and can’t happen together. So the probability of rolling either a three or a four is 1/6 + 1/6 = 1/3.

When I calculated the probability of both heads and tails in two tosses, I came up with the number 0.5. I got this answer using the idea of disjoint events – the event of tossing a head first and then a tail is disjoint from first tossing a tail and then a head. So you must add the probabilities of these events to get the overall probability of getting a head and a tail in two tosses.

To make this clear in another way, use the example of a deck of cards. The probability of drawing a king or a spade isn’t simply P(King)+P(Spade) because, of course, you can draw the king of spades. So the events of drawing a king or a spade are not disjoint and you need to take into account the probability of drawing the king of spades.
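You can check the king-or-spade reasoning by brute force over a full deck. This little Python sketch (my own illustration, not from the book) counts the favourable cards directly, so the overlapping king of spades is automatically counted only once:

```python
from fractions import Fraction

ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
suits = ['spades', 'hearts', 'diamonds', 'clubs']
deck = [(rank, suit) for rank in ranks for suit in suits]

# Count cards that are a king OR a spade; the king of spades is counted once.
favourable = sum(1 for rank, suit in deck if rank == 'K' or suit == 'spades')
p = Fraction(favourable, len(deck))
print(favourable, p)  # 16 cards out of 52, so p prints as 4/13
```

Note that 16/52 is one card fewer than the naive P(King) + P(Spade) = (4 + 13)/52 = 17/52, exactly because of the king of spades.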

Probability can be summarised in a few short statements:

❯❯ A probability is a number between zero and one.

❯❯ For a certain event, the probability is exactly one.

❯❯ For disjoint events, the probability of at least one event happening is the sum of the individual probabilities for the events.

❯❯ For independent events, the probability of both of them happening is the product of the individual probabilities for the events.

You may find it amazing, but that’s all you really need to know about probability.

Playing a game

Now that you’re familiar with coin flipping, I’d like to challenge you to a game. I’ll flip a coin again and again until it turns up heads. If it turns up heads on the first flip, I’ll give you £2. If it turns up heads for the first time on the second flip, I’ll give you £4. If it turns up heads for the first time on the third flip, I’ll give you £2^3 = £8. And if it turns up heads for the first time on the nth flip, I’ll give you £2^n. How much are you prepared to pay me to play this game?

To work out how much you may win, you need to calculate some probabilities. The probability of the coin turning up heads is always 0.5, and the probability the coin turns up tails is also 0.5. If heads appears first on the nth (say, third) flip, then all previous flips must have been tails. The probability of that is 1/2^(n–1) (so 0.5^2 if n = 3). You must now multiply again by 0.5 to get the probability of heads on the nth flip preceded by tails on the previous (n–1) flips. This works out as 1/2^n (so 0.5 × 0.5^2 = 0.5^3 if n = 3).

So, heads turns up for the first time on the nth flip with probability 1/2^n. If heads turns up first on the nth flip, then you win £2^n. The total expected pay-off (the amount, on average, you receive for winning) is then:

£2/2 + £2^2/2^2 + £2^3/2^3 + …

However, this is just a series of 1s going on forever that adds up to infinity. So, then, would you pay me your life savings to play this game in the hope of a staggering return? If heads came up first, you may be disappointed at receiving a mere £2 for your savings; but if you had to wait a long time for heads to turn up but eventually it did and you were due a substantial pay off, I may not be able to pay out your winnings. I don’t think that the Central Bank would print large amounts of money to help me out. This is an extreme example in which an unlikely event plays a significant role. You may notice a spooky similarity to certain recent events in financial markets even though this game was invented several hundred years ago.
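You can get a feel for the game’s erratic pay-offs by simulating it. This Python sketch (the seed and the number of plays are arbitrary choices of mine) shows how rare long runs of tails occasionally produce enormous wins that drag the average upwards:

```python
import random

random.seed(1)

def play_once():
    """Flip until heads appears; pay out 2**n if heads first appears on flip n."""
    n = 1
    while random.random() < 0.5:  # this flip came up tails, keep going
        n += 1
    return 2 ** n

plays = [play_once() for _ in range(100_000)]
print(min(plays), max(plays), sum(plays) / len(plays))
# Most plays pay a few pounds, but a handful pay thousands or more,
# and the running average never settles: the expectation is infinite.
```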

Flipping more coins

Another fun experiment with a coin is to keep on flipping it again and again to see how many times heads comes up. Sometimes heads follows tails and at other times there can be long series of either heads or tails.

During long sequences of heads or tails, you can easily believe that you have a higher than average probability of the other side turning up to even things up a bit. This gambler’s fallacy, however, isn’t valid. The coin has no memory. On each flip, the probability of heads remains 0.5, as does the probability for tails.

An important idea that comes out of experimenting with gambling games is the Law of Large Numbers. It states that the average result from a large number of trials (such as coin tossing) should be close to the expected value (0.5 for tossing heads) and will become closer as more trials are performed. I’ll show you how this works.

If H_n is the total number of heads (for example, 4) in the first n tosses (for example, 8), then H_n/n should tend towards 0.5 (so, 4/8 = 0.5). Figure 2-1 graphs 1,000 coin tosses.
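You can reproduce the data behind a chart like Figure 2-1 in a few lines of Python (minus the plotting; the seed and toss count are my own choices):

```python
import random

random.seed(7)

# Track the running fraction of heads, H_n / n, over 1,000 tosses.
heads = 0
running_fraction = []
for n in range(1, 1001):
    heads += random.random() < 0.5  # one coin flip: True (1) for heads
    running_fraction.append(heads / n)

# Early on the fraction jumps around; by the end it hugs 0.5.
print(running_fraction[9], running_fraction[99], running_fraction[999])
```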


© John Wiley & Sons, Ltd.

FIGURE 2-1: Convergence of the proportion of tossed coins landing heads up.


The chart fluctuates less and less after more coin flips and the fraction of heads converges (gets closer and closer) towards 0.5. This is an example of the Law of Large Numbers. You’d be surprised though at how many tosses it takes for the chart to settle down to the expected average.

I examine this further by plotting H_n – n/2, where n/2 is the expected number of heads after n tosses. The line in Figure 2-2 wanders about and shows that convergence isn’t good. It’s disconcerting that although the fraction of heads tossed tends towards 0.5 in relative terms, in absolute terms the number of heads can wander further and further away from the expected value of n/2. You may have guessed already that this unstable sequence, called a random walk, can be used as a model for how share prices change with time.
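To generate a random walk like this yourself, it’s convenient to track 2 × (H_n – n/2), which simply moves up one on heads and down one on tails. A minimal Python sketch (seed and step count are arbitrary):

```python
import random

random.seed(3)

# Track 2*(H_n - n/2): +1 for each head, -1 for each tail.
position = 0
path = []
for _ in range(1000):
    position += 1 if random.random() < 0.5 else -1
    path.append(position)

print(path[-1], max(path), min(path))
# The path drifts away from zero even though the *fraction* of heads converges.
```

Run it with a few different seeds and you’ll see paths that wander far above or below zero, which is exactly the disconcerting behaviour the chart shows.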

