
3.3 Concepts of Probability


Suppose that a sample space S consists of a finite number, say m, of elements $e_1, e_2, \ldots, e_m$, so that the elements are mutually exclusive, that is, $e_i \cap e_j = \emptyset$ for all $i \neq j$, and also represent an exhaustive list of outcomes in S, so that $e_1 \cup e_2 \cup \cdots \cup e_m = S$. If the operation whose sample space is S is repeated a large number of times, some of these repetitions will result in $e_1$, some in $e_2$, and so on. (The separate repetitions are often called trials.) Let $f_1, f_2, \ldots, f_m$ be the fractions of the total number of trials resulting in $e_1, e_2, \ldots, e_m$, respectively. Then $f_1, f_2, \ldots, f_m$ are all nonnegative, and their sum is 1. We may think of $f_1, f_2, \ldots, f_m$ as observed weights or measures of occurrence of $e_1, e_2, \ldots, e_m$ obtained on the basis of an experiment consisting of a large number of repeated trials. If the entire experiment is repeated, another set of f's would occur with slightly different values, and so on for further repetitions. If we think of indefinitely many repetitions, we can conceive of idealized values being obtained for the f's. It is impossible, of course, to show that in a physical experiment the f's converge to limiting values, in a strict mathematical sense, as the number of trials increases indefinitely. So we postulate values $p_1, p_2, \ldots, p_m$ corresponding to the idealized values of $f_1, f_2, \ldots, f_m$, respectively, for an indefinitely large number of trials. It is assumed that $p_1, p_2, \ldots, p_m$ are all positive numbers and that

$p_1 + p_2 + \cdots + p_m = 1$ (3.3.1)

The quantities $p_1, p_2, \ldots, p_m$ are called probabilities of occurrence of $e_1, e_2, \ldots, e_m$, respectively.
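To make the relative-frequency interpretation concrete, the following short R sketch (the object names are only illustrative, and the fair six-sided die is an assumed example) simulates a large number of trials; the observed fractions $f_1, \ldots, f_6$ all lie close to the idealized values $p_i = 1/6$ and sum to 1.

# Simulate many rolls of a fair six-sided die and compute the observed
# fractions f_1, ..., f_6 of trials resulting in each face.
set.seed(123)                     # for reproducibility
n_trials <- 100000                # number of repeated trials
outcomes <- sample(1:6, size = n_trials, replace = TRUE)
f <- table(outcomes) / n_trials   # observed relative frequencies
print(f)                          # each entry is near 1/6
sum(f)                            # the fractions sum to 1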

Now suppose that E is any event in S that consists of a set of one or more e's, say $e_{i_1}, e_{i_2}, \ldots, e_{i_k}$. Thus $E = \{e_{i_1}, e_{i_2}, \ldots, e_{i_k}\}$. The probability of the occurrence of E is denoted by $P(E)$ and is defined as follows:

$P(E) = p_{i_1} + p_{i_2} + \cdots + p_{i_k}$

If E contains only one element, say $E = \{e_i\}$, it is written as

$P(E) = P(e_i) = p_i$
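As a small R illustration of this definition (the loaded four-sided die and the names p and E are assumed for the example), the probability of an event is obtained by summing the probabilities of the sample points it contains.

# Postulated probabilities p_1, ..., p_4 for the elements e_1, ..., e_4
p <- c(e1 = 0.1, e2 = 0.2, e3 = 0.3, e4 = 0.4)
E <- c("e2", "e4")   # the event E = {e_2, e_4}
sum(p[E])            # P(E) = p_2 + p_4 = 0.6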
It is evident that probabilities of events in a finite sample space S are values of an additive set function P defined on sets E in S, satisfying the following conditions:

1 If E is any event in S, then $0 \le P(E) \le 1$ (3.3.2a)

2 If E is the sample space S itself, then $P(E) = P(S) = 1$ (3.3.2b)

3 If E and F are two disjoint events in S, then $P(E \cup F) = P(E) + P(F)$ (3.3.2c)

These conditions are also sometimes known as axioms of probability. In the case of an infinite sample space S, condition 3 extends as follows:

if $E_1, E_2, \ldots$ is an infinite sequence of disjoint events, then

$P(E_1 \cup E_2 \cup \cdots) = P(E_1) + P(E_2) + \cdots$ (3.3.2d)
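For instance, suppose a countably infinite sample space has elements $e_1, e_2, \ldots$ assigned the probabilities $p_k = (1/2)^k$ (an assumed example). The singleton events $\{e_k\}$ are disjoint, and a quick R check shows their probabilities summing to 1, consistent with (3.3.2d).

# Partial sums of the assigned probabilities p_k = (1/2)^k
k <- 1:50          # first 50 of the infinitely many disjoint events
p_k <- (1/2)^k     # probability assigned to e_k
sum(p_k)           # approaches 1, as required by (3.3.2d)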

As E and $\bar{E}$ are disjoint events, then from condition 3, we obtain

$P(E \cup \bar{E}) = P(E) + P(\bar{E})$ (3.3.3)

But since $E \cup \bar{E} = S$ and $P(S) = 1$, we have the following:

Theorem 3.3.1 (Rule of complementation) If E is an event in a sample space S, then

$P(\bar{E}) = 1 - P(E)$ (3.3.4)

The rule of complementation provides a simple method of finding the probability of an event $\bar{E}$, if E is an event whose probability is easy to find. We sometimes say that the odds in favor of E are

$\dfrac{P(E)}{P(\bar{E})}$ (3.3.4a)

which from (3.3.4) takes the form $P(E)/[1 - P(E)]$. The reader may note that, conversely, $P(E) = \text{odds}/(1 + \text{odds})$.
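In R, the conversion between a probability and the corresponding odds can be sketched with two small helper functions (the function names prob_to_odds and odds_to_prob are illustrative).

# Odds in favor of E from P(E), and back again
prob_to_odds <- function(p) p / (1 - p)            # (3.3.4a) combined with (3.3.4)
odds_to_prob <- function(odds) odds / (1 + odds)   # inverse relation
prob_to_odds(0.75)   # odds of 3 to 1 in favor of E
odds_to_prob(3)      # recovers P(E) = 0.75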

Example 3.3.1 (Tossing coins) Suppose that 10 coins are tossed and we ask for the probability of getting at least 1 head. In this example, the sample space S has $2^{10} = 1024$ sample points. If the coins are unbiased, the sample points are equally likely (sample points are called equally likely if each sample point has the same probability of occurring), so that to each of the sample points the probability $1/1024$ is assigned. If we denote by E the event of getting no heads, then E contains only one sample point, and $\bar{E}$, of course, has 1023 sample points. Thus

$P(E) = \dfrac{1}{1024} \quad\text{and}\quad P(\bar{E}) = 1 - \dfrac{1}{1024} = \dfrac{1023}{1024}$

The odds on E and $\bar{E}$ are clearly 1 to 1023 and 1023 to 1, respectively.
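The calculation in Example 3.3.1 can be sketched in R, together with a simulation that approximates the exact answer (the object names are illustrative).

# Rule of complementation for "at least one head" in 10 tosses
p_no_heads <- (1/2)^10            # P(E): all 10 unbiased coins show tails
1 - p_no_heads                    # P(E-bar) = 1023/1024

# Simulation check: estimate P(at least one head) from repeated trials
set.seed(1)
tosses <- matrix(sample(c("H", "T"), 10 * 100000, replace = TRUE), ncol = 10)
mean(apply(tosses, 1, function(row) any(row == "H")))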

Referring to the statement in Theorem 3.3.1 that E and $\bar{E}$ are disjoint events whose union is S, we have the following rule.

Theorem 3.3.2 (General rule of complementation) If $E_1, E_2, \ldots, E_n$ are events in a sample space S, then we have

$P(\bar{E}_1 \cap \bar{E}_2 \cap \cdots \cap \bar{E}_n) = 1 - P(E_1 \cup E_2 \cup \cdots \cup E_n)$ (3.3.5)

Another useful result follows readily from (3.3.2c) by mathematical induction.

Theorem 3.3.3 (Rule of addition of probabilities for mutually exclusive events) If $E_1, E_2, \ldots, E_n$ are disjoint events in a sample space S, then

$P(E_1 \cup E_2 \cup \cdots \cup E_n) = P(E_1) + P(E_2) + \cdots + P(E_n)$ (3.3.6)

Example 3.3.2 (Determination of probabilities of some events) Suppose that a nickel and a dime are tossed, with H and T denoting head and tail for the nickel and h and t denoting head and tail for the dime. The sample space S consists of the four elements Hh, Ht, Th, and Tt. If these four elements are all assigned equal probabilities and if E is the event of getting exactly one head, then $E = \{Ht, Th\}$, and we have that

$P(E) = P(Ht) + P(Th) = \dfrac{1}{4} + \dfrac{1}{4} = \dfrac{1}{2}$
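Example 3.3.2 can be reproduced in R by listing the four equally likely sample points and summing the probabilities of those belonging to E (a sketch with illustrative names).

# Nickel-and-dime sample space with equal probabilities
S <- c("Hh", "Ht", "Th", "Tt")
p <- setNames(rep(1/4, 4), S)   # probability 1/4 for each sample point
E <- c("Ht", "Th")              # exactly one head
sum(p[E])                       # P(E) = 1/4 + 1/4 = 1/2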
Now suppose that $E_1$ and $E_2$ are arbitrary events in S. Then from Figure 3.2.2, with $E = E_1$ and $F = E_2$, it can be easily seen that $E_1 \cap \bar{E}_2$, $E_1 \cap E_2$, and $\bar{E}_1 \cap E_2$ are three disjoint events whose union is $E_1 \cup E_2$. That is,

$P(E_1 \cup E_2) = P(E_1 \cap \bar{E}_2) + P(E_1 \cap E_2) + P(\bar{E}_1 \cap E_2)$ (3.3.7)

Also, $E_1 \cap \bar{E}_2$ and $E_1 \cap E_2$ are disjoint sets whose union is $E_1$. Hence,

$P(E_1) = P(E_1 \cap \bar{E}_2) + P(E_1 \cap E_2)$ (3.3.8)

Similarly

$P(E_2) = P(\bar{E}_1 \cap E_2) + P(E_1 \cap E_2)$ (3.3.9)

Solving (3.3.8) for $P(E_1 \cap \bar{E}_2)$ and (3.3.9) for $P(\bar{E}_1 \cap E_2)$ and substituting in (3.3.7), we obtain the following.

Theorem 3.3.4 (Rule for addition of probabilities for two arbitrary events) If $E_1$ and $E_2$ are any two events in a sample space S, then

$P(E_1 \cup E_2) = P(E_1) + P(E_2) - P(E_1 \cap E_2)$ (3.3.10)
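As a numerical check of (3.3.10), the following R sketch uses an assumed example, one roll of a fair die with $E_1$ the event of an even outcome and $E_2$ the event of an outcome greater than 3, and compares the two sides of the rule.

# Both sides of the addition rule for two arbitrary events
S  <- 1:6                               # equally likely outcomes of one roll
E1 <- c(2, 4, 6)                        # even outcome
E2 <- c(4, 5, 6)                        # outcome greater than 3
P  <- function(A) length(A) / length(S) # probability of an event A
P(union(E1, E2))                        # left side of (3.3.10): 2/3
P(E1) + P(E2) - P(intersect(E1, E2))    # right side of (3.3.10): 2/3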

The rule for three events is given by

$P(E_1 \cup E_2 \cup E_3) = P(E_1) + P(E_2) + P(E_3) - P(E_1 \cap E_2) - P(E_1 \cap E_3) - P(E_2 \cap E_3) + P(E_1 \cap E_2 \cap E_3)$ (3.3.11)

More generally, for n events $E_1, E_2, \ldots, E_n$, we have

$P(E_1 \cup E_2 \cup \cdots \cup E_n) = \sum_{i=1}^{n} P(E_i) - \sum_{i<j} P(E_i \cap E_j) + \sum_{i<j<k} P(E_i \cap E_j \cap E_k) - \cdots + (-1)^{n+1} P(E_1 \cap E_2 \cap \cdots \cap E_n)$ (3.3.12)

Note that for $n = 2$, if $E_1$ and $E_2$ are disjoint, $P(E_1 \cap E_2) = 0$ and (3.3.10) reduces to (3.3.6); that is,

$P(E_1 \cup E_2) = P(E_1) + P(E_2)$

Similarly, if $E_1$, $E_2$, and $E_3$ are disjoint, (3.3.11) reduces to

$P(E_1 \cup E_2 \cup E_3) = P(E_1) + P(E_2) + P(E_3)$
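The general formula (3.3.12), often called the inclusion-exclusion formula, can be checked numerically for any finite sample space with equally likely points; the R sketch below (the function name inclusion_exclusion and the three-event die example are assumed for illustration) compares (3.3.12) with direct enumeration of the union.

# Inclusion-exclusion over all k-fold intersections of the given events
P <- function(A, S) length(A) / length(S)
inclusion_exclusion <- function(events, S) {
  n <- length(events)
  total <- 0
  for (k in 1:n) {
    for (idx in combn(n, k, simplify = FALSE)) {
      inter <- Reduce(intersect, events[idx])       # k-fold intersection
      total <- total + (-1)^(k + 1) * P(inter, S)   # alternating signs as in (3.3.12)
    }
  }
  total
}
S <- 1:6                                          # one roll of a fair die
events <- list(c(2, 4, 6), c(4, 5, 6), c(1, 2))   # three arbitrary events
inclusion_exclusion(events, S)                    # via (3.3.12): 5/6
P(Reduce(union, events), S)                       # direct computation: 5/6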