2.2.2 Properties of the Least-Squares Estimators and the Fitted Regression Model

The least-squares estimators $\hat{\beta}_0$ and $\hat{\beta}_1$ have several important properties. First, note from Eqs. (2.6) and (2.7) that $\hat{\beta}_0$ and $\hat{\beta}_1$ are linear combinations of the observations $y_i$. For example,

$$\hat{\beta}_1 = \frac{S_{xy}}{S_{xx}} = \sum_{i=1}^{n} c_i y_i$$

where $c_i = (x_i - \bar{x})/S_{xx}$ for $i = 1, 2, \ldots, n$.
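As a quick numerical illustration (a minimal sketch with made-up data, not from the book's examples), the weights $c_i$ can be computed directly, and the weighted sum $\sum c_i y_i$ agrees with the usual formula $S_{xy}/S_{xx}$:

```python
import numpy as np

# Illustrative data (made up for this sketch, not from the book)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

Sxx = np.sum((x - x.mean()) ** 2)
Sxy = np.sum((x - x.mean()) * y)

# Weights c_i = (x_i - xbar) / Sxx
c = (x - x.mean()) / Sxx

beta1_linear_comb = np.sum(c * y)   # beta1-hat as a linear combination of the y_i
beta1_formula = Sxy / Sxx           # beta1-hat from Eq. (2.7)
print(beta1_linear_comb, beta1_formula)  # identical up to rounding

# The weights also satisfy sum(c_i) = 0 and sum(c_i * x_i) = 1,
# the two facts used in the unbiasedness argument below.
print(c.sum(), (c * x).sum())
```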

The least-squares estimators $\hat{\beta}_0$ and $\hat{\beta}_1$ are unbiased estimators of the model parameters $\beta_0$ and $\beta_1$. To show this for $\hat{\beta}_1$, consider

$$E(\hat{\beta}_1) = E\left(\sum_{i=1}^{n} c_i y_i\right) = \sum_{i=1}^{n} c_i E(y_i) = \sum_{i=1}^{n} c_i (\beta_0 + \beta_1 x_i) = \beta_0 \sum_{i=1}^{n} c_i + \beta_1 \sum_{i=1}^{n} c_i x_i$$

since $E(\varepsilon_i) = 0$ by assumption. Now we can show directly that $\sum_{i=1}^{n} c_i = 0$ and $\sum_{i=1}^{n} c_i x_i = 1$, so

$$E(\hat{\beta}_1) = \beta_1$$

That is, if we assume that the model is correct [$E(y_i) = \beta_0 + \beta_1 x_i$], then $\hat{\beta}_1$ is an unbiased estimator of $\beta_1$. Similarly we may show that $\hat{\beta}_0$ is an unbiased estimator of $\beta_0$, or

$$E(\hat{\beta}_0) = \beta_0$$
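Unbiasedness can also be seen by simulation (a minimal sketch; the parameter values $\beta_0 = 1$, $\beta_1 = 2$, $\sigma = 0.5$ and the x grid are arbitrary choices for illustration, not from the book). Averaging the estimates over many simulated samples recovers the true parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
beta0, beta1, sigma = 1.0, 2.0, 0.5   # assumed "true" values for the demo
x = np.linspace(0, 10, 20)
Sxx = np.sum((x - x.mean()) ** 2)

b0_hats, b1_hats = [], []
for _ in range(20000):
    y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=x.size)
    b1 = np.sum((x - x.mean()) * y) / Sxx   # slope, Eq. (2.7)
    b0 = y.mean() - b1 * x.mean()           # intercept, Eq. (2.6)
    b1_hats.append(b1)
    b0_hats.append(b0)

# Sample means of the estimates approach beta0 and beta1
print(np.mean(b0_hats), np.mean(b1_hats))   # approx. 1.0 and 2.0
```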
The variance of $\hat{\beta}_1$ is found as

$$\operatorname{Var}(\hat{\beta}_1) = \operatorname{Var}\left(\sum_{i=1}^{n} c_i y_i\right) = \sum_{i=1}^{n} c_i^2 \operatorname{Var}(y_i) \tag{2.13}$$

because the observations $y_i$ are uncorrelated, and so the variance of the sum is just the sum of the variances. The variance of each term in the sum is $c_i^2 \operatorname{Var}(y_i)$, and we have assumed that $\operatorname{Var}(y_i) = \sigma^2$; consequently,

$$\operatorname{Var}(\hat{\beta}_1) = \sigma^2 \sum_{i=1}^{n} c_i^2 = \frac{\sigma^2 \sum_{i=1}^{n} (x_i - \bar{x})^2}{S_{xx}^2} = \frac{\sigma^2}{S_{xx}} \tag{2.14}$$

The variance of $\hat{\beta}_0$ is

$$\operatorname{Var}(\hat{\beta}_0) = \operatorname{Var}(\bar{y} - \hat{\beta}_1 \bar{x}) = \operatorname{Var}(\bar{y}) + \bar{x}^2 \operatorname{Var}(\hat{\beta}_1) - 2\bar{x} \operatorname{Cov}(\bar{y}, \hat{\beta}_1)$$

Now the variance of $\bar{y}$ is just $\operatorname{Var}(\bar{y}) = \sigma^2/n$, and the covariance between $\bar{y}$ and $\hat{\beta}_1$ can be shown to be zero (see Problem 2.25). Thus,

$$\operatorname{Var}(\hat{\beta}_0) = \operatorname{Var}(\bar{y}) + \bar{x}^2 \operatorname{Var}(\hat{\beta}_1) = \sigma^2\left(\frac{1}{n} + \frac{\bar{x}^2}{S_{xx}}\right) \tag{2.15}$$
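The same kind of simulation sketch as above (same arbitrary, assumed parameter values) can check Eqs. (2.14) and (2.15): the empirical variances of the estimates across replications should match $\sigma^2/S_{xx}$ and $\sigma^2(1/n + \bar{x}^2/S_{xx})$:

```python
import numpy as np

rng = np.random.default_rng(2)
beta0, beta1, sigma = 1.0, 2.0, 0.5   # assumed values, as before
x = np.linspace(0, 10, 20)
xbar = x.mean()
Sxx = np.sum((x - xbar) ** 2)

b0_hats, b1_hats = [], []
for _ in range(50000):
    y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=x.size)
    b1 = np.sum((x - xbar) * y) / Sxx
    b0 = y.mean() - b1 * xbar
    b1_hats.append(b1)
    b0_hats.append(b0)

# Empirical variance vs. the theoretical formulas
print(np.var(b1_hats), sigma**2 / Sxx)                            # Eq. (2.14)
print(np.var(b0_hats), sigma**2 * (1/x.size + xbar**2 / Sxx))     # Eq. (2.15)
```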

Another important result concerning the quality of the least-squares estimators $\hat{\beta}_0$ and $\hat{\beta}_1$ is the Gauss-Markov theorem, which states that for the regression model (2.1) with the assumptions $E(\varepsilon) = 0$, $\operatorname{Var}(\varepsilon) = \sigma^2$, and uncorrelated errors, the least-squares estimators are unbiased and have minimum variance when compared with all other unbiased estimators that are linear combinations of the $y_i$. We often say that the least-squares estimators are best linear unbiased estimators, where “best” implies minimum variance. Appendix C.4 proves the Gauss-Markov theorem for the more general multiple linear regression situation, of which simple linear regression is a special case.
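To make the theorem concrete, compare the least-squares slope with another linear unbiased estimator of $\beta_1$, for example the hypothetical "endpoint" slope $(y_n - y_1)/(x_n - x_1)$, which is unbiased whenever $x_1 \neq x_n$. In a simulation sketch (same arbitrary parameter values as above), both estimators average to $\beta_1$, but the least-squares slope has the smaller variance, as the theorem guarantees:

```python
import numpy as np

rng = np.random.default_rng(3)
beta0, beta1, sigma = 1.0, 2.0, 0.5   # assumed values for the demo
x = np.linspace(0, 10, 20)
Sxx = np.sum((x - x.mean()) ** 2)

ls_slopes, endpoint_slopes = [], []
for _ in range(50000):
    y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=x.size)
    ls_slopes.append(np.sum((x - x.mean()) * y) / Sxx)
    # Competing linear unbiased estimator: slope through the two endpoints
    endpoint_slopes.append((y[-1] - y[0]) / (x[-1] - x[0]))

print(np.mean(ls_slopes), np.mean(endpoint_slopes))  # both approx. beta1 = 2 (unbiased)
print(np.var(ls_slopes), np.var(endpoint_slopes))    # least squares has the smaller variance
```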

There are several other useful properties of the least-squares fit (all five are verified numerically in the sketch following the list):

1 The sum of the residuals in any regression model that contains an intercept $\beta_0$ is always zero, that is, $\sum_{i=1}^{n} e_i = 0$. This property follows directly from the first normal equation in Eqs. (2.5) and is demonstrated in Table 2.2 for the residuals from Example 2.1. Rounding errors may affect the sum.

2 The sum of the observed values $y_i$ equals the sum of the fitted values $\hat{y}_i$, or $\sum_{i=1}^{n} y_i = \sum_{i=1}^{n} \hat{y}_i$. Table 2.2 demonstrates this result for Example 2.1.

3 The least-squares regression line always passes through the centroid [the point $(\bar{x}, \bar{y})$] of the data.

4 The sum of the residuals weighted by the corresponding value of the regressor variable always equals zero, that is, $\sum_{i=1}^{n} x_i e_i = 0$.

5 The sum of the residuals weighted by the corresponding fitted value always equals zero, that is, $\sum_{i=1}^{n} \hat{y}_i e_i = 0$.
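All five properties can be verified numerically for any fitted line; the following minimal sketch uses made-up data rather than the Example 2.1 data:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

b1 = np.sum((x - x.mean()) * y) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x
e = y - y_hat                        # residuals

print(e.sum())                       # property 1: sum of residuals is ~0
print(y.sum(), y_hat.sum())          # property 2: observed and fitted sums agree
print(y.mean(), b0 + b1 * x.mean())  # property 3: line passes through (xbar, ybar)
print((x * e).sum())                 # property 4: x-weighted residual sum is ~0
print((y_hat * e).sum())             # property 5: yhat-weighted residual sum is ~0
```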
