2.2.1 Estimation of β0 and β1

The method of least squares is used to estimate β0 and β1. That is, we estimate β0 and β1 so that the sum of the squares of the differences between the observations yi and the straight line is a minimum. From Eq. (2.1) we may write

$$y_i = \beta_0 + \beta_1 x_i + \varepsilon_i, \qquad i = 1, 2, \ldots, n \tag{2.3}$$

Equation (2.1) may be viewed as a population regression model while Eq. (2.3) is a sample regression model, written in terms of the n pairs of data (yi, xi), i = 1, 2, …, n. Thus, the least-squares criterion is

$$S(\beta_0, \beta_1) = \sum_{i=1}^{n} \varepsilon_i^2 = \sum_{i=1}^{n} \left(y_i - \beta_0 - \beta_1 x_i\right)^2 \tag{2.4}$$

The least-squares estimators of β0 and β1, say $\hat{\beta}_0$ and $\hat{\beta}_1$, must satisfy

$$\left.\frac{\partial S}{\partial \beta_0}\right|_{\hat{\beta}_0, \hat{\beta}_1} = -2\sum_{i=1}^{n} \left(y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i\right) = 0$$

and

$$\left.\frac{\partial S}{\partial \beta_1}\right|_{\hat{\beta}_0, \hat{\beta}_1} = -2\sum_{i=1}^{n} \left(y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i\right) x_i = 0$$
Simplifying these two equations yields

$$\begin{aligned} n\hat{\beta}_0 + \hat{\beta}_1 \sum_{i=1}^{n} x_i &= \sum_{i=1}^{n} y_i \\ \hat{\beta}_0 \sum_{i=1}^{n} x_i + \hat{\beta}_1 \sum_{i=1}^{n} x_i^2 &= \sum_{i=1}^{n} y_i x_i \end{aligned} \tag{2.5}$$

Equations (2.5) are called the least-squares normal equations. The solution to the normal equations is
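As an illustration not drawn from the text, Eqs. (2.5) form a 2 × 2 linear system in $\hat{\beta}_0$ and $\hat{\beta}_1$ and can therefore be solved numerically. The sketch below does this with NumPy on a small made-up data set (the x and y values are hypothetical, chosen only for demonstration).

```python
import numpy as np

# Hypothetical example data (not from the book)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
n = len(x)

# Normal equations (2.5) as a linear system A @ [b0, b1] = b:
#   n*b0         + (sum x_i)*b1    = sum y_i
#   (sum x_i)*b0 + (sum x_i^2)*b1  = sum y_i*x_i
A = np.array([[n,       x.sum()],
              [x.sum(), (x**2).sum()]])
b = np.array([y.sum(), (y * x).sum()])

b0_hat, b1_hat = np.linalg.solve(A, b)
print(b0_hat, b1_hat)
```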

$$\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x} \tag{2.6}$$

and

$$\hat{\beta}_1 = \frac{\sum_{i=1}^{n} y_i x_i - \dfrac{\left(\sum_{i=1}^{n} y_i\right)\left(\sum_{i=1}^{n} x_i\right)}{n}}{\sum_{i=1}^{n} x_i^2 - \dfrac{\left(\sum_{i=1}^{n} x_i\right)^2}{n}} \tag{2.7}$$

where

$$\bar{y} = \frac{1}{n}\sum_{i=1}^{n} y_i \qquad \text{and} \qquad \bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i$$

are the averages of yi and xi, respectively. Therefore, $\hat{\beta}_0$ and $\hat{\beta}_1$ in Eqs. (2.6) and (2.7) are the least-squares estimators of the intercept and slope, respectively. The fitted simple linear regression model is then

$$\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x \tag{2.8}$$

Equation (2.8) gives a point estimate of the mean of y for a particular x.
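A minimal sketch of Eqs. (2.6)–(2.8), reusing the hypothetical data above: the slope comes from the ratio in Eq. (2.7), the intercept from Eq. (2.6), and the fitted line then supplies a point estimate of the mean of y at any chosen x (here x = 3.5, an arbitrary value).

```python
import numpy as np

# Hypothetical example data (same as above, not from the book)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
n = len(x)

# Eq. (2.7): slope estimate, written out with raw sums
b1_hat = ((y * x).sum() - y.sum() * x.sum() / n) / \
         ((x**2).sum() - x.sum()**2 / n)

# Eq. (2.6): intercept estimate from the sample means
b0_hat = y.mean() - b1_hat * x.mean()

# Eq. (2.8): point estimate of the mean of y at a particular x, say x = 3.5
y_hat = b0_hat + b1_hat * 3.5
print(b0_hat, b1_hat, y_hat)
```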

Since the denominator of Eq. (2.7) is the corrected sum of squares of the xi and the numerator is the corrected sum of cross products of xi and yi, we may write these quantities in a more compact notation as

$$S_{xx} = \sum_{i=1}^{n} x_i^2 - \frac{\left(\sum_{i=1}^{n} x_i\right)^2}{n} = \sum_{i=1}^{n} \left(x_i - \bar{x}\right)^2 \tag{2.9}$$

and

$$S_{xy} = \sum_{i=1}^{n} y_i x_i - \frac{\left(\sum_{i=1}^{n} y_i\right)\left(\sum_{i=1}^{n} x_i\right)}{n} = \sum_{i=1}^{n} y_i \left(x_i - \bar{x}\right) \tag{2.10}$$

Thus, a convenient way to write Eq. (2.7) is

$$\hat{\beta}_1 = \frac{S_{xy}}{S_{xx}} \tag{2.11}$$
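
In code, the compact notation of Eqs. (2.9)–(2.11) is usually the more convenient form. The sketch below (same hypothetical data as before) computes Sxx and Sxy from deviations about the mean of the xi and recovers the same slope as the raw-sum formula in Eq. (2.7).

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # hypothetical data, as above
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Eq. (2.9): corrected sum of squares of the x_i
Sxx = ((x - x.mean())**2).sum()

# Eq. (2.10): corrected sum of cross products of x_i and y_i
Sxy = (y * (x - x.mean())).sum()

# Eq. (2.11): slope estimate
b1_hat = Sxy / Sxx
print(b1_hat)
```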

The difference between the observed value yi and the corresponding fitted value $\hat{y}_i$ is a residual. Mathematically the ith residual is

$$e_i = y_i - \hat{y}_i = y_i - \left(\hat{\beta}_0 + \hat{\beta}_1 x_i\right), \qquad i = 1, 2, \ldots, n \tag{2.12}$$

Residuals play an important role in investigating model adequacy and in detecting departures from the underlying assumptions. This topic is discussed in subsequent chapters.
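Continuing the hypothetical example, the residuals of Eq. (2.12) are simply the observations minus the fitted values. Because the model contains an intercept, the normal equations (2.5) force the residuals to sum to zero and to be orthogonal to the xi, which makes a convenient sanity check on any implementation:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # hypothetical data, as above
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Least-squares fit via Eqs. (2.11) and (2.6)
b1_hat = (y * (x - x.mean())).sum() / ((x - x.mean())**2).sum()
b0_hat = y.mean() - b1_hat * x.mean()

# Eq. (2.12): residuals e_i = y_i - yhat_i
e = y - (b0_hat + b1_hat * x)

# Consequences of the normal equations (2.5): residuals sum to zero
# and are orthogonal to the regressor (up to floating-point error)
assert abs(e.sum()) < 1e-9
assert abs((e * x).sum()) < 1e-9
```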
