Applied Univariate, Bivariate, and Multivariate Statistics, by Daniel J. Denis

2.15 AKAIKE'S INFORMATION CRITERION

A measure of model fit commonly used in comparing models that uses the log‐likelihood is Akaike's information criterion, or AIC (Sakamoto, Ishiguro, and Kitagawa, 1986). This is one statistic of the kind generally referred to as penalized likelihood statistics (another is the Bayesian information criterion, or BIC). AIC is defined as:

AIC = −2Lm + 2m
where Lm is the maximized log‐likelihood and m is the number of parameters in the given model. Lower values of AIC indicate a better‐fitting model than do larger values. Recall that the more parameters fit to a model, in general, the better will be the fit of that model. For example, a model that has a unique parameter for each data point would fit perfectly. This is the so‐called saturated model. AIC jointly considers both the goodness of fit as well as the number of parameters required to obtain the given fit, essentially “penalizing” for increasing the number of parameters unless they contribute to model fit. Adding one or more parameters to a model may cause −2Lm to decrease (which is a good thing substantively), but if the parameters are not worthwhile, this will be offset by an increase in 2m.
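To make the trade-off concrete, the following Python sketch computes AIC for a Gaussian model fit by maximum likelihood; the data values and the choice of a Gaussian model are illustrative assumptions, not taken from the text:

```python
import math

def aic(log_likelihood: float, m: int) -> float:
    """Akaike's information criterion: AIC = -2*Lm + 2*m,
    where Lm is the maximized log-likelihood and m is the
    number of parameters fit in the model."""
    return -2.0 * log_likelihood + 2.0 * m

# Hypothetical sample; a Gaussian model has m = 2 parameters (mean, variance).
data = [2.1, 1.9, 2.4, 2.0, 2.2, 1.8]
n = len(data)
mu = sum(data) / n                            # MLE of the mean
var = sum((x - mu) ** 2 for x in data) / n    # MLE of the variance
# Maximized Gaussian log-likelihood: Lm = -(n/2) * (log(2*pi*var) + 1)
ll = -0.5 * n * (math.log(2 * math.pi * var) + 1)

print(aic(ll, m=2))
```

Adding a parameter lowers −2Lm only if it genuinely improves the fit; otherwise the fixed increase of 2 per parameter in the 2m term dominates, and AIC rises.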

The Bayesian information criterion, or BIC (Schwarz, 1978) is defined as −2Lm + m log(N), where m, as before, is the number of parameters in the model and N the total number of observations used to fit the model. Lower values of BIC are also desirable when comparing models. BIC typically penalizes model complexity more heavily than AIC. For a comparison of AIC and BIC, see Burnham and Anderson (2011).
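The heavier penalty of BIC can be seen directly from the definitions: the per-parameter penalty is log(N) for BIC versus a constant 2 for AIC, so BIC penalizes each parameter more once N > e² ≈ 7.39. A minimal sketch (the log-likelihood, m, and N below are hypothetical):

```python
import math

def aic(log_likelihood: float, m: int) -> float:
    """AIC = -2*Lm + 2*m."""
    return -2.0 * log_likelihood + 2.0 * m

def bic(log_likelihood: float, m: int, n: int) -> float:
    """BIC = -2*Lm + m*log(N), where N is the number of observations."""
    return -2.0 * log_likelihood + m * math.log(n)

# Hypothetical fitted model: maximized log-likelihood, 3 parameters, 100 observations.
ll, m, n = -48.7, 3, 100
print(aic(ll, m))     # ≈ 103.4
print(bic(ll, m, n))  # ≈ 111.2, larger because log(100) ≈ 4.61 > 2
```

As with AIC, the model with the lower BIC is preferred when comparing candidate models on the same data.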
