Information Criteria

This table contains measures for selecting and comparing mixed models.
The -2 Restricted Log Likelihood is the most basic measure for model selection.
The other four measures are modifications of the -2 Restricted Log Likelihood that penalize more complex models; their standard forms are sketched after the list.
- Akaike's Information Criterion (AIC) adjusts the -2 Restricted Log Likelihood by twice the number of parameters in the model.
- Hurvich and Tsai's Criterion (AICC) corrects the AIC for small sample sizes. As the sample size increases, the AICC converges to the AIC.
- Schwarz's Bayesian Criterion (BIC) has a stronger penalty than the AIC for overparametrized models, and adjusts the -2 Restricted Log Likelihood by the number of parameters times the log of the number of cases. It is also known as the Bayesian Information Criterion.
- Bozdogan's Criterion (CAIC) has a stronger penalty than the AIC for overparametrized models, and adjusts the -2 Restricted Log Likelihood by the number of parameters times one plus the log of the number of cases. As the sample size increases, the CAIC converges to the BIC.
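
These penalties are commonly written as below, where $-2\ell_R$ denotes the -2 Restricted Log Likelihood, $d$ the number of parameters, and $n$ the number of cases (this notation is chosen here for illustration; exact conventions for $d$ and $n$ under restricted likelihood vary by software):

```latex
\[
\begin{aligned}
\mathrm{AIC}  &= -2\ell_R + 2d \\
\mathrm{AICC} &= -2\ell_R + \frac{2dn}{n - d - 1} \\
\mathrm{BIC}  &= -2\ell_R + d\,\ln n \\
\mathrm{CAIC} &= -2\ell_R + d\,(\ln n + 1)
\end{aligned}
\]
```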

Smaller values indicate better models, so these measures show that the model with repeated effects fits the data considerably better than the model without them. Thus, the added complexity of modeling the covariance structure has paid off.
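
As a minimal sketch of how such a comparison could be carried out from each model's -2 Restricted Log Likelihood, the function and the input values below are hypothetical, not taken from any particular package or data set:

```python
import math

def information_criteria(neg2ll: float, d: int, n: int) -> dict:
    """Compute the four penalized criteria from a -2 Restricted Log
    Likelihood (neg2ll), d model parameters, and n cases."""
    aic = neg2ll + 2 * d                      # twice the number of parameters
    aicc = neg2ll + 2 * d * n / (n - d - 1)   # small-sample correction to the AIC
    bic = neg2ll + d * math.log(n)            # parameters times log of cases
    caic = neg2ll + d * (math.log(n) + 1)     # parameters times (1 + log of cases)
    return {"AIC": aic, "AICC": aicc, "BIC": bic, "CAIC": caic}

# Hypothetical values for two candidate models; smaller values of every
# criterion indicate a better model.
simple = information_criteria(neg2ll=880.0, d=2, n=100)    # no repeated effects
repeated = information_criteria(neg2ll=820.0, d=6, n=100)  # models the covariance structure

for name in simple:
    better = "repeated" if repeated[name] < simple[name] else "simple"
    print(f"{name}: simple={simple[name]:.1f}, "
          f"repeated={repeated[name]:.1f} -> prefer {better}")
```

Note that the repeated-effects model pays a larger penalty (more parameters), so it is preferred only when its improvement in the -2 Restricted Log Likelihood outweighs that penalty.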