Linear Mixed Models Estimation

Method
Select maximum likelihood or restricted maximum likelihood (REML) estimation.
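As a rough illustration outside SPSS: for an ordinary linear model (the simplest case of a mixed model, with no random effects), ML and REML differ only in the denominator of the residual-variance estimate. This plain NumPy sketch uses made-up data and is not SPSS's implementation; it only shows why REML is preferred for variance components in small samples.

```python
import numpy as np

# Hypothetical data for illustration only (not from SPSS).
rng = np.random.default_rng(0)
n, p = 30, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta = np.array([1.0, 2.0, -0.5])
y = X @ beta + rng.normal(scale=1.5, size=n)

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
rss = np.sum((y - X @ beta_hat) ** 2)

# ML divides the residual sum of squares by n, which biases the
# variance estimate downward; REML divides by n - p, accounting for
# the p fixed-effect coefficients that were estimated.
sigma2_ml = rss / n
sigma2_reml = rss / (n - p)
print(sigma2_ml, sigma2_reml)
```

In a full mixed model the same idea applies to all covariance parameters: REML maximizes the likelihood of residual contrasts, removing the bias that ML incurs from estimating the fixed effects.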
Degrees of freedom
Provides options for defining the degrees of freedom for all tests.
Residual method
The residual method uses a fixed degrees of freedom for all tests. It is appropriate when the sample size is sufficiently large, the data are balanced, or the model uses a simpler covariance type (for example, scaled identity or diagonal).
Satterthwaite approximation
The Satterthwaite method allows the degrees of freedom to vary across tests. It is appropriate when the sample size is small, the data are unbalanced, or the model uses a complicated covariance type (for example, unstructured).
Kenward-Roger approximation
The Kenward-Roger method offers a more precise small-sample estimator for the variance-covariance matrix of the fixed-effects parameters and for the approximate denominator degrees of freedom in t-tests and F-tests. The method introduces a scale factor for the F-statistic and estimates both the scale factor and the denominator degrees of freedom by using a Taylor series expansion of the estimated random structure of the data.
Note: The Kenward-Roger method is applied to the model-based covariance (instead of the robust covariance). When both the Kenward-Roger method and robust covariance are selected, the Kenward-Roger method is applied to the model-based covariance, and the following warning is displayed: “Since Kenward-Roger method is selected, the robust covariance method is changed to model-based covariance method”.
Iterations
The following options are available:
Maximum iterations
Specify a non-negative integer.
Maximum step-halvings
At each iteration, the step size is reduced by a factor of 0.5 until the log-likelihood increases or the maximum number of step-halvings is reached. Specify a positive integer.
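Step-halving can be sketched in a few lines for a one-parameter objective. This is a hypothetical illustration of the general technique, not SPSS's optimizer: the full Newton step is tried first, and the step is repeatedly multiplied by 0.5 until the objective increases or the halving budget is exhausted.

```python
import math

def newton_step_with_halving(f, grad, hess, x, max_halvings=10):
    """One Newton-Raphson iteration guarded by step-halving."""
    step = -grad(x) / hess(x)      # full Newton step
    f0 = f(x)
    for _ in range(max_halvings + 1):
        if f(x + step) > f0:       # objective increased: accept the step
            return x + step
        step *= 0.5                # halve the step and retry
    return x                       # no improving step found

# A concave objective (maximum at 0) on which the raw Newton step
# overshoots badly when started far from the optimum.
f = lambda x: -math.sqrt(1 + x * x)
grad = lambda x: -x / math.sqrt(1 + x * x)
hess = lambda x: -1.0 / (1 + x * x) ** 1.5

x0 = 2.0
x_new = newton_step_with_halving(f, grad, hess, x0)
print(x_new)  # the full step (to -8.0) is rejected; two halvings land at -0.5
```

Here the unguarded Newton step would jump from 2.0 to -8.0, decreasing the objective; two halvings shrink it to an accepted move.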
Print iteration history for every n step(s)
Displays a table containing the log-likelihood function value and parameter estimates at every nth iteration, beginning with the 0th iteration (the initial estimates). If you choose to print the iteration history, the last iteration is always printed regardless of the value of n.
Log-Likelihood Convergence
Convergence is assumed if the absolute change or relative change in the log-likelihood function is less than the value specified, which must be non-negative. The criterion is not used if the value specified equals 0.
Parameter Convergence
Convergence is assumed if the maximum absolute change or maximum relative change in the parameter estimates is less than the value specified, which must be non-negative. The criterion is not used if the value specified equals 0.
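The log-likelihood and parameter criteria share one pattern: compare an absolute or relative change against the specified value, and treat a value of 0 as "criterion not used". A minimal sketch of that logic (illustrative only, not SPSS's implementation):

```python
def converged(old, new, value, absolute=True):
    """Absolute- or relative-change convergence check.

    `old`/`new` may be scalars (a log-likelihood) or lists (parameter
    estimates); for lists the maximum change across parameters is used.
    A criterion value of 0 disables the check, as in the dialog.
    """
    if value == 0:
        return False                      # criterion not used
    olds = old if isinstance(old, (list, tuple)) else [old]
    news = new if isinstance(new, (list, tuple)) else [new]
    changes = []
    for o, n in zip(olds, news):
        change = abs(n - o)
        if not absolute:
            change /= max(abs(o), 1e-12)  # guard against division by zero
        changes.append(change)
    return max(changes) < value

print(converged(-105.2, -105.2000004, 1e-6))          # absolute log-likelihood change
print(converged([1.0, 2.0], [1.0000001, 2.0], 1e-6))  # max parameter change
```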
Hessian Convergence
For the Absolute specification, convergence is assumed if a statistic based on the Hessian is less than the value specified. For the Relative specification, convergence is assumed if the statistic is less than the product of the value specified and the absolute value of the log-likelihood. The criterion is not used if the value specified equals 0.
Maximum scoring steps
Requests that the Fisher scoring algorithm be used up to iteration number n. Specify a non-negative integer.
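Fisher scoring replaces the observed Hessian in the Newton update with the expected information, which is often simpler and better behaved far from the optimum. A self-contained illustration (not SPSS internals) is the Cauchy location parameter, whose expected information for n observations is exactly n/2:

```python
def fisher_scoring_cauchy_location(xs, theta=0.0, n_steps=25):
    """Fisher scoring for the location parameter of a Cauchy sample.

    The score is the derivative of the Cauchy log-likelihood in theta;
    the expected information n/2 replaces the observed Hessian, which
    can be indefinite far from the optimum.
    """
    n = len(xs)
    for _ in range(n_steps):
        score = sum(2 * (x - theta) / (1 + (x - theta) ** 2) for x in xs)
        theta += score / (n / 2)   # expected information is n / 2
    return theta

# Symmetric toy data, so the maximum likelihood estimate is 0.
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
theta_hat = fisher_scoring_cauchy_location(xs, theta=1.0)
print(theta_hat)  # approaches 0
```

In the mixed-model setting, starting with a few scoring steps before switching to Newton-Raphson can stabilize the early iterations in the same way.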
Singularity tolerance
This value is used as the tolerance in checking for singularity. Specify a positive value.
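A common form of such a check, shown here as a hypothetical sketch rather than SPSS's actual rule, compares the smallest singular value of a matrix against the tolerance times the largest:

```python
import numpy as np

def is_singular(A, tol=1e-10):
    """Flag a matrix as computationally singular.

    Uses a relative check: the smallest singular value must exceed
    tol times the largest, otherwise inversion is considered unsafe.
    """
    s = np.linalg.svd(A, compute_uv=False)
    return s[-1] < tol * s[0]

print(is_singular(np.array([[1.0, 2.0], [2.0, 4.0]])))  # rank-deficient
print(is_singular(np.array([[2.0, 0.0], [0.0, 1.0]])))  # well-conditioned
```

A larger tolerance flags near-singular covariance structures sooner; a smaller one tolerates more ill-conditioning before the algorithm stops.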

Specifying Estimation Criteria for Linear Mixed Models

This feature requires SPSS® Statistics Standard Edition or the Advanced Statistics Option.

  1. From the menus choose:

    Analyze > Mixed Models > Linear...

  2. Optionally, select subjects and repeated variables, and then click Continue.
  3. In the Linear Mixed Models dialog box, click Estimation.
  4. Select the estimation criteria that you want.