Options (Multilayer Perceptron)
User-Missing Values. Factors must have valid values for a case to be included in the analysis. These controls allow you to decide whether user-missing values are treated as valid among factors and categorical dependent variables.
Stopping Rules. These are the rules that determine when to stop training the neural network. Training proceeds through at least one data pass. Training can then be stopped according to the following criteria, which are checked in the listed order. In the stopping rule definitions that follow, a step corresponds to a data pass for the online and mini-batch methods and an iteration for the batch method.
- Maximum steps without a decrease in error. The number of steps to allow before checking for a decrease in error. If there is no decrease in error after the specified number of steps, then training stops. Specify an integer greater than 0. You can also specify which data sample is used to compute the error. Choose automatically uses the testing sample if it exists and uses the training sample otherwise. Note that batch training guarantees a decrease in the training sample error after each data pass; thus, this option applies to batch training only if a testing sample exists. Both training and test data checks the error for each of these samples; this option applies only if a testing sample exists.
Note: After each complete data pass, online and mini-batch training require an extra data pass in order to compute the training error. This extra data pass can slow training considerably, so it is generally recommended that you supply a testing sample and select Choose automatically in any case.
- Maximum training time. Choose whether to specify a maximum number of minutes for the algorithm to run. Specify a number greater than 0.
- Maximum training epochs. The maximum number of epochs (data passes) allowed. If the maximum number of epochs is exceeded, then training stops. Specify an integer greater than 0.
- Minimum relative change in training error. Training stops if the relative change in the training error compared to the previous step is less than the criterion value. Specify a number greater than 0. For online and mini-batch training, this criterion is ignored if only testing data is used to compute the error.
- Minimum relative change in training error ratio. Training stops if the ratio of the training error to the error of the null model is less than the criterion value. The null model predicts the average value for all dependent variables. Specify a number greater than 0. For online and mini-batch training, this criterion is ignored if only testing data is used to compute the error.
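The stopping rules above can be sketched as a single check applied after each step, in the order listed. This is an illustrative sketch only, not the product's implementation; the function name, parameter names, and default values are all assumptions, and `errors` is taken to be the sequence of error values observed so far (one per step).

```python
import time

def should_stop(errors, *, patience=1, start_time=None, max_minutes=15.0,
                max_epochs=100, min_rel_change=1e-4, min_error_ratio=1e-3,
                null_error=None):
    """Apply the stopping rules in the listed order; return True to stop.

    A "step" is a data pass for online/mini-batch training or an
    iteration for batch training. All names and defaults here are
    hypothetical, chosen only to illustrate the rules.
    """
    step = len(errors)
    if step == 0:
        return False
    # 1. Maximum steps without a decrease in error ("patience").
    if step > patience and min(errors[-patience:]) >= min(errors[:-patience]):
        return True
    # 2. Maximum training time, in minutes (if a start time was recorded).
    if start_time is not None and (time.time() - start_time) / 60.0 > max_minutes:
        return True
    # 3. Maximum training epochs (data passes).
    if step >= max_epochs:
        return True
    # 4. Minimum relative change in training error vs. the previous step.
    if step > 1 and errors[-2] > 0:
        if abs(errors[-2] - errors[-1]) / errors[-2] < min_rel_change:
            return True
    # 5. Minimum training-error ratio vs. the null model's error.
    if null_error is not None and null_error > 0:
        if errors[-1] / null_error < min_error_ratio:
            return True
    return False
```

For example, `should_stop([1.0, 1.0])` stops (no decrease over one step), while `should_stop([1.0, 0.9])` continues.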
Maximum cases to store in memory. This controls the following settings within the multilayer perceptron algorithms. Specify an integer greater than 1.
- In automatic architecture selection, the size of the sample used to determine the network architecture is min(1000,memsize), where memsize is the maximum number of cases to store in memory.
- In mini-batch training with automatic computation of the number of mini-batches, the number of mini-batches is min(max(M/10,2),memsize), where M is the number of cases in the training sample.
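The two formulas above can be written out directly. This is a sketch under the assumptions that memsize and M are as defined in the text and that M/10 uses integer division (the rounding behavior is not specified in the text).

```python
def arch_sample_size(memsize):
    """Sample size used for automatic architecture selection."""
    return min(1000, memsize)

def num_mini_batches(M, memsize):
    """Automatic number of mini-batches for M training cases.

    Integer division for M/10 is an assumption made for illustration.
    """
    return min(max(M // 10, 2), memsize)
```

For instance, with memsize = 5000 the architecture-selection sample holds 1000 cases, and a training sample of M = 100 cases with ample memory yields 10 mini-batches.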
How To Set Options for Multilayer Perceptron
This feature requires the Neural Networks option.
- From the menus choose:
- In the Multilayer Perceptron dialog box, click the Options tab.