XGBoost Linear node Build Options

Use the Build Options tab to specify build options for the XGBoost Linear node, including basic options such as linear boost parameters and model building, and learning task options for objectives. For additional information about these options, see the online resources cited at the end of this section.

Basic

Hyper-Parameter Optimization (Based on Rbfopt). Select this option to enable Hyper-Parameter Optimization based on Rbfopt, which automatically discovers the optimal combination of parameters so that the model achieves the expected error rate, or a lower one, on the samples. For details about Rbfopt, see http://rbfopt.readthedocs.io/en/latest/rbfopt_settings.html.
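
SPSS Modeler performs this search internally, and its exact procedure is not documented here. As a rough illustration of the idea only, the following hypothetical sketch uses the open-source rbfopt package to tune Alpha and Lambda for a linear booster, scoring each candidate with XGBoost cross-validation. The data, bounds, and evaluation budget are all placeholder assumptions.

    import numpy as np
    import rbfopt
    import xgboost as xgb

    # Placeholder regression data standing in for the upstream stream.
    rng = np.random.RandomState(0)
    X = rng.rand(200, 4)
    y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + rng.normal(scale=0.1, size=200)
    dtrain = xgb.DMatrix(X, label=y)

    def cv_error(params_vec):
        """Cross-validated RMSE for a candidate (alpha, lambda) pair."""
        params = {
            'booster': 'gblinear',
            'objective': 'reg:linear',
            'alpha': params_vec[0],
            'lambda': params_vec[1],
        }
        cv = xgb.cv(params, dtrain, num_boost_round=10, nfold=3, metrics='rmse')
        return float(cv['test-rmse-mean'].iloc[-1])

    # Search alpha and lambda in [0, 10]; 30 evaluations is an arbitrary budget.
    # rbfopt also requires a MINLP solver such as Bonmin on the system PATH.
    black_box = rbfopt.RbfoptUserBlackBox(
        2, np.array([0.0, 0.0]), np.array([10.0, 10.0]),
        np.array(['R', 'R']), cv_error)
    settings = rbfopt.RbfoptSettings(max_evaluations=30)
    best_val, best_x, *_ = rbfopt.RbfoptAlgorithm(settings, black_box).optimize()
    print('best RMSE %.4f at alpha=%.3f, lambda=%.3f'
          % (best_val, best_x[0], best_x[1]))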

Alpha. L1 regularization term on weights. Increasing this value will make the model more conservative.

Lambda. L2 regularization term on weights. Increasing this value will make the model more conservative.

Lambda bias. L2 regularization term on bias. (There is no L1 regularization term on bias because it is not important.)

Number boost round. The number of boosting iterations.
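
These options correspond directly to parameters of the Python XGBoost library (see the table later in this section). As a minimal sketch, assuming placeholder data, a linear booster could be trained directly with these parameters as follows; note that reg:linear is the objective name used by XGBoost releases of this era (newer releases rename it reg:squarederror), and lambda_bias applies only to the gblinear booster.

    import numpy as np
    import xgboost as xgb

    # Placeholder regression data standing in for the upstream stream.
    rng = np.random.RandomState(0)
    X = rng.rand(100, 4)
    y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + rng.normal(scale=0.1, size=100)
    dtrain = xgb.DMatrix(X, label=y)

    params = {
        'booster': 'gblinear',   # the XGBoost Linear node uses the linear booster
        'alpha': 0.5,            # Alpha: L1 regularization term on weights
        'lambda': 1.0,           # Lambda: L2 regularization term on weights
        'lambda_bias': 0.0,      # Lambda bias: L2 regularization term on bias
        'objective': 'reg:linear',
    }
    # Number boost round: the number of boosting iterations.
    model = xgb.train(params, dtrain, num_boost_round=10)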

Learning Task

Objective. Select from the following learning task objective types: reg:linear, reg:logistic, reg:gamma, reg:tweedie, count:poisson, rank:pairwise, binary:logistic, or multi.

Random Seed. You can click Generate to generate the seed used by the random number generator.
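
As a minimal sketch, assuming placeholder data, the objective and seed map onto the library like this; binary:logistic is used here as one of the listed objective types.

    import numpy as np
    import xgboost as xgb

    # Placeholder binary-classification data.
    rng = np.random.RandomState(42)
    X = rng.rand(200, 3)
    y = (X[:, 0] + X[:, 1] > 1.0).astype(int)
    dtrain = xgb.DMatrix(X, label=y)

    params = {
        'booster': 'gblinear',
        'objective': 'binary:logistic',  # one of the listed objective types
        'seed': 42,                      # Random Seed: fixes the run for reproducibility
    }
    model = xgb.train(params, dtrain, num_boost_round=10)
    preds = model.predict(dtrain)  # binary:logistic yields probabilities in [0, 1]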

The following table shows the relationship between the settings in the SPSS® Modeler XGBoost Linear node dialog and the Python XGBoost library parameters.
Table 1. Node properties mapped to Python library parameters

SPSS Modeler setting   Script name (property name)   XGBoost parameter
--------------------   ---------------------------   -----------------
Target                 TargetField                   (none)
Predictors             InputFields                   (none)
Lambda                 lambda                        lambda
Alpha                  alpha                         alpha
Lambda bias            lambdaBias                    lambda_bias
Num boost round        numBoostRound                 num_boost_round
Objective              objectiveType                 objective
Random Seed            random_seed                   seed
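
The script names in the table are the properties you would set from a Modeler Python script. The following is a hypothetical sketch: the node type identifier "xgboostlinear" and the field names are assumptions, so check the Python scripting and automation guide for your release.

    # Run inside SPSS Modeler's Python scripting environment.
    stream = modeler.script.stream()
    # Node type identifier is an assumption; verify against the scripting guide.
    node = stream.createAt("xgboostlinear", "XGBoost Linear", 100, 100)
    node.setPropertyValue("TargetField", "Drug")                  # hypothetical field
    node.setPropertyValue("InputFields", ["Age", "BP", "Cholesterol"])
    node.setPropertyValue("alpha", 0.5)                           # Alpha
    node.setPropertyValue("lambda", 1.0)                          # Lambda
    node.setPropertyValue("lambdaBias", 0.0)                      # Lambda bias
    node.setPropertyValue("numBoostRound", 10)                    # Num boost round
    node.setPropertyValue("objectiveType", "reg:linear")          # Objective
    node.setPropertyValue("random_seed", 42)                      # Random Seed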

1 "XGBoost Parameters" Scalable and Flexible Gradient Boosting. Web. © 2015-2016 DMLC.

2 "Plotting API" Scalable and Flexible Gradient Boosting. Web. © 2015-2016 DMLC.

3 "Scalable and Flexible Gradient Boosting." Web. © 2015-2016 DMLC.