To build a Gradient Boosted Trees Regression model using methods of the Model Builder class of Gradient Boosted Trees Regression, complete the steps described below.
Each tree consists of internal nodes (called non-leaf or split nodes) and external nodes (leaf nodes). Each split node denotes a feature test that is a Boolean expression, for example, f < featureValue or f = featureValue, where f is a feature and featureValue is a constant. The test type depends on the feature type: continuous, categorical, or ordinal. For more information on the test types, see Algorithms > Training and Prediction > Classification and Regression > Decision Tree > Details.
The induced decision tree is a binary tree, meaning that each non-leaf node has exactly two branches: true and false. Each split node contains featureIndex, the index of the feature used for the feature test in this node, and featureValue, the constant for the Boolean expression in the test. Each leaf node contains the prediction for this leaf (for regression trees, a response value; for classification trees, a classLabel, the predicted class). For more information on decision trees, see Algorithms > Training and Prediction > Classification and Regression > Decision Tree.
Add nodes to the created tree in accordance with the pre-calculated structure of the tree. Check that leaf nodes have no child nodes and that each split node has exactly two children, as illustrated in the sketch below.
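The following is a minimal C++ sketch of this workflow. It assumes the gbt::regression::ModelBuilder interface with a constructor taking the number of features and trees and the createTree, addSplitNode, addLeafNode, and getModel methods; the argument order and type names shown here are approximations, so consult the full examples shipped with the library for the authoritative signatures.

#include "daal.h"

using namespace daal::algorithms::gbt::regression;

int main()
{
    /* Assumed interface: the builder is constructed with the number of
       features and the number of trees in the ensemble. */
    const size_t nFeatures = 2;
    const size_t nTrees    = 1;
    ModelBuilder builder(nFeatures, nTrees);

    /* Create a tree with a pre-calculated structure of 3 nodes:
       one split node and its two leaf children. */
    ModelBuilder::TreeId tree = builder.createTree(3);

    /* Root split node encoding the feature test f0 < 0.5:
       featureIndex = 0, featureValue = 0.5.
       noParent marks this node as the root of the tree. */
    ModelBuilder::NodeId root = builder.addSplitNode(tree, ModelBuilder::noParent, 0, 0, 0.5);

    /* Leaf children of the split node carry the predicted response values.
       The third argument (position) selects which of the two branches
       the child occupies. */
    builder.addLeafNode(tree, root, 0, 1.0);
    builder.addLeafNode(tree, root, 1, 2.5);

    /* Retrieve the model once all trees have been populated. */
    ModelPtr model = builder.getModel();

    return 0;
}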
Complete examples of this workflow are provided with the library in C++, Java*, and Python*.
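Once built, the model can be passed to the library's batch prediction algorithm like any trained Gradient Boosted Trees Regression model. The following is a minimal usage sketch, assuming the gbt::regression::prediction batch interface and its data, model, and prediction identifiers; these follow the library's usual batch-prediction pattern but may differ slightly between versions.

#include "daal.h"

using namespace daal::algorithms;
using namespace daal::data_management;

/* 'model' is the gbt::regression::ModelPtr returned by ModelBuilder::getModel();
   'testData' is a numeric table with one row per observation and one column
   per feature. */
void predictWithBuiltModel(const gbt::regression::ModelPtr & model,
                           const NumericTablePtr & testData)
{
    gbt::regression::prediction::Batch<> algorithm;
    algorithm.input.set(gbt::regression::prediction::data, testData);
    algorithm.input.set(gbt::regression::prediction::model, model);
    algorithm.compute();

    /* One predicted response per input row. */
    NumericTablePtr predictions =
        algorithm.getResult()->get(gbt::regression::prediction::prediction);
}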