This commit is contained in:
tqchen 2019-04-29 17:50:46 -07:00
parent 5e582b0fa7
commit 91c513a0c1


@@ -72,8 +72,7 @@ Decision Tree Ensembles
***********************
Now that we have introduced the elements of supervised learning, let us get started with real trees.
To begin with, let us first learn about the model choice of XGBoost: **decision tree ensembles**.
The tree ensemble model consists of a set of classification and regression trees (CART). Here's a simple example of a CART that classifies whether someone will like a hypothetical computer game X.

.. image:: https://raw.githubusercontent.com/dmlc/web-data/master/xgboost/model/cart.png
   :width: 100%
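The ensemble idea can be sketched in a few lines of Python. This is a toy illustration, not XGBoost's implementation: the two hand-written trees and their leaf scores are made up to mirror the figure above, and the ensemble's prediction is simply the sum of the scores from each tree.

```python
def tree_1(person):
    # Hypothetical first tree: splits on age, then on gender.
    if person["age"] < 15:
        return 2.0 if person["is_male"] else 0.1
    return -1.0

def tree_2(person):
    # Hypothetical second tree: splits on daily computer use.
    return 0.9 if person["uses_computer_daily"] else -0.9

def ensemble_predict(person, trees=(tree_1, tree_2)):
    # The ensemble's score is the sum of every member tree's score.
    return sum(tree(person) for tree in trees)

boy = {"age": 10, "is_male": True, "uses_computer_daily": True}
grandpa = {"age": 70, "is_male": True, "uses_computer_daily": False}
print(ensemble_predict(boy))      # 2.0 + 0.9
print(ensemble_predict(grandpa))  # -1.0 + -0.9
```

Note that the two trees each use different features, so their scores complement each other when summed; this is the sense in which an ensemble is more expressive than any single tree.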
@@ -257,7 +256,7 @@ A left to right scan is sufficient to calculate the structure score of all possi
.. note:: Limitation of additive tree learning

   Since it is intractable to enumerate all possible tree structures, we add one split at a time. This approach works well most of the time, but there are some edge cases that fail due to this approach. For those edge cases, training results in a degenerate model because we consider only one feature dimension at a time. See `Can Gradient Boosting Learn Simple Arithmetic? <http://mariofilho.com/can-gradient-boosting-learn-simple-arithmetic/>`_ for an example.
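The left-to-right scan mentioned above can be sketched as follows. This is a minimal illustration, not XGBoost's implementation: the gradient/hessian values are made up, and the gain expression follows the tutorial's structure-score derivation, :math:`Gain = \frac{1}{2}\left[\frac{G_L^2}{H_L+\lambda} + \frac{G_R^2}{H_R+\lambda} - \frac{(G_L+G_R)^2}{H_L+H_R+\lambda}\right] - \gamma`.

```python
def best_split(grad, hess, lam=1.0, gamma=0.0):
    """Score every split point of one feature in a single pass.

    grad, hess: per-instance gradient statistics, already sorted
    by the feature's value.
    """
    G, H = sum(grad), sum(hess)
    score = lambda g, h: g * g / (h + lam)  # structure score of one leaf
    best_gain, best_pos = 0.0, None
    g_left = h_left = 0.0
    for i in range(len(grad) - 1):          # candidate split between i, i+1
        g_left += grad[i]                   # accumulate left statistics...
        h_left += hess[i]
        g_right, h_right = G - g_left, H - h_left  # ...right follows for free
        gain = 0.5 * (score(g_left, h_left) + score(g_right, h_right)
                      - score(G, H)) - gamma
        if gain > best_gain:
            best_gain, best_pos = gain, i + 1
    return best_gain, best_pos

# Toy statistics (hypothetical numbers):
grad = [-2.0, -1.5, 0.5, 1.0, 2.0]
hess = [1.0] * 5
print(best_split(grad, hess))
```

Because the left-side sums are accumulated incrementally and the right-side sums are obtained by subtraction from the totals, all candidate splits of a sorted feature are scored in one linear pass.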
**********************
Final words on XGBoost