Support slicing tree model (#6302)
This PR is meant to end the confusion around best_ntree_limit and unify model slicing. With multi-class models and random forests, asking users to understand how to set ntree_limit is difficult and error prone.

* Implement the save_best option in early stopping.

Co-authored-by: Philip Hyunsu Cho <chohyu01@cs.washington.edu>
@@ -7,9 +7,9 @@ package. In XGBoost 1.3, a new callback interface is designed for Python packag
 provides the flexiblity of designing various extension for training. Also, XGBoost has a
 number of pre-defined callbacks for supporting early stopping, checkpoints etc.
 
+#######################
 Using builtin callbacks
+#######################
------------------------
 
 By default, training methods in XGBoost have parameters like ``early_stopping_rounds`` and
 ``verbose``/``verbose_eval``, when specified the training procedure will define the
@@ -50,9 +50,9 @@ this callback function directly into XGBoost:
 
     dump = booster.get_dump(dump_format='json')
     assert len(early_stop.stopping_history['Valid']['CustomErr']) == len(dump)
 
+##########################
 Defining your own callback
+##########################
---------------------------
 
 XGBoost provides an callback interface class: ``xgboost.callback.TrainingCallback``, user
 defined callbacks should inherit this class and override corresponding methods. There's a