Guard against index error in prediction. (#6982)

* Remove `best_ntree_limit` from documents.
Jiaming Yuan, committed via GitHub, 2021-05-25 23:24:59 +08:00
parent c6d87e5e18
commit 86e60e3ba8
5 changed files with 24 additions and 8 deletions


@@ -165,9 +165,7 @@ def early_stop(stopping_rounds, maximize=False, verbose=True):
 If there's more than one, will use the last.
 Returns the model from the last iteration (not the best one).
 If early stopping occurs, the model will have three additional fields:
-``bst.best_score``, ``bst.best_iteration`` and ``bst.best_ntree_limit``.
-(Use ``bst.best_ntree_limit`` to get the correct value if ``num_parallel_tree``
-and/or ``num_class`` appears in the parameters)
+``bst.best_score``, ``bst.best_iteration``.
 Parameters
 ----------


@@ -687,7 +687,7 @@ class XGBModel(XGBModelBase):
 used for early stopping.
 If early stopping occurs, the model will have three additional fields:
-``clf.best_score``, ``clf.best_iteration`` and ``clf.best_ntree_limit``.
+``clf.best_score``, ``clf.best_iteration``.
 verbose :
 If `verbose` and an evaluation set is used, writes the evaluation metric
 measured on the validation set to stderr.


@@ -144,10 +144,7 @@ def train(params, dtrain, num_boost_round=10, evals=(), obj=None, feval=None,
 If there's more than one metric in the **eval_metric** parameter given in
 **params**, the last metric will be used for early stopping.
 If early stopping occurs, the model will have three additional fields:
-``bst.best_score``, ``bst.best_iteration`` and ``bst.best_ntree_limit``. Use
-``bst.best_ntree_limit`` to get the correct value if ``num_parallel_tree`` and/or
-``num_class`` appears in the parameters. ``best_ntree_limit`` is the result of
-``num_parallel_tree * best_iteration``.
+``bst.best_score``, ``bst.best_iteration``.
 evals_result: dict
 This dictionary stores the evaluation results of all the items in watchlist.