Delay breaking changes to 1.6. (#7420)

The patch is too big to be backported.
Jiaming Yuan 2021-11-12 16:46:03 +08:00 committed by GitHub
parent cb685607b2
commit 97d7582457
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
3 changed files with 14 additions and 13 deletions


@@ -21,6 +21,7 @@ concepts should be readily applicable to other language bindings.
.. note::
* The ranking task does not support customized functions.
* A breaking change was made in XGBoost 1.6.
In the following two sections, we will provide a step-by-step walkthrough of implementing
the ``Squared Log Error (SLE)`` objective function:
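As a minimal standalone sketch of the SLE objective named above: a custom objective returns the gradient and hessian of the loss with respect to the predictions. The version below takes raw numpy arrays for clarity (in XGBoost proper the second argument is a ``DMatrix``), and the function name is illustrative:

```python
import numpy as np

def squared_log_error_objective(predt: np.ndarray, labels: np.ndarray):
    """Gradient and hessian of the Squared Log Error loss
    1/2 * [log((predt + 1) / (labels + 1))]**2 with respect to predt."""
    predt = np.maximum(predt, -1 + 1e-6)  # keep the log argument positive
    ratio = np.log1p(predt) - np.log1p(labels)
    grad = ratio / (predt + 1)
    hess = (1 - ratio) / (predt + 1) ** 2
    return grad, hess
```

When the predictions equal the labels the gradient is zero and the hessian reduces to ``1 / (predt + 1)**2``, which is a quick sanity check for the derivation.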
@@ -270,7 +271,7 @@ Scikit-Learn Interface
The scikit-learn interface of XGBoost has some utilities to improve the integration with
standard scikit-learn functions. For instance, after XGBoost 1.5.1 users can use the cost
standard scikit-learn functions. For instance, after XGBoost 1.6.0 users can use the cost
function (not scoring functions) from scikit-learn out of the box:
.. code-block:: python


@@ -199,7 +199,7 @@ __model_doc = f'''
eval_metric : Optional[Union[str, List[str], Callable]]
.. versionadded:: 1.5.1
.. versionadded:: 1.6.0
Metric used for monitoring the training result and early stopping. It can be a
string or list of strings as names of predefined metrics in XGBoost (See
@@ -239,7 +239,7 @@ __model_doc = f'''
early_stopping_rounds : Optional[int]
.. versionadded:: 1.5.1
.. versionadded:: 1.6.0
Activates early stopping. Validation metric needs to improve at least once in
every **early_stopping_rounds** round(s) to continue training. Requires at least
@@ -855,11 +855,11 @@ class XGBModel(XGBModelBase):
Validation metrics will help us track the performance of the model.
eval_metric : str, list of str, or callable, optional
.. deprecated:: 1.5.1
.. deprecated:: 1.6.0
Use `eval_metric` in :py:meth:`__init__` or :py:meth:`set_params` instead.
early_stopping_rounds : int
.. deprecated:: 1.5.1
.. deprecated:: 1.6.0
Use `early_stopping_rounds` in :py:meth:`__init__` or
:py:meth:`set_params` instead.
verbose :
@@ -881,7 +881,7 @@ class XGBModel(XGBModelBase):
`exact` tree methods.
callbacks :
.. deprecated: 1.5.1
.. deprecated:: 1.6.0
Use `callbacks` in :py:meth:`__init__` or :py:meth:`set_params` instead.
"""
evals_result: TrainingCallback.EvalsLog = {}
@@ -1693,11 +1693,11 @@ class XGBRanker(XGBModel, XGBRankerMixIn):
pair in **eval_set**.
eval_metric : str, list of str, optional
.. deprecated:: 1.5.1
.. deprecated:: 1.6.0
Use `eval_metric` in :py:meth:`__init__` or :py:meth:`set_params` instead.
early_stopping_rounds : int
.. deprecated:: 1.5.1
.. deprecated:: 1.6.0
Use `early_stopping_rounds` in :py:meth:`__init__` or
:py:meth:`set_params` instead.
@@ -1727,7 +1727,7 @@ class XGBRanker(XGBModel, XGBRankerMixIn):
`exact` tree methods.
callbacks :
.. deprecated: 1.5.1
.. deprecated:: 1.6.0
Use `callbacks` in :py:meth:`__init__` or :py:meth:`set_params` instead.
"""
# check if group information is provided


@@ -80,7 +80,7 @@ def train(
<https://xgboost.readthedocs.io/en/latest/tutorials/custom_metric_obj.html>`_ for
details.
feval :
.. deprecated:: 1.5.1
.. deprecated:: 1.6.0
Use `custom_metric` instead.
maximize : bool
Whether to maximize feval.
@@ -132,7 +132,7 @@ def train(
custom_metric:
.. versionadded 1.5.1
.. versionadded:: 1.6.0
Custom metric function. See `Custom Metric
<https://xgboost.readthedocs.io/en/latest/tutorials/custom_metric_obj.html>`_ for
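As a sketch of the signature involved: a ``custom_metric`` is a callable taking the predictions and the training ``DMatrix`` and returning a ``(name, value)`` pair. The ``rmsle`` metric below is illustrative; ``dtrain`` only needs to provide ``get_label()`` for this example:

```python
from typing import Tuple

import numpy as np

def rmsle(predt: np.ndarray, dtrain) -> Tuple[str, float]:
    """Root mean squared log error between predictions and labels."""
    y = dtrain.get_label()
    predt = np.maximum(predt, 0.0)  # guard the log against negative predictions
    value = np.sqrt(np.mean(np.square(np.log1p(predt) - np.log1p(y))))
    return "rmsle", float(value)
```

It would then be passed as, e.g., ``xgb.train(params, dtrain, custom_metric=rmsle, evals=[(dtrain, "train")])``.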
@@ -392,7 +392,7 @@ def cv(params, dtrain, num_boost_round=10, nfold=3, stratified=False, folds=None
details.
feval : function
.. deprecated:: 1.5.1
.. deprecated:: 1.6.0
Use `custom_metric` instead.
maximize : bool
Whether to maximize feval.
@@ -432,7 +432,7 @@ def cv(params, dtrain, num_boost_round=10, nfold=3, stratified=False, folds=None
Shuffle data before creating folds.
custom_metric :
.. versionadded 1.5.1
.. versionadded:: 1.6.0
Custom metric function. See `Custom Metric
<https://xgboost.readthedocs.io/en/latest/tutorials/custom_metric_obj.html>`_ for