[doc] Use cross references in sphinx doc. (#7522)

* Use cross references instead of URL.
* Fix auto doc for callback.
Jiaming Yuan
2022-01-05 03:21:25 +08:00
committed by GitHub
parent eb1efb54b5
commit 54582f641a
11 changed files with 93 additions and 73 deletions

@@ -16,10 +16,13 @@ Before running XGBoost, we must set three types of parameters: general parameter
:backlinks: none
:local:
+.. _global_config:
********************
Global Configuration
********************
-The following parameters can be set in the global scope, using ``xgb.config_context()`` (Python) or ``xgb.set.config()`` (R).
+The following parameters can be set in the global scope, using :py:func:`xgboost.config_context` (Python) or ``xgb.set.config()`` (R).
* ``verbosity``: Verbosity of printing messages. Valid values are 0 (silent), 1 (warning), 2 (info), and 3 (debug).
* ``use_rmm``: Whether to use RAPIDS Memory Manager (RMM) to allocate GPU memory. This option is only applicable when XGBoost is built (compiled) with the RMM plugin enabled. Valid values are ``true`` and ``false``.

@@ -2,10 +2,11 @@
Callback Functions
##################
-This document gives a basic walkthrough of callback function used in XGBoost Python
-package. In XGBoost 1.3, a new callback interface is designed for Python package, which
-provides the flexibility of designing various extension for training. Also, XGBoost has a
-number of pre-defined callbacks for supporting early stopping, checkpoints etc.
+This document gives a basic walkthrough of the :ref:`callback API <callback_api>` used in
+the XGBoost Python package. In XGBoost 1.3, a new callback interface was designed for the
+Python package, which provides the flexibility of designing various extensions for
+training. XGBoost also has a number of pre-defined callbacks supporting early stopping,
+checkpointing, etc.
Using builtin callbacks
@@ -14,8 +15,8 @@ Using builtin callbacks
By default, training methods in XGBoost have parameters like ``early_stopping_rounds`` and
``verbose``/``verbose_eval``; when specified, the training procedure will define the
corresponding callbacks internally. For example, when ``early_stopping_rounds`` is
-specified, ``EarlyStopping`` callback is invoked inside iteration loop. You can also pass
-this callback function directly into XGBoost:
+specified, the :py:class:`EarlyStopping <xgboost.callback.EarlyStopping>` callback is
+invoked inside the iteration loop. You can also pass this callback directly to XGBoost:
.. code-block:: python
@@ -54,6 +55,7 @@ this callback function directly into XGBoost:
Defining your own callback
--------------------------
-XGBoost provides an callback interface class: ``xgboost.callback.TrainingCallback``, user
-defined callbacks should inherit this class and override corresponding methods. There's a
-working example in `demo/guide-python/callbacks.py <https://github.com/dmlc/xgboost/tree/master/demo/guide-python/callbacks.py>`_
+XGBoost provides a callback interface class, :py:class:`TrainingCallback
+<xgboost.callback.TrainingCallback>`; user-defined callbacks should inherit from this
+class and override the corresponding methods. There's a working example in
+:ref:`sphx_glr_python_examples_callbacks.py`.

@@ -77,15 +77,29 @@ Plotting API
Callback API
------------
-.. autofunction:: xgboost.callback.TrainingCallback
+.. automodule:: xgboost.callback
+.. autoclass:: xgboost.callback.TrainingCallback
+   :members:
-.. autofunction:: xgboost.callback.EvaluationMonitor
+.. autoclass:: xgboost.callback.EvaluationMonitor
+   :members:
+   :inherited-members:
+   :show-inheritance:
-.. autofunction:: xgboost.callback.EarlyStopping
+.. autoclass:: xgboost.callback.EarlyStopping
+   :members:
+   :inherited-members:
+   :show-inheritance:
-.. autofunction:: xgboost.callback.LearningRateScheduler
+.. autoclass:: xgboost.callback.LearningRateScheduler
+   :members:
+   :inherited-members:
+   :show-inheritance:
-.. autofunction:: xgboost.callback.TrainingCheckPoint
+.. autoclass:: xgboost.callback.TrainingCheckPoint
+   :members:
+   :inherited-members:
+   :show-inheritance:
.. _dask_api:

@@ -1,6 +1,6 @@
-####################
-XGBoost Tree Methods
-####################
+############
+Tree Methods
+############
For training boosted tree models, there are 2 parameters used for choosing algorithms,
namely ``updater`` and ``tree_method``. XGBoost has 4 builtin tree methods, namely

@@ -146,7 +146,8 @@ We will be able to see XGBoost printing something like:
Notice that the parameter ``disable_default_eval_metric`` is used to suppress the default metric
in XGBoost.
-For fully reproducible source code and comparison plots, see `custom_rmsle.py <https://github.com/dmlc/xgboost/tree/master/demo/guide-python/custom_rmsle.py>`_.
+For fully reproducible source code and comparison plots, see
+:ref:`sphx_glr_python_examples_custom_rmsle.py`.
*********************
Reverse Link Function
@@ -261,8 +262,7 @@ available in XGBoost:
We use ``multi:softmax`` to illustrate the difference in transformed predictions. With
``softprob`` the output prediction array has shape ``(n_samples, n_classes)``, while for
``softmax`` it's ``(n_samples, )``. A demo for the multi-class objective function is also
-available at `demo/guide-python/custom_softmax.py
-<https://github.com/dmlc/xgboost/tree/master/demo/guide-python/custom_softmax.py>`_
+available at :ref:`sphx_glr_python_examples_custom_softmax.py`.
**********************