From a512b4b394d1f55adeeb8fc9ca2b9a386d3bca79 Mon Sep 17 00:00:00 2001
From: Jiaming Yuan
Date: Thu, 16 Dec 2021 14:17:06 +0800
Subject: [PATCH] [doc] Promote dask from experimental. [skip ci] (#7509)

---
 doc/python/python_intro.rst | 15 +++++++++++++--
 doc/tutorials/dask.rst      |  7 +++----
 2 files changed, 16 insertions(+), 6 deletions(-)

diff --git a/doc/python/python_intro.rst b/doc/python/python_intro.rst
index e31f705ea..054598873 100644
--- a/doc/python/python_intro.rst
+++ b/doc/python/python_intro.rst
@@ -1,13 +1,23 @@
 ###########################
 Python Package Introduction
 ###########################
-This document gives a basic walkthrough of the xgboost package for Python.
+
+This document gives a basic walkthrough of the xgboost package for Python. The Python
+package consists of three different interfaces: the native interface, the scikit-learn
+interface, and the dask interface. For an introduction to the dask interface, please see
+:doc:`/tutorials/dask`.
 
 **List of other Helpful Links**
 
 * :doc:`/python/examples/index`
 * :doc:`Python API Reference <python_api>`
 
+**Contents**
+
+.. contents::
+  :backlinks: none
+  :local:
+
 Install XGBoost
 ---------------
 To install XGBoost, follow instructions in :doc:`/install`.
@@ -20,7 +30,8 @@ To verify your installation, run the following in Python:
 
 Data Interface
 --------------
-The XGBoost python module is able to load data from many types of different formats, including:
+The XGBoost Python module is able to load data from many different data formats,
+including:
 
 - NumPy 2D array
 - SciPy 2D sparse array
diff --git a/doc/tutorials/dask.rst b/doc/tutorials/dask.rst
index e27f95624..efe455b1e 100644
--- a/doc/tutorials/dask.rst
+++ b/doc/tutorials/dask.rst
@@ -3,11 +3,10 @@ Distributed XGBoost with Dask
 #############################
 
 `Dask <https://dask.org>`_ is a parallel computing library built on Python. Dask allows
-easy management of distributed workers and excels at handling large distributed data science
-workflows. The implementation in XGBoost originates from `dask-xgboost
+easy management of distributed workers and excels at handling large distributed data
+science workflows. The implementation in XGBoost originates from `dask-xgboost
 <https://github.com/dask/dask-xgboost>`_ with some extended functionalities and a
-different interface. Right now it is still under construction and may change (with proper
-warnings) in the future. The tutorial here focuses on basic usage of dask with CPU tree
+different interface. The tutorial here focuses on basic usage of dask with CPU tree
 algorithms. For an overview of GPU based training and internal workings, see `A New,
 Official Dask API for XGBoost
 <https://medium.com/rapids-ai/a-new-official-dask-api-for-xgboost-e8b10f3d1eb7>`_.
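
As context for the interface taxonomy the new introduction describes, the sketch below
trains the same toy model through the native interface (``DMatrix`` plus ``xgb.train``)
and through the scikit-learn interface (``XGBClassifier``). The synthetic data and the
hyperparameter values are illustrative placeholders, not taken from the docs:

.. code-block:: python

  import numpy as np
  import xgboost as xgb

  # Synthetic data, purely for illustration: 100 samples, 10 features.
  X = np.random.rand(100, 10)
  y = np.random.randint(2, size=100)

  # Native interface: wrap the data in a DMatrix and call xgb.train.
  dtrain = xgb.DMatrix(X, label=y)
  booster = xgb.train({"objective": "binary:logistic"}, dtrain, num_boost_round=10)

  # Scikit-learn interface: the familiar estimator fit/predict API.
  clf = xgb.XGBClassifier(n_estimators=10)
  clf.fit(X, y)
  preds = clf.predict(X)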
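
The ``Data Interface`` hunk keeps NumPy 2D arrays and SciPy 2D sparse arrays at the top
of the supported-format list. A minimal sketch of feeding both into a ``DMatrix``; the
shapes and values are arbitrary placeholders:

.. code-block:: python

  import numpy as np
  import scipy.sparse
  import xgboost as xgb

  # From a NumPy 2D array: rows are samples, columns are features.
  data = np.random.rand(5, 10)
  label = np.random.randint(2, size=5)
  dtrain = xgb.DMatrix(data, label=label)

  # From a SciPy 2D sparse array (CSR), e.g. for high-dimensional sparse features.
  csr = scipy.sparse.csr_matrix(data)
  dsparse = xgb.DMatrix(csr, label=label)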
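
The dask.rst hunk now presents the dask interface as stable, with the tutorial focused
on CPU tree algorithms. A minimal end-to-end sketch in the style of that tutorial, using
the public ``xgboost.dask`` entry points (``DaskDMatrix``, ``train``, ``predict``); the
local cluster setup, chunk sizes, and parameters are illustrative assumptions, not
prescribed by the patch:

.. code-block:: python

  import xgboost as xgb
  from dask import array as da
  from dask.distributed import Client, LocalCluster

  # A local cluster stands in for real distributed workers.
  with LocalCluster(n_workers=2) as cluster, Client(cluster) as client:
      # Dask arrays are partitioned into chunks spread across workers.
      X = da.random.random((10000, 10), chunks=(1000, 10))
      y = da.random.random(10000, chunks=1000)

      dtrain = xgb.dask.DaskDMatrix(client, X, y)
      output = xgb.dask.train(
          client,
          {"tree_method": "hist"},  # a CPU tree algorithm, per the tutorial's focus
          dtrain,
          num_boost_round=10,
      )
      booster = output["booster"]  # the trained model
      prediction = xgb.dask.predict(client, booster, dtrain)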