[doc] Brief note about RMM SAM allocator. [skip ci] (#10712)

This commit is contained in:
Jiaming Yuan
2024-08-17 04:21:39 +08:00
committed by GitHub
parent ec3f327c20
commit fd365c147e
2 changed files with 22 additions and 1 deletions

@@ -50,6 +50,11 @@ Multi-node Multi-GPU Training
XGBoost supports fully distributed GPU training using `Dask <https://dask.org/>`_, ``Spark``, and ``PySpark``. To get started with Dask, see the tutorial :doc:`/tutorials/dask` and the worked examples :doc:`/python/dask-examples/index`; the Python documentation :ref:`dask_api` provides a complete reference. For usage with ``Spark`` using Scala, see :doc:`/jvm/xgboost4j_spark_gpu_tutorial`. Lastly, for distributed GPU training with ``PySpark``, see :doc:`/tutorials/spark_estimator`.
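As a minimal sketch of the Dask workflow described above (this assumes CUDA GPUs are available and the ``dask-cuda`` package is installed; neither is part of this doc):

```python
# Hedged sketch: single-node multi-GPU training via Dask.
# Requires CUDA GPUs plus the `dask-cuda` package (assumptions).
import dask.array as da
from dask.distributed import Client
from dask_cuda import LocalCUDACluster
from xgboost import dask as dxgb

if __name__ == "__main__":
    # LocalCUDACluster starts one worker per visible GPU.
    with LocalCUDACluster() as cluster, Client(cluster) as client:
        # Synthetic data for illustration only.
        X = da.random.random((10_000, 20), chunks=(1_000, 20))
        y = da.random.random(10_000, chunks=1_000)

        dtrain = dxgb.DaskDMatrix(client, X, y)
        output = dxgb.train(
            client,
            {"tree_method": "hist", "device": "cuda"},
            dtrain,
            num_boost_round=50,
        )
        booster = output["booster"]  # the trained model
```

The ``xgboost.dask`` interface mirrors the single-node ``train`` API but shards the data across workers; see the Dask tutorial linked above for the authoritative walkthrough.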

RMM integration
===============

XGBoost provides optional support for the RAPIDS Memory Manager (RMM), which lets XGBoost allocate GPU memory through a shared RMM memory resource. See :doc:`/python/rmm-examples/index` for more information.
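A minimal sketch of enabling the integration, assuming a CUDA GPU and the ``rmm`` and ``cupy`` packages (both assumptions, not stated in this doc):

```python
# Hedged sketch: route XGBoost's GPU allocations through RMM.
# Requires a CUDA GPU and the `rmm` / `cupy` packages (assumptions).
import rmm
import xgboost as xgb

# Install an RMM pool memory resource so XGBoost and other RAPIDS
# libraries draw from the same GPU memory pool.
mr = rmm.mr.PoolMemoryResource(rmm.mr.CudaAsyncMemoryResource())
rmm.mr.set_current_device_resource(mr)

# `use_rmm=True` tells XGBoost to allocate GPU memory via RMM.
with xgb.config_context(use_rmm=True):
    import cupy as cp

    X = cp.random.randn(100, 10)
    y = cp.random.randn(100)
    reg = xgb.XGBRegressor(device="cuda", tree_method="hist")
    reg.fit(X, y)
```

Setting up the memory resource before any GPU allocation happens is the important part; the linked examples page is the authoritative reference.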

Memory usage
============