xgboost/demo/rmm_plugin/rmm_singlegpu.py

import xgboost as xgb
import rmm
from sklearn.datasets import make_classification

# Initialize RMM with a pool allocator so GPU allocations come from a memory pool.
rmm.reinitialize(pool_allocator=True)
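# The pool size can also be set explicitly if desired; the value below is only an
# illustrative assumption, not part of the original demo:
# rmm.reinitialize(pool_allocator=True, initial_pool_size=2**30)  # 1 GiB pool
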
# Optionally force XGBoost to use RMM for all GPU memory allocation, see ./README.md
# xgb.set_config(use_rmm=True)
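# The same flag can also be scoped with XGBoost's global-config context manager; this
# is a sketch of an equivalent alternative, not something the demo requires:
# with xgb.config_context(use_rmm=True):
#     ...  # run training inside the scoped configuration
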
# Generate a synthetic multi-class dataset and load it into a DMatrix.
X, y = make_classification(n_samples=10000, n_informative=5, n_classes=3)
dtrain = xgb.DMatrix(X, label=y)

params = {
    "max_depth": 8,
    "eta": 0.01,
    "objective": "multi:softprob",
    "num_class": 3,
    "tree_method": "gpu_hist",
}

# XGBoost will automatically use the RMM pool allocator
bst = xgb.train(params, dtrain, num_boost_round=100, evals=[(dtrain, "train")])
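
# As a quick usage check (an addition, not part of the original demo), the trained
# booster can score the training matrix; `multi:softprob` returns one probability
# per class for each row:
# preds = bst.predict(dtrain)  # shape (10000, 3)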