* Change C API name.
* Add tests for all primitive types from arrays.
* Add native support for 128-bit floats on CPU.
* Convert boolean and float16 arrays in Python (see the sketch below).
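A minimal sketch of the dtype handling on the Python side; the helper name `_coerce_array` and the exact target dtypes are illustrative assumptions, not the code from this change:

```python
import numpy as np
import xgboost as xgb

# Hypothetical helper illustrating the idea: widen dtypes that the native
# library does not consume directly before constructing a DMatrix.
def _coerce_array(data: np.ndarray) -> np.ndarray:
    if data.dtype == np.bool_:
        return data.astype(np.uint8)    # booleans become 0/1 integers
    if data.dtype == np.float16:
        return data.astype(np.float32)  # half precision is widened
    return data

X = np.random.rand(8, 4) > 0.5            # boolean feature matrix
y = np.random.rand(8).astype(np.float16)  # float16 labels
dtrain = xgb.DMatrix(_coerce_array(X), label=_coerce_array(y))
```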
* Pin the dask version for now.
* Ensure RMM is 0.18 or later
* Add `use_rmm` flag to global configuration (a usage sketch follows below)
* Modify `XGBCachingDeviceAllocatorImpl` to skip CUB when `use_rmm=True`
* Update the demo
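A usage sketch for the new flag, assuming RMM 0.18+ and a CUDA device are available; `rmm.reinitialize` and `xgboost.config_context(use_rmm=True)` are the public entry points relied on here:

```python
import numpy as np
import rmm
import xgboost as xgb

# Let RMM own a pooled chunk of GPU memory, then route XGBoost's device
# allocations through RMM instead of the built-in CUB caching allocator.
rmm.reinitialize(pool_allocator=True, initial_pool_size=2 ** 30)

X = np.random.rand(1000, 10)
y = np.random.rand(1000)

with xgb.config_context(use_rmm=True):
    dtrain = xgb.DMatrix(X, label=y)
    booster = xgb.train({"tree_method": "gpu_hist"}, dtrain, num_boost_round=10)
```

With `use_rmm=True`, device allocations go through RMM's pool rather than the CUB-based caching allocator mentioned above.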
* [CI] Pin NumPy to 1.19.4, since NumPy 1.19.5 doesn't work with the latest Shap
* Add a new API function for predicting on `DMatrix`. This function aligns
  with the rest of the `XGBoosterPredictFrom*` functions in the semantics of
  its arguments.
* Purge `ntree_limit` from libxgboost and use iteration-based prediction instead (see the sketch below).
* [dask] Use `inplace_predict` by default for dask sklearn models.
* [dask] Run prediction shape inference on worker instead of client.
The breaking change is in the Python sklearn `apply` function: it is now
consistent with the other prediction functions, which use `best_iteration` by
default.
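A hedged sketch of the iteration-based replacement for `ntree_limit`; the `iteration_range` keyword used here is the assumed prediction-side counterpart, not a verbatim excerpt from this change:

```python
import numpy as np
import xgboost as xgb

X = np.random.rand(256, 16)
y = np.random.rand(256)
dtrain = xgb.DMatrix(X, label=y)
booster = xgb.train({"objective": "reg:squarederror"}, dtrain, num_boost_round=50)

# Instead of the removed ntree_limit, trees are selected by iteration:
# only the first 10 boosting rounds contribute to this prediction.
partial = booster.predict(dtrain, iteration_range=(0, 10))

# Predicting with all trees remains the default.
full = booster.predict(dtrain)
```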
* Vendor libgomp in the manylinux2014_aarch64 wheel
* Use the vault repo, since CentOS 6 reached End-of-Life on Nov 30
* Vendor libgomp in the manylinux2010_x86_64 wheel
* Run the verification step inside the container
* Add support for Modin DataFrames (see the sketch after this list)
* File mode change
* Add tests and extend the CI environment
* File mode change
* Remove redundant installation of Modin
* Add a pytest skip marker for Modin
* Install Modin[ray] from PyPI
* Fix interference
* Avoid an extra conversion
* Delete the CV test for Modin
* Revert changes to the `cv` function
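A hedged sketch of the Modin support from the user's perspective; the `modin.pandas` import is standard, while the assumption here is that `DMatrix` accepts Modin objects directly, just as it does pandas ones:

```python
import modin.pandas as mpd
import numpy as np
import xgboost as xgb

# A Modin DataFrame/Series is passed where a pandas one would be accepted.
df = mpd.DataFrame(np.random.rand(100, 4), columns=["f0", "f1", "f2", "f3"])
label = mpd.Series(np.random.rand(100))

dtrain = xgb.DMatrix(df, label=label)
booster = xgb.train({"objective": "reg:squarederror"}, dtrain, num_boost_round=10)
```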
Co-authored-by: ShvetsKS <kirill.shvets@intel.com>
Co-authored-by: Hyunsu Cho <chohyu01@cs.washington.edu>