======================
XGBoost Python Package
======================
|PyPI version|
Installation
============
From `PyPI <https://pypi.python.org/pypi/xgboost>`_
---------------------------------------------------
For a stable version, install using ``pip``::

    pip install xgboost
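To confirm that the installation works, here is a minimal sketch using the scikit-learn compatible ``XGBClassifier`` interface; it assumes ``scikit-learn`` is also installed and uses its Iris toy dataset purely for illustration::

    # Minimal check that xgboost imports and trains (illustrative only;
    # assumes scikit-learn is available for the toy dataset).
    import xgboost as xgb
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    clf = xgb.XGBClassifier(n_estimators=10)
    clf.fit(X_train, y_train)
    print(clf.score(X_test, y_test))  # accuracy on the held-out split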
.. |PyPI version| image:: https://badge.fury.io/py/xgboost.svg
   :target: http://badge.fury.io/py/xgboost
For building from source, see the `build instructions <https://xgboost.readthedocs.io/en/latest/build.html>`_.