<img src="https://xgboost.ai/images/logo/xgboost-logo-ng-trimmed.png" width=200/> eXtreme Gradient Boosting
===========

[Build Status](https://buildkite.com/xgboost/xgboost-ci)
[XGBoost-CI](https://github.com/dmlc/xgboost/actions)
[Documentation Status](https://xgboost.readthedocs.org)
[License](./LICENSE)
[CRAN Status](http://cran.r-project.org/web/packages/xgboost)
[PyPI version](https://pypi.python.org/pypi/xgboost/)
[Conda version](https://anaconda.org/conda-forge/py-xgboost)
[Optuna](https://optuna.org)
[Twitter](https://twitter.com/XGBoostProject)
[OpenSSF Scorecard](https://api.securityscorecards.dev/projects/github.com/dmlc/xgboost)

[Community](https://xgboost.ai/community) |
[Documentation](https://xgboost.readthedocs.org) |
[Resources](demo/README.md) |
[Contributors](CONTRIBUTORS.md) |
[Release Notes](NEWS.md)

XGBoost is an optimized distributed gradient boosting library designed to be highly ***efficient***, ***flexible***, and ***portable***.
It implements machine learning algorithms under the [Gradient Boosting](https://en.wikipedia.org/wiki/Gradient_boosting) framework.
XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way.
The same code runs on major distributed environments (Kubernetes, Hadoop, SGE, Dask, Spark, PySpark) and can solve problems beyond billions of examples.
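
As a quick taste of the Python package, here is a minimal sketch of training a booster with the core `xgboost` API; the toy data and parameter values are illustrative assumptions, not recommendations.

```python
import numpy as np
import xgboost as xgb

# Toy binary-classification data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = (X[:, 0] > 0).astype(int)

# Wrap the data in XGBoost's optimized DMatrix container.
dtrain = xgb.DMatrix(X, label=y)

# Train a gradient-boosted tree ensemble with example parameters.
params = {"objective": "binary:logistic", "max_depth": 3, "eta": 0.1}
booster = xgb.train(params, dtrain, num_boost_round=20)

# Predict probabilities on the training data.
preds = booster.predict(dtrain)
```
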
License
-------

© Contributors, 2021. Licensed under an [Apache-2](https://github.com/dmlc/xgboost/blob/master/LICENSE) license.

Contribute to XGBoost
---------------------

XGBoost has been developed and used by a group of active community members. Your help is very valuable in making the package better for everyone.
Check out the [Community Page](https://xgboost.ai/community).

Reference
---------

- Tianqi Chen and Carlos Guestrin. [XGBoost: A Scalable Tree Boosting System](http://arxiv.org/abs/1603.02754). In 22nd SIGKDD Conference on Knowledge Discovery and Data Mining, 2016.
- XGBoost originated from a research project at the University of Washington.

Sponsors
--------

Become a sponsor and get a logo here. See details at [Sponsoring the XGBoost Project](https://xgboost.ai/sponsors). The funds are used to defray the cost of continuous integration and testing infrastructure (<https://xgboost-ci.net>).

## Open Source Collective sponsors

[Backers](#backers) [Sponsors](#sponsors)

### Sponsors

[[Become a sponsor](https://opencollective.com/xgboost#sponsor)]

<a href="https://www.nvidia.com/en-us/" target="_blank"><img src="https://raw.githubusercontent.com/xgboost-ai/xgboost-ai.github.io/master/images/sponsors/nvidia.jpg" alt="NVIDIA" width="72" height="72"></a>
<a href="https://www.intel.com/" target="_blank"><img src="https://images.opencollective.com/intel-corporation/2fa85c1/logo/256.png" alt="Intel" width="72" height="72"></a>

### Backers

[[Become a backer](https://opencollective.com/xgboost#backer)]

<a href="https://opencollective.com/xgboost#backers" target="_blank"><img src="https://opencollective.com/xgboost/backers.svg?width=890"></a>