Jiaming Yuan 615ab2b03e
Extract evaluate splits from CPU hist. (#7079)
In addition to modularizing the split evaluation function, this PR removes some more functions, including `InitNewNodes` and `BuildNodeStats`, along with some other unused variables. Also, scattered code such as setting leaf weights is grouped into the split evaluator, and `NodeEntry` is simplified and made private. Another subtle difference from the original implementation is that the modified code doesn't call `tree[nidx].Parent()` to traverse upward.
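As a rough illustration of what such an extracted evaluator centralizes, here is a minimal Python sketch of the standard XGBoost split-gain and leaf-weight formulas; `SplitEvaluator` and `NodeStats` are illustrative names for this sketch, not the actual C++ classes touched by the PR:

```python
# Schematic sketch only (the real implementation is C++): the core quantities
# a split evaluator computes from per-node gradient/hessian sums, following
# the standard XGBoost objective.
from dataclasses import dataclass

@dataclass
class NodeStats:
    grad: float  # sum of gradients over rows in the node
    hess: float  # sum of hessians over rows in the node

class SplitEvaluator:
    def __init__(self, reg_lambda: float = 1.0, gamma: float = 0.0):
        self.reg_lambda = reg_lambda  # L2 regularization on leaf weights
        self.gamma = gamma            # minimum loss reduction required to split

    def leaf_weight(self, node: NodeStats) -> float:
        # Optimal leaf weight: -G / (H + lambda)
        return -node.grad / (node.hess + self.reg_lambda)

    def _score(self, node: NodeStats) -> float:
        # A node's contribution to loss reduction: G^2 / (H + lambda)
        return node.grad ** 2 / (node.hess + self.reg_lambda)

    def split_gain(self, parent: NodeStats, left: NodeStats,
                   right: NodeStats) -> float:
        # Gain = 1/2 * [score(L) + score(R) - score(parent)] - gamma
        return 0.5 * (self._score(left) + self._score(right)
                      - self._score(parent)) - self.gamma
```

Grouping the leaf-weight computation next to the gain computation, as the PR does, keeps both consumers of the same gradient statistics in one place.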

eXtreme Gradient Boosting


Community | Documentation | Resources | Contributors | Release Notes

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Kubernetes, Hadoop, SGE, MPI, Dask) and can solve problems beyond billions of examples.
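As a quick illustration of the Python API, here is a minimal sketch of a training run; the data is synthetic and the parameter choices are arbitrary:

```python
import numpy as np
import xgboost as xgb

# Synthetic binary classification data, purely for illustration.
rng = np.random.default_rng(0)
X = rng.random((1000, 10))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)

# Wrap the data in XGBoost's internal DMatrix format and train a booster.
dtrain = xgb.DMatrix(X, label=y)
params = {"objective": "binary:logistic", "tree_method": "hist", "max_depth": 3}
booster = xgb.train(params, dtrain, num_boost_round=50)

preds = booster.predict(dtrain)  # predicted probabilities in [0, 1]
```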

License

© Contributors, 2019. Licensed under the Apache-2.0 license.

Contribute to XGBoost

XGBoost has been developed and used by a group of active community members. Your help is very valuable in making the package better for everyone. Check out the Community Page.

Reference

  • Tianqi Chen and Carlos Guestrin. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2016
  • XGBoost originates from a research project at the University of Washington.

Sponsors

Become a sponsor and get a logo here. See details at Sponsoring the XGBoost Project. The funds are used to defray the cost of continuous integration and testing infrastructure (https://xgboost-ci.net).

Open Source Collective sponsors

Backers on Open Collective Sponsors on Open Collective

Sponsors

[Become a sponsor]

NVIDIA

Backers

[Become a backer]

Other sponsors

The sponsors in this list are donating cloud hours in lieu of cash donations.

Amazon Web Services
