xgboost: eXtreme Gradient Boosting

An optimized general-purpose gradient boosting (tree) library.

Contributors: https://github.com/tqchen/xgboost/graphs/contributors

Tutorial and Documentation: https://github.com/tqchen/xgboost/wiki

Questions and Issues: https://github.com/tqchen/xgboost/issues

Features

  • Sparse feature format:
    • The sparse feature format allows easy handling of missing values and improves computational efficiency.
  • Push the limit on a single machine:
    • Efficient implementation that optimizes memory and computation.
  • Speed: XGBoost is very fast.
    • In demo/higgs/speedtest.py, on the Kaggle Higgs data, it is faster than sklearn.ensemble.GradientBoostingClassifier (on our machine, 20 times faster using 4 threads).
  • The gradient boosting algorithm is laid out to support user-defined objectives (see the sketch after this list).
  • Python interface that works with numpy and scipy.sparse matrices (a minimal example is shown below).
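
As a quick illustration of the Python interface, here is a minimal training sketch. It assumes the wrapper shipped in the python/ folder is importable as xgboost; all parameter values are illustrative, not recommendations.

    import numpy as np
    import xgboost as xgb

    # Toy binary classification data; a scipy.sparse.csr_matrix works too.
    data = np.random.rand(100, 10)
    label = np.random.randint(2, size=100)

    # Entries equal to `missing` are treated as absent, which is how the
    # sparse feature format handles missing values.
    dtrain = xgb.DMatrix(data, label=label, missing=-999.0)

    # Illustrative parameters for a binary logistic tree booster.
    param = {'max_depth': 3, 'eta': 0.3, 'objective': 'binary:logistic'}
    num_round = 10
    bst = xgb.train(param, dtrain, num_round)

    preds = bst.predict(xgb.DMatrix(np.random.rand(5, 10)))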

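To show the user-defined objective hook mentioned above, here is a hedged sketch of a custom logistic objective. The obj callback passed to xgb.train returns the gradient and hessian of the loss with respect to the raw predictions; the function name and parameter values here are illustrative.

    import numpy as np
    import xgboost as xgb

    data = np.random.rand(100, 10)
    label = np.random.randint(2, size=100)
    dtrain = xgb.DMatrix(data, label=label)

    def logregobj(preds, dtrain):
        # Gradient and hessian of the logistic loss w.r.t. the raw margin.
        labels = dtrain.get_label()
        preds = 1.0 / (1.0 + np.exp(-preds))
        grad = preds - labels
        hess = preds * (1.0 - preds)
        return grad, hess

    # 'objective' is omitted from param because the callback supplies
    # the gradient statistics instead.
    param = {'max_depth': 3, 'eta': 0.1}
    bst = xgb.train(param, dtrain, 10, obj=logregobj)

Note that with a custom objective, predict returns raw margin scores rather than probabilities, so apply the sigmoid yourself if you need probabilities.
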
xgboost-unity

  • Experimental branch (not usable yet): a refactor of xgboost with cleaner code and more flexibility.
  • This version of xgboost is not compatible with 0.2x, due to the large number of changes in code structure.
    • This means model and buffer files from previous versions cannot be loaded in xgboost-unity.

Build

  • Simply type make
  • If your compiler does not come with OpenMP support, it will emit a warning telling you that the code will compile in single-thread mode, and you will get a single-thread xgboost.
  • You may get an error saying -lgomp is not found:
    • Type make no_omp=1, which will give you a single-thread xgboost.
    • Alternatively, upgrade your compiler to build the multi-threaded version.
  • Possible way to build with Visual Studio (not tested):
    • In principle, you can add src/xgboost.cpp and src/io/io.cpp to the project and build xgboost.
    • For the Python module, you need python/xgboost_wrapper.cpp and src/io/io.cpp to build a DLL.

Try Graphlab Create Version

  • GraphLab Create (GLC) is a scalable machine learning toolkit that lets you work with big data in Python.
  • XGBoost has been adopted as the boosted tree library in GLC. The GLC version lets you do feature engineering, hyper-parameter search, and visualization in one framework. See the nice blog post about predicting bike sharing demand.