Merge branch 'master' of ssh://github.com/tqchen/xgboost

tqchen 2014-10-01 09:20:16 -07:00
commit c957e1a648
4 changed files with 37 additions and 8 deletions


@@ -14,6 +14,8 @@ Notes on the Code: [Code Guide](src)
What's New
=====
* Thanks to Bing Xu, [XGBoost.jl](https://github.com/antinucleon/XGBoost.jl) allows you to use xgboost from Julia
* See the updated [demo folder](demo) for feature walkthrough
* Thanks to Tong He, the new [R package](R-package) is available
@@ -26,7 +28,6 @@ Features
* Speed: XGBoost is very fast
  - In [demo/kaggle-higgs/speedtest.py](demo/kaggle-higgs/speedtest.py), on the Kaggle Higgs data it is faster (about 20 times faster on our machine using 4 threads) than sklearn.ensemble.GradientBoostingClassifier; a minimal timing sketch follows this feature list
* The gradient boosting algorithm is laid out to support user-defined objective functions
* Python interface, works with numpy arrays and scipy.sparse matrices
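
A minimal, hypothetical timing sketch of the comparison referenced above. It substitutes synthetic data for the Higgs CSV, and the parameter values are illustrative, not the demo's; see [demo/kaggle-higgs/speedtest.py](demo/kaggle-higgs/speedtest.py) for the real script.

```python
# Timing sketch (not the demo script itself): xgboost vs sklearn's
# GradientBoostingClassifier on synthetic binary-classification data.
import time

import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=100000, n_features=30, random_state=0)

# xgboost: nthread controls the number of threads used for training.
dtrain = xgb.DMatrix(X, label=y)
params = {"objective": "binary:logistic", "max_depth": 6,
          "eta": 0.1, "nthread": 4}
t0 = time.time()
xgb.train(params, dtrain, num_boost_round=100)
print("xgboost: %.1fs" % (time.time() - t0))

# sklearn: single-threaded gradient boosting with comparable settings.
t0 = time.time()
GradientBoostingClassifier(n_estimators=100, max_depth=6,
                           learning_rate=0.1).fit(X, y)
print("sklearn: %.1fs" % (time.time() - t0))
```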
Build
=====


@@ -8,12 +8,30 @@ This folder contains all the example code using xgboost.
Features Walkthrough
====
This is a list of short code examples introducing the different functionalities of xgboost and its wrappers.
* Basic walkthrough of wrappers
[python](guide-python/basic_walkthrough.py)
[R](../R-package/demo/basic_walkthrough.R)
[Julia](https://github.com/antinucleon/XGBoost.jl/blob/master/demo/basic_walkthrough.jl)
* Customize loss function and evaluation metric (see the sketch after this list)
[python](guide-python/custom_objective.py)
[R](../R-package/demo/custom_objective.R)
[Julia](https://github.com/antinucleon/XGBoost.jl/blob/master/demo/custom_objective.jl)
* Boosting from existing prediction
[python](guide-python/boost_from_prediction.py)
[R](../R-package/demo/boost_from_prediction.R)
[Julia](https://github.com/antinucleon/XGBoost.jl/blob/master/demo/boost_from_prediction.jl)
* Predicting using first n trees
[python](guide-python/predict_first_ntree.py)
[R](../R-package/demo/predict_first_ntree.R)
[Julia](https://github.com/antinucleon/XGBoost.jl/blob/master/demo/predict_first_ntree.jl)
* Generalized Linear Model
[python](guide-python/generalized_linear_model.py)
[R](../R-package/demo/generalized_linear_model.R)
[Julia](https://github.com/antinucleon/XGBoost.jl/blob/master/demo/generalized_linear_model.jl)
* Cross validation
[python](guide-python/cross_validation.py)
[R](../R-package/demo/cross_validation.R)
[Julia](https://github.com/antinucleon/XGBoost.jl/blob/master/demo/cross_validation.jl)
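
As a taste of the customized loss walkthrough above, here is a condensed sketch of the pattern used in [guide-python/custom_objective.py](guide-python/custom_objective.py): the objective returns per-instance gradient and Hessian, the metric returns a name/value pair. The synthetic data and parameter values here are stand-ins, not the demo's.

```python
# Sketch of a user-defined objective and evaluation metric.
import numpy as np
import xgboost as xgb
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=1000, random_state=0)
dtrain = xgb.DMatrix(X, label=y)

def logregobj(preds, dtrain):
    # Logistic loss: return the gradient and Hessian for each instance.
    labels = dtrain.get_label()
    preds = 1.0 / (1.0 + np.exp(-preds))
    grad = preds - labels
    hess = preds * (1.0 - preds)
    return grad, hess

def evalerror(preds, dtrain):
    # Classification error; preds are raw margin scores here.
    labels = dtrain.get_label()
    return 'error', float(np.sum(labels != (preds > 0.0))) / len(labels)

param = {'max_depth': 2, 'eta': 1}
bst = xgb.train(param, dtrain, num_boost_round=2,
                evals=[(dtrain, 'train')],
                obj=logregobj, feval=evalerror)
```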
Basic Examples by Tasks
====


@@ -1,3 +1,9 @@
Highlights
=====
The Higgs challenge ended recently; xgboost was used by many participants. This list highlights xgboost-based solutions from players.
* Blogpost by phunther: [Winning solution of Kaggle Higgs competition: what a single model can do](http://no2147483647.wordpress.com/2014/09/17/winning-solution-of-kaggle-higgs-competition-what-a-single-model-can-do/)
Guide for Kaggle Higgs Challenge
=====


@@ -7,6 +7,10 @@ Python
* To build the python module, type ```make``` in the root directory of the project (a minimal usage sketch follows this section)
* Refer also to the walk through example in [demo folder](../demo/guide-python)
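
Assuming the built python wrapper is importable (e.g. its directory is on `PYTHONPATH`), a minimal usage sketch looks like this; the data and parameter values are illustrative:

```python
# Train a small booster and predict; DMatrix accepts numpy arrays
# and scipy.sparse matrices alike.
import numpy as np
import xgboost as xgb

X = np.random.rand(100, 10)
y = np.random.randint(2, size=100)
dtrain = xgb.DMatrix(X, label=y)

param = {'objective': 'binary:logistic', 'max_depth': 3, 'eta': 0.3}
bst = xgb.train(param, dtrain, num_boost_round=10)
preds = bst.predict(dtrain)  # predicted probabilities
```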
R
=====
* See [R-package](../R-package)
Julia
=====
* See [XGBoost.jl](https://github.com/antinucleon/XGBoost.jl)