diff --git a/README.md b/README.md
index d69189824..41085afdf 100644
--- a/README.md
+++ b/README.md
@@ -14,6 +14,8 @@ Notes on the Code: [Code Guide](src)
 
 What's New
 =====
+
+* Thanks to Bing Xu, [XGBoost.jl](https://github.com/antinucleon/XGBoost.jl) allows you to use xgboost from Julia
 * See the updated [demo folder](demo) for a feature walkthrough
 * Thanks to Tong He, the new [R package](R-package) is available
 
@@ -26,7 +28,6 @@ Features
 * Speed: XGBoost is very fast
   - In [demo/kaggle-higgs/speedtest.py](demo/kaggle-higgs/speedtest.py), on the Kaggle Higgs data it is faster (on our machine, 20 times faster using 4 threads) than sklearn.ensemble.GradientBoostingClassifier
 * Layout of the gradient boosting algorithm supports user-defined objectives
-* Python interface, works with numpy and scipy.sparse matrix
 
 Build
 =====
diff --git a/demo/README.md b/demo/README.md
index 5414e642e..bcc356712 100644
--- a/demo/README.md
+++ b/demo/README.md
@@ -8,12 +8,30 @@ This folder contains all the example code using xgboost.
 Features Walkthrough
 ====
 This is a list of short codes introducing different functionalities of xgboost and its wrappers; a minimal Python sketch of the basic flow follows the list.
-* Basic walkthrough of wrappers [python](guide-python/basic_walkthrough.py)
-* Cutomize loss function, and evaluation metric [python](guide-python/custom_objective.py)
-* Boosting from existing prediction [python](guide-python/boost_from_prediction.py)
-* Predicting using first n trees [python](guide-python/predict_first_ntree.py)
-* Generalized Linear Model [python](guide-python/generalized_linear_model.py)
-* Cross validation [python](guide-python/cross_validation.py)
+* Basic walkthrough of wrappers
+  [python](guide-python/basic_walkthrough.py)
+  [R](../R-package/demo/basic_walkthrough.R)
+  [Julia](https://github.com/antinucleon/XGBoost.jl/blob/master/demo/basic_walkthrough.jl)
+* Customize loss function and evaluation metric
+  [python](guide-python/custom_objective.py)
+  [R](../R-package/demo/custom_objective.R)
+  [Julia](https://github.com/antinucleon/XGBoost.jl/blob/master/demo/custom_objective.jl)
+* Boosting from existing prediction
+  [python](guide-python/boost_from_prediction.py)
+  [R](../R-package/demo/boost_from_prediction.R)
+  [Julia](https://github.com/antinucleon/XGBoost.jl/blob/master/demo/boost_from_prediction.jl)
+* Predicting using first n trees
+  [python](guide-python/predict_first_ntree.py)
+  [R](../R-package/demo/predict_first_ntree.R)
+  [Julia](https://github.com/antinucleon/XGBoost.jl/blob/master/demo/predict_first_ntree.jl)
+* Generalized Linear Model
+  [python](guide-python/generalized_linear_model.py)
+  [R](../R-package/demo/generalized_linear_model.R)
+  [Julia](https://github.com/antinucleon/XGBoost.jl/blob/master/demo/generalized_linear_model.jl)
+* Cross validation
+  [python](guide-python/cross_validation.py)
+  [R](../R-package/demo/cross_validation.R)
+  [Julia](https://github.com/antinucleon/XGBoost.jl/blob/master/demo/cross_validation.jl)
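+
+As a quick taste of the Python wrapper, below is a minimal sketch of the basic
+walkthrough flow. It is an illustration rather than a reference: the data paths
+assume the demo data under [data](data), and the parameter values are
+placeholders; see [basic_walkthrough.py](guide-python/basic_walkthrough.py) for
+the full version.
+
+```python
+import xgboost as xgb
+
+# Load the demo agaricus data in LibSVM text format.
+# These paths are assumptions; adjust them to wherever you run the script from.
+dtrain = xgb.DMatrix('data/agaricus.txt.train')
+dtest = xgb.DMatrix('data/agaricus.txt.test')
+
+# Placeholder parameters for a small binary classifier.
+param = {'max_depth': 2, 'eta': 1, 'objective': 'binary:logistic'}
+watchlist = [(dtest, 'eval'), (dtrain, 'train')]
+
+# Train for 2 boosting rounds, printing watchlist metrics, then predict.
+bst = xgb.train(param, dtrain, 2, watchlist)
+preds = bst.predict(dtest)
+```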
 
 Basic Examples by Tasks
 ====
diff --git a/demo/kaggle-higgs/README.md b/demo/kaggle-higgs/README.md
index 2d9e2fd01..ce1a749fa 100644
--- a/demo/kaggle-higgs/README.md
+++ b/demo/kaggle-higgs/README.md
@@ -1,3 +1,9 @@
+Highlights
+=====
+The Higgs challenge ended recently, and xgboost was used by many players. This list highlights their xgboost solutions:
+* Blogpost by phunther: [Winning solution of Kaggle Higgs competition: what a single model can do](http://no2147483647.wordpress.com/2014/09/17/winning-solution-of-kaggle-higgs-competition-what-a-single-model-can-do/)
+
+
 Guide for Kaggle Higgs Challenge
 =====
 
diff --git a/wrapper/README.md b/wrapper/README.md
index e736b9b6a..09851b97f 100644
--- a/wrapper/README.md
+++ b/wrapper/README.md
@@ -7,6 +7,10 @@ Python
 =====
 * To build the python module, type ```make``` in the root directory of the project
 * Refer also to the walkthrough example in the [demo folder](../demo/guide-python)
-R
+R
 =====
 * See [R-package](../R-package)
+
+Julia
+=====
+* See [XGBoost.jl](https://github.com/antinucleon/XGBoost.jl)
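+
+User-defined objectives
+=====
+Whichever wrapper you use, a user-defined objective follows the same pattern:
+supply a function that returns the gradient and second-order gradient (hessian)
+of your loss with respect to the raw prediction. Below is a hedged Python
+sketch of a logistic-loss objective plus an error metric, modeled on
+[custom_objective.py](../demo/guide-python/custom_objective.py); the synthetic
+data and the `obj`/`feval` keyword names are assumptions of this sketch, and
+the R and Julia wrappers spell these hooks differently.
+
+```python
+import numpy as np
+import xgboost as xgb
+
+# Tiny synthetic binary problem, only to make the sketch self-contained.
+X = np.random.rand(100, 5)
+y = (X[:, 0] > 0.5).astype(float)
+dtrain = xgb.DMatrix(X, label=y)
+
+def logregobj(preds, dtrain):
+    # Gradient and hessian of the logistic loss w.r.t. the raw margin.
+    labels = dtrain.get_label()
+    p = 1.0 / (1.0 + np.exp(-preds))  # sigmoid of the raw margin
+    return p - labels, p * (1.0 - p)
+
+def evalerror(preds, dtrain):
+    # Classification error, thresholding the raw margin at 0.
+    labels = dtrain.get_label()
+    return 'error', float(np.sum(labels != (preds > 0.0))) / len(labels)
+
+# With a custom objective, predictions passed to both hooks are raw margins.
+bst = xgb.train({'max_depth': 2, 'eta': 0.1}, dtrain, 5,
+                [(dtrain, 'train')], obj=logregobj, feval=evalerror)
+```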