add TOC, simplified text in the solution section
This commit is contained in parent d9614dfbe8, commit ef7d26eb07.
<!-- START doctoc generated TOC please keep comment here to allow auto update -->
<!-- DON'T EDIT THIS SECTION, INSTEAD RE-RUN doctoc TO UPDATE -->
**Table of Contents** *generated with [DocToc](https://github.com/thlorenz/doctoc)*
- [Awesome XGBoost](#awesome-xgboost)
  - [Contributing](#contributing)
  - [Examples](#examples)
    - [Features Walkthrough](#features-walkthrough)
    - [Basic Examples by Tasks](#basic-examples-by-tasks)
    - [Benchmarks](#benchmarks)
    - [Machine Learning Challenge Winning Solutions](#machine-learning-challenge-winning-solutions)
  - [Tutorials](#tutorials)
  - [Tools with XGBoost](#tools-with-xgboost)
  - [Services Powered by XGBoost](#services-powered-by-xgboost)
  - [Awards](#awards)

<!-- END doctoc generated TOC please keep comment here to allow auto update -->

Awesome XGBoost
======

Welcome to the wonderland of XGBoost. This page contains a curated list of awesome XGBoost examples, tutorials and blogs. It is inspired by [awesome-MXNet](https://github.com/dmlc/mxnet/blob/master/example/README.md), [awesome-php](https://github.com/ziadoz/awesome-php) and [awesome-machine-learning](https://github.com/josephmisiti/awesome-machine-learning).

Contributing
----
* Contributions of examples and benchmarks are more than welcome!
* If you'd like to share how you use XGBoost to solve your problem, send a pull request :)
* If you want to contribute to this list and the examples, please open a new pull request.

Examples
----
### Features Walkthrough

This is a list of short code examples introducing the different functionalities of the XGBoost packages.
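To give a feel for what these walkthroughs build on, here is a toy sketch of the additive training loop at the heart of gradient boosting, in plain Python with depth-1 "stumps" on a single feature. This is an illustration of the idea only, not the xgboost API; the real library grows regularized multi-feature trees.

```python
# Toy illustration of the additive boosting loop that XGBoost implements.
# Depth-1 "stumps" on one feature, squared-error loss; not the xgboost API.

def fit_stump(xs, residuals):
    """Pick the split threshold on xs that minimizes squared error,
    returning (threshold, left_value, right_value)."""
    best = None
    for t in xs:
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lv, rv = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - lv) ** 2 for r in left) + sum((r - rv) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, t, lv, rv)
    return best[1:]

def boost(xs, ys, rounds=50, eta=0.3):
    """Repeatedly fit a stump to the current residuals and add a shrunken
    copy of it to the ensemble -- the essence of gradient boosting."""
    pred = [0.0] * len(xs)
    stumps = []
    for _ in range(rounds):
        residuals = [y - p for y, p in zip(ys, pred)]
        t, lv, rv = fit_stump(xs, residuals)
        stumps.append((t, lv, rv))
        pred = [p + eta * (lv if x <= t else rv) for x, p in zip(xs, pred)]
    return stumps, pred

xs = [1, 2, 3, 4, 5, 6]
ys = [1.0, 1.0, 1.0, 5.0, 5.0, 5.0]
stumps, pred = boost(xs, ys)
```

With `eta = 0.3`, each round removes 30% of the remaining residual, so after 50 rounds the predictions match `ys` to well within floating-point noise.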

### Machine Learning Challenge Winning Solutions

"Over the last six months, a new algorithm has come up on Kaggle __winning every single competition__ in this category, it is an algorithm called __XGBoost__." -- Anthony Goldbloom, Founder & CEO of Kaggle (from his presentation "What Is Winning on Kaggle?" [youtube link](https://youtu.be/GTs5ZQ6XwUM?t=7m7s))
XGBoost has helped on these winning solutions:

* Marios Michailidis, Mathias Müller and HJ van Veen, 1st place of the [Dato Truly Native? competition](https://www.kaggle.com/c/dato-native). Link to [the Kaggle interview](http://blog.kaggle.com/2015/12/03/dato-winners-interview-1st-place-mad-professors/).
* Vlad Mironov, Alexander Guschin, 1st place of the [CERN LHCb experiment Flavour of Physics competition](https://www.kaggle.com/c/flavours-of-physics). Link to [the Kaggle interview](http://blog.kaggle.com/2015/11/30/flavour-of-physics-technical-write-up-1st-place-go-polar-bears/).
* Josef Slavicek, 3rd place of the [CERN LHCb experiment Flavour of Physics competition](https://www.kaggle.com/c/flavours-of-physics). Link to [the Kaggle interview](http://blog.kaggle.com/2015/11/23/flavour-of-physics-winners-interview-3rd-place-josef-slavicek/).
* Mario Filho, Josef Feigl, Lucas, Gilberto, 1st place of the [Caterpillar Tube Pricing competition](https://www.kaggle.com/c/caterpillar-tube-pricing). Link to [the Kaggle interview](http://blog.kaggle.com/2015/09/22/caterpillar-winners-interview-1st-place-gilberto-josef-leustagos-mario/).
* Qingchen Wang, 1st place of the [Liberty Mutual Property Inspection](https://www.kaggle.com/c/liberty-mutual-group-property-inspection-prediction). Link to [the Kaggle interview](http://blog.kaggle.com/2015/09/28/liberty-mutual-property-inspection-winners-interview-qingchen-wang/).
* Chenglong Chen, 1st place of the [Crowdflower Search Results Relevance](https://www.kaggle.com/c/crowdflower-search-relevance). [Link to the winning solution](https://www.kaggle.com/c/crowdflower-search-relevance/forums/t/15186/1st-place-winner-solution-chenglong-chen/).
* Alexandre Barachant (“Cat”) and Rafał Cycoń (“Dog”), 1st place of the [Grasp-and-Lift EEG Detection](https://www.kaggle.com/c/grasp-and-lift-eeg-detection). Link to [the Kaggle interview](http://blog.kaggle.com/2015/10/12/grasp-and-lift-eeg-winners-interview-1st-place-cat-dog/).
* Halla Yang, 2nd place of the [Recruit Coupon Purchase Prediction Challenge](https://www.kaggle.com/c/coupon-purchase-prediction). Link to [the Kaggle interview](http://blog.kaggle.com/2015/10/21/recruit-coupon-purchase-winners-interview-2nd-place-halla-yang/).
* Owen Zhang, 1st place of the [Avito Context Ad Clicks competition](https://www.kaggle.com/c/avito-context-ad-clicks). Link to [the Kaggle interview](http://blog.kaggle.com/2015/08/26/avito-winners-interview-1st-place-owen-zhang/).

There are many other great winning solutions and interviews, but this list is [too small](https://en.wikipedia.org/wiki/Fermat%27s_Last_Theorem) to put all of them here. Please send pull requests if important ones appear.

## Tutorials

* "[Open Source Tools & Data Science Competitions](http://www.slideshare.net/odsc/owen-zhangopen-sourcetoolsanddscompetitions1)" by Owen Zhang - XGBoost parameter tuning tips
* "[Tips for data science competitions](http://www.slideshare.net/OwenZhang2/tips-for-data-science-competitions)" by Owen Zhang - Page 14
* "[Ensemble Decision Tree with XGBoost](https://www.kaggle.com/binghsu/predict-west-nile-virus/xgboost-starter-code-python-0-69)" by [Bing Xu](https://www.kaggle.com/binghsu)
* "[Notes on eXtreme Gradient Boosting](http://startup.ml/blog/xgboost)" by Arshak Navruzyan ([iPython Notebook](https://github.com/startupml/koan/blob/master/eXtreme%20Gradient%20Boosting.ipynb))
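Much of the tuning material above is about searching XGBoost's parameter space. As a small standard-library sketch (the parameter names `eta`, `max_depth` and `subsample` are real XGBoost training parameters; the `grid_points` helper is ours, purely for illustration), expanding a grid of candidate settings into concrete configurations looks like this:

```python
# Expand a dict of candidate XGBoost parameter values into concrete
# configurations to evaluate. grid_points is an illustrative helper,
# not part of xgboost; the parameter names are real training parameters.
from itertools import product

param_grid = {
    "eta": [0.01, 0.1],       # learning rate (shrinkage)
    "max_depth": [4, 6],      # maximum tree depth
    "subsample": [0.8, 1.0],  # row sampling ratio per boosting round
}

def grid_points(grid):
    """Cartesian product of the value lists, one dict per combination."""
    keys = sorted(grid)
    return [dict(zip(keys, values)) for values in product(*(grid[k] for k in keys))]

configs = grid_points(param_grid)
print(len(configs))  # 8 combinations to cross-validate
```

Each resulting dict can then be passed as the params of a training run and scored by cross-validation.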

## Tools with XGBoost

* [BayesBoost](https://github.com/mpearmain/BayesBoost) - Bayesian Optimization using xgboost and sklearn API

## Services Powered by XGBoost

* [Seldon predictive service powered by XGBoost](http://docs.seldon.io/iris-demo.html)
* [ODPS by Alibaba](https://yq.aliyun.com/articles/6355) (in Chinese)

## Awards

* [John Chambers Award](http://stat-computing.org/awards/jmc/winners.html) - 2016 Winner: XGBoost, by Tong He (Simon Fraser University) and Tianqi Chen (University of Washington)