# Awesome XGBoost
Welcome to the wonderland of XGBoost. This page contains a curated list of awesome XGBoost examples, tutorials and blogs. It is inspired by awesome-MXNet, awesome-php and awesome-machine-learning.
## Contributing
- Contributions of examples and benchmarks are more than welcome!
- If you would like to share how you use XGBoost to solve your problem, send a pull request. :)
- If you want to contribute to this list and the examples, please open a new pull request.
## List of Examples
### Features Walkthrough
This is a list of short code examples introducing the different functionalities of the XGBoost packages.
- Basic walkthrough of packages python R Julia
- Customized loss function and evaluation metric python R Julia (see the Python sketch after this list)
- Boosting from existing prediction python R Julia
- Predicting using first n trees python R Julia
- Generalized Linear Model python R Julia
- Cross validation python R Julia
- Predicting leaf indices python R
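As a quick taste of the customized loss function and evaluation metric item above, here is a minimal sketch with the Python package, using synthetic data so it runs standalone. The keyword names `obj` and `feval` follow the classic demo; newer XGBoost releases rename `feval` to `custom_metric`, so adjust for your version.

```python
import numpy as np
import xgboost as xgb

# Synthetic binary-classification data, just to keep the sketch self-contained.
rng = np.random.RandomState(0)
X = rng.rand(200, 5)
y = (X[:, 0] + 0.1 * rng.randn(200) > 0.5).astype(float)
dtrain = xgb.DMatrix(X, label=y)

def logregobj(preds, dtrain):
    """Custom logistic objective: return per-example gradient and hessian."""
    labels = dtrain.get_label()
    preds = 1.0 / (1.0 + np.exp(-preds))  # raw margin -> probability
    grad = preds - labels
    hess = preds * (1.0 - preds)
    return grad, hess

def evalerror(preds, dtrain):
    """Custom evaluation metric: classification error on the raw margin."""
    labels = dtrain.get_label()
    return 'error', float(np.sum(labels != (preds > 0.0))) / len(labels)

params = {'max_depth': 2, 'eta': 1.0}
bst = xgb.train(params, dtrain, num_boost_round=5,
                obj=logregobj, feval=evalerror, evals=[(dtrain, 'train')])
```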
### Basic Examples by Tasks
Most of the examples in this section are based on the CLI or Python version; however, the same parameter settings apply to all language packages, as the sketch below illustrates.
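The following hedged sketch makes the "parameters carry over" point concrete: the names used in a CLI config file map one-to-one onto the parameter dictionary of the Python package (and similarly for R and Julia). The config lines and toy data are illustrative assumptions.

```python
import numpy as np
import xgboost as xgb

# Toy data so the snippet is self-contained.
rng = np.random.RandomState(1)
X = rng.rand(100, 4)
y = (X[:, 0] > 0.5).astype(int)
dtrain = xgb.DMatrix(X, label=y)

# The same names would appear in a CLI config file, for example:
#   booster = gbtree
#   objective = binary:logistic
#   eta = 0.1
#   max_depth = 3
#   num_round = 10
params = {
    'booster': 'gbtree',
    'objective': 'binary:logistic',
    'eta': 0.1,
    'max_depth': 3,
}
bst = xgb.train(params, dtrain, num_boost_round=10)  # num_round in the CLI
```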
### Benchmarks
## Machine Learning Challenge Winning Solutions
"Over the last six months, a new algorithm has come up on Kaggle winning every single competition in this category, it is an algorithm called XGBoost." -- Anthony Goldbloom, Founder & CEO of Kaggle (from his presentation "What Is Winning on Kaggle?" youtube link)
- XGBoost helped Marios Michailidis, Mathias Müller and HJ van Veen win 1st place in the Dato Truly Native? competition. Check out the interview from Kaggle.
- XGBoost helped Vlad Mironov and Alexander Guschin win 1st place in the CERN LHCb experiment Flavour of Physics competition. Check out the interview from Kaggle.
- XGBoost helped Josef Slavicek win 3rd place in the CERN LHCb experiment Flavour of Physics competition. Check out the interview from Kaggle.
- XGBoost helped Mario Filho, Josef Feigl, Lucas and Gilberto win 1st place in the Caterpillar Tube Pricing competition. Check out the interview from Kaggle.
- XGBoost helped Qingchen Wang win 1st place in the Liberty Mutual Property Inspection competition. Check out the interview from Kaggle.
- XGBoost helped Chenglong Chen win 1st place in the Crowdflower Search Results Relevance competition. Check out the winning solution.
- XGBoost helped Alexandre Barachant ("Cat") and Rafał Cycoń ("Dog") win 1st place in the Grasp-and-Lift EEG Detection competition. Check out the interview from Kaggle.
- XGBoost helped Halla Yang win 2nd place in the Recruit Coupon Purchase Prediction Challenge. Check out the interview from Kaggle.
- XGBoost helped Owen Zhang win 1st place in the Avito Context Ad Clicks competition. Check out the interview from Kaggle.
- There are many other great winning solutions and interviews; this list is too short to include them all. Please send a pull request if an important one is missing.
## List of Tutorials
- "Open Source Tools & Data Science Competitions" by Owen Zhang - XGBoost parameter tuning tips
- "Tips for data science competitions" by Owen Zhang - Page 14
- "XGBoost - eXtreme Gradient Boosting" by Tong He
- "How to use XGBoost algorithm in R in easy steps" by TAVISH SRIVASTAVA (Chinese Translation 中文翻译 by HarryZhu)
- "Kaggle Solution: What’s Cooking ? (Text Mining Competition)" by MANISH SARASWAT
- "Better Optimization with Repeated Cross Validation and the XGBoost model - Machine Learning with R)" by Manuel Amunategui (Youtube Link) (Github Link)
- "XGBoost Rossman Parameter Tuning" by Norbert Kozlowski
- "Featurizing log data before XGBoost" by Xavier Conort, Owen Zhang etc
- "West Nile Virus Competition Benchmarks & Tutorials" by Anna Montoya
- "Ensemble Decision Tree with XGBoost" by Bing Xu
- "Notes on eXtreme Gradient Boosting" by ARSHAK NAVRUZYAN (iPython Notebook)
## List of Tools with XGBoost
- BayesBoost - Bayesian Optimization using xgboost and sklearn API
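To give a rough idea of the pattern such tools build on (this is a generic sketch under stated assumptions, not BayesBoost's actual interface): expose a cross-validated score of an `XGBClassifier` as a function of its hyper-parameters, then hand that function to a Bayesian optimizer.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

# Toy data so the sketch runs standalone.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

def cv_score(max_depth, learning_rate, n_estimators):
    """Objective a Bayesian optimizer would maximize: mean CV AUC."""
    model = XGBClassifier(
        max_depth=int(max_depth),
        learning_rate=learning_rate,
        n_estimators=int(n_estimators),
    )
    return cross_val_score(model, X, y, cv=3, scoring='roc_auc').mean()

# A Bayesian optimizer would propose points in the hyper-parameter space and
# call cv_score repeatedly; a single manual call is shown here for brevity.
print(cv_score(max_depth=3, learning_rate=0.1, n_estimators=100))
```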
## List of Services Powered by XGBoost
## List of Awards
- John Chambers Award - 2016 Winner: XGBoost, by Tong He (Simon Fraser University) and Tianqi Chen (University of Washington)