Jiaming Yuan 45aef75cca
Move skl eval_metric and early_stopping rounds to model params. (#6751)
A new parameter `custom_metric` is added to `train` and `cv` to distinguish its behaviour from the old `feval`, which is now deprecated.  The new `custom_metric` receives the transformed prediction when a built-in objective is used.  This enables XGBoost to use cost functions from other libraries like scikit-learn directly, without going through the definition of the link function.
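
As a rough illustration, here is a minimal sketch of passing a scikit-learn metric through the new `custom_metric` parameter of `train`.  The synthetic data, the metric choice, and the parameter values are assumptions made for the example, not part of the change itself.

```python
import numpy as np
import xgboost as xgb
from sklearn.metrics import log_loss

rng = np.random.default_rng(0)
X = rng.random((100, 10))
y = rng.integers(0, 2, size=100)
dtrain = xgb.DMatrix(X, label=y)

def sklearn_logloss(predt: np.ndarray, dmat: xgb.DMatrix):
    # With a built-in objective, `predt` is already transformed
    # (probabilities here), so a scikit-learn metric can be used directly.
    return "sk_logloss", log_loss(dmat.get_label(), predt)

booster = xgb.train(
    {"objective": "binary:logistic"},
    dtrain,
    num_boost_round=10,
    evals=[(dtrain, "train")],
    custom_metric=sklearn_logloss,  # replaces the deprecated `feval`
)
```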

`eval_metric` and `early_stopping_rounds` in the scikit-learn interface are moved from `fit` to `__init__` and are now saved as part of the scikit-learn model.  The old parameters in the `fit` function are now deprecated.  The new `eval_metric` in `__init__` has the same new behaviour as `custom_metric`.
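
A minimal sketch of the relocated parameters in the scikit-learn interface follows; the dataset and the specific values are illustrative assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X = rng.random((200, 10))
y = rng.integers(0, 2, size=200)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)

# `eval_metric` and `early_stopping_rounds` are now model parameters,
# saved as part of the scikit-learn model, instead of `fit` arguments.
clf = XGBClassifier(
    n_estimators=100,
    eval_metric="logloss",
    early_stopping_rounds=5,
)
clf.fit(X_train, y_train, eval_set=[(X_valid, y_valid)])
```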

Added more detailed documentation for the behaviour of custom objectives and metrics.

Awesome XGBoost

This page contains a curated list of examples, tutorials, and blog posts about XGBoost use cases. It is inspired by awesome-MXNet, awesome-php and awesome-machine-learning.

Please send a pull request if you find things that belong here.

Contents

Code Examples

Features Walkthrough

This is a list of short code examples introducing different functionalities of the xgboost packages.

Basic Examples by Tasks

Most of the examples in this section are based on the CLI or Python version. However, the parameter settings can be applied to all versions.

Benchmarks

Machine Learning Challenge Winning Solutions

XGBoost is extensively used by machine learning practitioners to create state-of-the-art data science solutions. This is a list of winning machine learning solutions built with XGBoost. Please send pull requests if you find ones that are missing here.

Talks

Tutorials

Usecases

If you have a particular use case of XGBoost that you would like to highlight, send a PR to add a one-sentence description. :)

  • XGBoost is used in Kaggle Script to solve data science challenges.
  • Deploy XGBoost as a REST API server from a Jupyter notebook with BentoML. Link to notebook
  • Seldon predictive service powered by XGBoost
  • XGBoost Distributed is used in ODPS Cloud Service by Alibaba (in Chinese)
  • XGBoost is incorporated as part of Graphlab Create for scalable machine learning.
  • Hanjing Su from Tencent data platform team: "We use distributed XGBoost for click-through prediction in WeChat shopping and lookalikes. The problems involve hundreds of millions of users and thousands of features. XGBoost is cleanly designed and can be easily integrated into our production environment, reducing our development cost."
  • CNevd from autohome.com ad platform team: "Distributed XGBoost is used for click-through rate prediction in our display advertising. XGBoost is highly efficient and flexible and can be easily used on our distributed platform. Our CTR improved greatly on hundreds of millions of samples and millions of features thanks to this awesome XGBoost."

Tools using XGBoost

  • BayesBoost - Bayesian Optimization using xgboost and sklearn API
  • FLAML - An open source AutoML library designed to automatically produce accurate machine learning models with low computational cost. FLAML includes XGBoost as one of the default learners and can also be used as a fast hyperparameter tuning tool for XGBoost (code example).
  • gp_xgboost_gridsearch - In-database parallel grid-search for XGBoost on Greenplum using PL/Python
  • tpot - A Python tool that automatically creates and optimizes machine learning pipelines using genetic programming.

Integrations with 3rd party software

Open source integrations with XGBoost:

  • Neptune.ai - Experiment management and collaboration tool for ML/DL/RL specialists. The integration takes the form of an XGBoost callback that automatically logs training and evaluation metrics, as well as the saved model (booster), feature importance chart, and visualized trees.
  • Optuna - An open source hyperparameter optimization framework to automate hyperparameter search. Optuna integrates with XGBoost through the XGBoostPruningCallback, which lets users easily prune unpromising trials (see the sketch after this list).
  • dtreeviz - A Python library for decision tree visualization and model interpretation. Starting from version 1.0, dtreeviz can visualize tree ensembles produced by XGBoost.
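
For readers curious what the Optuna integration mentioned above looks like in practice, here is a minimal sketch using XGBoostPruningCallback. The synthetic data, the search space, and the pruner choice are assumptions for illustration only.

```python
import numpy as np
import optuna
import xgboost as xgb
from optuna.integration import XGBoostPruningCallback

rng = np.random.default_rng(0)
X = rng.random((200, 10))
y = rng.integers(0, 2, size=200)
dtrain = xgb.DMatrix(X[:150], label=y[:150])
dvalid = xgb.DMatrix(X[150:], label=y[150:])

def objective(trial):
    params = {
        "objective": "binary:logistic",
        "eval_metric": "logloss",
        "max_depth": trial.suggest_int("max_depth", 2, 8),
        "eta": trial.suggest_float("eta", 1e-3, 0.3, log=True),
    }
    # Reports "validation-logloss" to Optuna after each boosting round so
    # unpromising trials can be stopped early.
    pruning_callback = XGBoostPruningCallback(trial, "validation-logloss")
    evals_result = {}
    xgb.train(
        params,
        dtrain,
        num_boost_round=100,
        evals=[(dvalid, "validation")],
        callbacks=[pruning_callback],
        evals_result=evals_result,
    )
    return evals_result["validation"]["logloss"][-1]

study = optuna.create_study(direction="minimize", pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=20)
```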

Awards

Windows Binaries

Unofficial Windows binaries and instructions on how to use them are hosted on Guido Tapia's blog.