Johan Manders e960a09ff4 Made eval_results for sklearn output the same structure as in the new training.py
Changed the name of eval_results to evals_result, so that the naming is the same in training.py and sklearn.py

Made the structure of evals_result the same as in training.py; only the names of the keys differ:

In sklearn.py you cannot name your evaluation sets; they are automatically called 'validation_0', 'validation_1', etc.
The evals_result dict will then look something like: {'validation_0': {'logloss': ['0.674800', '0.657121']}, 'validation_1': {'logloss': ['0.63776', '0.58372']}}
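
For illustration, a minimal sketch of the sklearn wrapper (synthetic data; the fit-time eval_metric argument reflects the API around this release, while later versions take it in the constructor instead):

    import numpy as np
    import xgboost as xgb
    from sklearn.model_selection import train_test_split

    # Synthetic binary classification data, for illustration only.
    X, y = np.random.rand(200, 5), np.random.randint(2, size=200)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3)

    clf = xgb.XGBClassifier(n_estimators=2)
    # Each (X, y) pair in eval_set becomes 'validation_0', 'validation_1', ...
    clf.fit(X_train, y_train,
            eval_set=[(X_train, y_train), (X_test, y_test)],
            eval_metric='logloss',
            verbose=False)

    print(clf.evals_result())
    # {'validation_0': {'logloss': [...]}, 'validation_1': {'logloss': [...]}}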

In training.py you can name your evaluation sets with a watchlist like: watchlist = [(dtest, 'eval'), (dtrain, 'train')]
The evals_result dict will then look something like: {'train': {'logloss': ['0.68495', '0.67691']}, 'eval': {'logloss': ['0.684877', '0.676767']}}
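
A matching sketch with the low-level training API (again with synthetic, illustrative data), where the watchlist names become the keys and the evals_result dict is filled in place:

    import numpy as np
    import xgboost as xgb

    # Synthetic binary classification data, for illustration only.
    X, y = np.random.rand(200, 5), np.random.randint(2, size=200)
    dtrain = xgb.DMatrix(X[:140], label=y[:140])
    dtest = xgb.DMatrix(X[140:], label=y[140:])

    watchlist = [(dtest, 'eval'), (dtrain, 'train')]
    evals_result = {}  # populated during training
    bst = xgb.train({'objective': 'binary:logistic', 'eval_metric': 'logloss'},
                    dtrain, num_boost_round=2,
                    evals=watchlist, evals_result=evals_result)

    print(evals_result)
    # {'train': {'logloss': [...]}, 'eval': {'logloss': [...]}}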

You can access these results through the evals_result() function, as the sketches above show.

eXtreme Gradient Boosting

Gitter chat for developers: https://gitter.im/dmlc/xgboost

An optimized, general-purpose gradient boosting library. The library is parallelized and also provides an optimized distributed version.

It implements machine learning algorithms under the Gradient Boosting framework, including the Generalized Linear Model (GLM) and Gradient Boosted Decision Trees (GBDT). XGBoost can also run distributed and scale to terascale data.

XGBoost is part of the Distributed Machine Learning Common (DMLC) projects.

Contents

What's New

Version

  • Current version xgboost-0.4
    • Change log
    • This version is compatible with the 0.3.x versions

Features

  • Easily accessible through the CLI, Python, R, and Julia (see the minimal Python sketch after this list)
  • It's fast! Benchmark numbers comparing xgboost, H2O, Spark, and R: benchm-ml numbers
  • Memory efficient: handles sparse matrices and supports external memory
  • Accurate prediction, used extensively by data scientists and Kagglers
  • The distributed version runs on Hadoop (YARN), MPI, SGE, etc., and scales to billions of examples.
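
As a taste of the Python interface, a minimal quick-start sketch (synthetic data; parameters are illustrative, not tuned):

    import numpy as np
    import xgboost as xgb

    # Synthetic binary classification data, for illustration only.
    X, y = np.random.rand(100, 4), np.random.randint(2, size=100)
    dtrain = xgb.DMatrix(X, label=y)

    params = {'max_depth': 2, 'eta': 1, 'objective': 'binary:logistic'}
    bst = xgb.train(params, dtrain, num_boost_round=10)
    preds = bst.predict(dtrain)  # probabilities for the positive class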

Bug Reporting

Contributing to XGBoost

XGBoost has been developed and used by a group of active community members. Everyone is more than welcome to contribute; it is a way to make the project better and more accessible to more users.

License

© Contributors, 2015. Licensed under the Apache-2.0 license.

XGBoost in Graphlab Create

  • XGBoost is adopted as part of the boosted tree toolkit in Graphlab Create (GLC). Graphlab Create is a powerful Python toolkit that lets you do data manipulation, graph processing, hyper-parameter search, and visualization of terabyte-scale data in one framework. Try Graphlab Create.
  • Nice blog post by Jay Gu about using GLC boosted trees to solve the Kaggle bike sharing challenge: