[LOG] Simplify README.md, add change logs.
parent 2dc6c2dc52
commit 263b7befde

CHANGES.md | 25

@@ -35,6 +35,7 @@ xgboost-0.4
* sklearn wrapper is supported in python module
* Experimental External memory version

xgboost-0.47
------------
* Changes in R library
@@ -52,10 +53,28 @@ xgboost-0.47
  - improved compatibility in sklearn module.
  - additional parameters added for sklearn wrapper.
  - added pip installation functionality.
  - supports more Pandas DataFrame dtypes.
  - added best_ntree_limit attribute, in addition to best_score and best_iteration.
* Java API is ready for use.
* Added more test cases and continuous integration to make each build more robust.
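
For readers new to these Python-side changes, here is a minimal sketch of how the sklearn wrapper, Pandas input, and the new best_ntree_limit attribute fit together. The data and variable names are hypothetical, and the API is shown as it stood around this release (newer versions have since reorganized early stopping):

```python
import numpy as np
import pandas as pd
import xgboost as xgb  # installable via pip, per the note above

# Hypothetical toy data in a Pandas DataFrame; xgboost reads its dtypes directly.
X = pd.DataFrame({"f0": np.random.rand(100), "f1": np.random.rand(100)})
y = (X["f0"] + X["f1"] > 1.0).astype(int)

# The sklearn wrapper exposes the familiar fit/predict interface.
clf = xgb.XGBClassifier(n_estimators=50)
clf.fit(X[:80], y[:80], eval_set=[(X[80:], y[80:])], early_stopping_rounds=5)

# After early stopping, best_ntree_limit complements best_score and best_iteration.
print(clf.best_score, clf.best_iteration, clf.best_ntree_limit)
```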

ongoing at master
------------------
xgboost brick: next release candidate
-------------------------------------
* Major refactor of core library.
  - Goal: more flexible and modular code as a portable library.
  - Switch to the C++11 standard.
  - Random number generator defaults to ```std::mt19937```.
  - Share the data loading pipeline and logging module from dmlc-core.
  - Enable registry pattern to allow optional plugins for objective, metric, tree constructor, and data loader (see the sketch after this list).
  - Future plugin modules can be put into xgboost/plugin and registered back to the library.
  - Replace most raw pointers with smart pointers for RAII safety.
* Change library name to libxgboost.so
* Backward compatibility
  - The binary buffer file is not backward compatible with previous versions.
  - The model file is backward compatible on 64-bit platforms.
* The model file is compatible between 64/32-bit platforms (not yet tested).
* External memory version and other advanced features will be exposed to the R library as well on Linux.
  - Previously some of the features were blocked due to C++11 and threading limits.
  - The Windows version is still blocked because Rtools does not support ```std::thread```.
* rabit and dmlc-core are maintained through git submodules
  - Anyone can open a PR to update these dependencies now.
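
The registry pattern mentioned in the refactor list is, in essence, a name-to-factory map that plugins add themselves to at load time. Below is a language-agnostic sketch of the idea in Python; the real mechanism is C++ macros from dmlc-core, so everything here (the decorator, class, and key names) is illustrative only:

```python
# Illustrative sketch of the registry idea; xgboost's actual registry is
# implemented with C++ macros from dmlc-core, not this Python code.
_objectives = {}

def register_objective(name):
    """Decorator that records a factory under a string key."""
    def decorate(factory):
        _objectives[name] = factory
        return factory
    return decorate

@register_objective("reg:linear")
class SquaredError:
    def gradient(self, pred, label):
        return pred - label  # first-order gradient of 0.5 * (pred - label)^2

def create_objective(name):
    # A plugin only needs to register itself; the core looks it up by name.
    return _objectives[name]()
```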

README.md | 39

@@ -7,47 +7,31 @@
[PyPI version](https://pypi.python.org/pypi/xgboost/)
[Gitter chat](https://gitter.im/dmlc/xgboost?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)

An optimized general purpose gradient boosting library. The library is parallelized, and also provides an optimized distributed version.

It implements machine learning algorithms under the [Gradient Boosting](https://en.wikipedia.org/wiki/Gradient_boosting) framework, including [Generalized Linear Model](https://en.wikipedia.org/wiki/Generalized_linear_model) (GLM) and [Gradient Boosted Decision Trees](https://en.wikipedia.org/wiki/Gradient_boosting#Gradient_tree_boosting) (GBDT). XGBoost can also be [distributed](#features) and scale to Terascale data.

XGBoost is part of [Distributed Machine Learning Common](http://dmlc.github.io/) projects.
XGBoost is an optimized distributed gradient boosting library designed to be highly *efficient*, *flexible* and *portable*.
It implements machine learning algorithms under the [Gradient Boosting](https://en.wikipedia.org/wiki/Gradient_boosting) framework.
XGBoost provides parallel tree boosting (also known as GBDT, GBM) that solves many data science problems in a fast and accurate way.
The same code runs on major distributed environments (Hadoop, SGE, MPI) and can solve problems beyond billions of examples.
XGBoost is part of [DMLC](http://dmlc.github.io/) projects.

Contents
--------
* [What's New](#whats-new)
* [Version](#version)
* [Documentation](doc/index.md)
* [Build Instruction](doc/build.md)
* [Features](#features)
* [Distributed XGBoost](multi-node)
* [Documentation](https://xgboost.readthedocs.org)
* [Usecases](doc/index.md#highlight-links)
* [Bug Reporting](#bug-reporting)
* [Contributing to XGBoost](#contributing-to-xgboost)
* [Code Examples](demo)
* [Build Instruction](doc/build.md)
* [Committers and Contributors](CONTRIBUTORS.md)
* [License](#license)
* [XGBoost in Graphlab Create](#xgboost-in-graphlab-create)

What's New
----------

* XGBoost [brick](CHANGES.md)
* XGBoost helped Vlad Mironov and Alexander Guschin win the [CERN LHCb experiment Flavour of Physics competition](https://www.kaggle.com/c/flavours-of-physics). Check out the [interview from Kaggle](http://blog.kaggle.com/2015/11/30/flavour-of-physics-technical-write-up-1st-place-go-polar-bears/).
* XGBoost helped Mario Filho, Josef Feigl, Lucas, and Gilberto win the [Caterpillar Tube Pricing competition](https://www.kaggle.com/c/caterpillar-tube-pricing). Check out the [interview from Kaggle](http://blog.kaggle.com/2015/09/22/caterpillar-winners-interview-1st-place-gilberto-josef-leustagos-mario/).
* XGBoost helped Halla Yang win the [Recruit Coupon Purchase Prediction Challenge](https://www.kaggle.com/c/coupon-purchase-prediction). Check out the [interview from Kaggle](http://blog.kaggle.com/2015/10/21/recruit-coupon-purchase-winners-interview-2nd-place-halla-yang/).
* XGBoost helped Owen Zhang win the [Avito Context Ad Click competition](https://www.kaggle.com/c/avito-context-ad-clicks). Check out the [interview from Kaggle](http://blog.kaggle.com/2015/08/26/avito-winners-interview-1st-place-owen-zhang/).
* XGBoost helped Chenglong Chen win the [Kaggle CrowdFlower Competition](https://www.kaggle.com/c/crowdflower-search-relevance). Check out the [winning solution](https://github.com/ChenglongChen/Kaggle_CrowdFlower).
* XGBoost-0.4 release, see [CHANGES.md](CHANGES.md#xgboost-04)
* XGBoost helped three champion teams win the [WWW2015 Microsoft Malware Classification Challenge (BIG 2015)](http://www.kaggle.com/c/malware-classification/forums/t/13490/say-no-to-overfitting-approaches-sharing). Check out the [winning solution](doc/README.md#highlight-links).
* [External Memory Version](doc/external_memory.md) (a usage sketch follows this list)
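
For the External Memory Version linked above, xgboost's documented convention is to append a cache-file prefix to the data path, so the training data is paged from disk rather than held fully in memory. A minimal sketch, with hypothetical file names:

```python
import xgboost as xgb

# The '#dtrain.cache' suffix asks xgboost to stream the data from disk,
# spilling to on-disk cache files instead of loading everything into RAM.
dtrain = xgb.DMatrix("train.libsvm#dtrain.cache")

params = {"objective": "binary:logistic", "eta": 0.1, "max_depth": 6}
bst = xgb.train(params, dtrain, num_boost_round=10)
```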

Version
-------

* Current version xgboost-0.4
  - [Change log](CHANGES.md)
  - This version is compatible with 0.3x versions
* Current version xgboost-0.6 (brick)
  - See the [change log](CHANGES.md) for details

Features
--------

@@ -76,4 +60,3 @@ XGBoost has been developed and used by a group of active community members. Ever
License
-------
© Contributors, 2015. Licensed under an [Apache-2](https://github.com/dmlc/xgboost/blob/master/LICENSE) license.