diff --git a/README.md b/README.md
index 85d89e691..3d23ce94a 100644
--- a/README.md
+++ b/README.md
@@ -15,19 +15,16 @@ Distributed Version: [Distributed XGBoost](multi-node)
 
 Notes on the Code: [Code Guide](src)
 
-Turorial and Documentation: https://github.com/dmlc/xgboost/wiki
-
-Video tutorial: [Better Optimization with Repeated Cross Validation and the XGBoost model - Machine Learning with R](https://www.youtube.com/watch?v=Og7CGAfSr_Y)
+Documentation: https://github.com/dmlc/xgboost/tree/master/doc
 
 Learning about the model: [Introduction to Boosted Trees](http://homes.cs.washington.edu/~tqchen/pdf/BoostedTree.pdf)
 * This slide is made by Tianqi Chen to introduce gradient boosting in a statistical view.
 * It present boosted tree learning as formal functional space optimization of defined objective.
 * The model presented is used by xgboost for boosted trees
 
-Presention of a real use case of XGBoost to prepare tax audit in France: [Feature Importance Analysis with XGBoost in Tax audit](http://fr.slideshare.net/MichaelBENESTY/feature-importance-analysis-with-xgboost-in-tax-audit)
-
 What's New
 ==========
+* [External Memory Version](doc/external_memory.md)
 * XGBoost wins [WWW2015 Microsoft Malware Classification Challenge (BIG 2015)](http://www.kaggle.com/c/malware-classification/forums/t/13490/say-no-to-overfitting-approaches-sharing)
 * XGBoost now support HDFS and S3
 * [Distributed XGBoost now runs on YARN](https://github.com/dmlc/wormhole/tree/master/learn/xgboost)!
diff --git a/doc/README.md b/doc/README.md
index 801e7e65a..18d15ba9c 100644
--- a/doc/README.md
+++ b/doc/README.md
@@ -1,6 +1,6 @@
 XGBoost Documentation
 ====
-This is an ongoing effort to move the [wiki document](https://github.com/dmlc/xgboost/wiki) to here.
+This is an ongoing effort to move the [wiki document](https://github.com/dmlc/xgboost/wiki) here; the most useful parts are already available.
 
 List of Documentations
 ====
@@ -13,7 +13,9 @@ Highlights Links
 This section is about blogposts, presentation and videos discussing how to use xgboost to solve your interesting problem. If you think something belongs to here, send a pull request.
 * Blogpost by phunther: [Winning solution of Kaggle Higgs competition: what a single model can do](http://no2147483647.wordpress.com/2014/09/17/winning-solution-of-kaggle-higgs-competition-what-a-single-model-can-do/)
 * [Kaggle Tradeshift winning solution by daxiongshu](https://github.com/daxiongshu/kaggle-tradeshift-winning-solution)
+* Video tutorial: [Better Optimization with Repeated Cross Validation and the XGBoost model - Machine Learning with R](https://www.youtube.com/watch?v=Og7CGAfSr_Y)
+* Presentation of a real-world use case of XGBoost for preparing tax audits in France: [Feature Importance Analysis with XGBoost in Tax audit](http://fr.slideshare.net/MichaelBENESTY/feature-importance-analysis-with-xgboost-in-tax-audit)
 
 Contribution
 ====
-Contribution of document usecases are welcomed!
+Contributions of documentation and use cases are welcome!
diff --git a/doc/external_memory.md b/doc/external_memory.md
index e98133467..0a5d66ac4 100644
--- a/doc/external_memory.md
+++ b/doc/external_memory.md
@@ -1,4 +1,4 @@
-Using XGBoost External Memory Version
+Using XGBoost External Memory Version (beta)
 ====
 There is no big difference between using external memory version and in-memory version.
 The only difference is the filename format.
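
As a quick illustration of the external-memory note in `doc/external_memory.md` above ("the only difference is the filename format"): a minimal Python sketch, assuming the convention of appending `#` plus a cache-file prefix to the data path. The file name `train.libsvm` and the prefix `dtrain.cache` are illustrative only; check the external-memory document for the exact format supported by your XGBoost version.

```python
import xgboost as xgb

# In-memory version: load the LIBSVM-format file directly.
dtrain = xgb.DMatrix('train.libsvm')

# External-memory version: same path, with '#' and a cache-file prefix appended.
# XGBoost then streams the data from disk, writing its cache pages to files
# whose names start with 'dtrain.cache'.
dtrain_ext = xgb.DMatrix('train.libsvm#dtrain.cache')

# Training code is identical for both versions.
params = {'objective': 'binary:logistic', 'max_depth': 6, 'eta': 0.3}
bst = xgb.train(params, dtrain_ext, num_boost_round=10)
```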