From 631b092b250454163e12fec802bd944ffb54a478 Mon Sep 17 00:00:00 2001
From: tqchen
Date: Sun, 18 Jan 2015 22:56:29 -0800
Subject: [PATCH] changes

---
 README.md            | 1 +
 multi-node/README.md | 3 ++-
 2 files changed, 3 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 6c7a538e7..8f031c9eb 100644
--- a/README.md
+++ b/README.md
@@ -19,6 +19,7 @@ Learning about the model: [Introduction to Boosted Trees](http://homes.cs.washin
 What's New
 =====
+* [Distributed XGBoost](multi-node) is now available, scaling to even larger problems
 * XGBoost wins [Tradeshift Text Classification](https://kaggle2.blob.core.windows.net/forum-message-attachments/60041/1813/TradeshiftTextClassification.pdf?sv=2012-02-12&se=2015-01-02T13%3A55%3A16Z&sr=b&sp=r&sig=5MHvyjCLESLexYcvbSRFumGQXCS7MVmfdBIY3y01tMk%3D)
 * XGBoost wins [HEP meets ML Award in Higgs Boson Challenge](http://atlas.ch/news/2014/machine-learning-wins-the-higgs-challenge.html)
 * Thanks to Bing Xu, [XGBoost.jl](https://github.com/antinucleon/XGBoost.jl) allows you to use xgboost from Julia
diff --git a/multi-node/README.md b/multi-node/README.md
index ce37daeab..752292d9a 100644
--- a/multi-node/README.md
+++ b/multi-node/README.md
@@ -10,7 +10,8 @@ This folder contains information of Distributed XGBoost.
 Build
 =====
-* In the root folder, run ```./build.sh```, this will give you xgboost, which uses rabit allreduce
+* In the root folder, type ```make```
+  - If you have a C++11 compiler, it is recommended to use ```make cxx11=1```
 Notes
 ====