
Benchmark for Otto Group Competition

This is a folder containing the benchmark for the Otto Group Competition on Kaggle.

Getting started

  1. Put train.csv and test.csv under the data folder
  2. Run the benchmark script
  3. Submit the generated submission.csv
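As a sanity check for step 3, Otto submissions are typically a CSV with an id column followed by nine class-probability columns (Class_1 through Class_9). The sketch below assumes that layout and writes a minimal dummy submission; the probability values are placeholders, not model output:

```python
import csv

# Assumed Otto submission layout: id plus nine class probabilities.
header = ["id"] + ["Class_%d" % i for i in range(1, 10)]

# One dummy test row whose probabilities sum to 1.0 (illustrative only).
rows = [[1, 0.9] + [0.0125] * 8]

with open("submission.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(header)
    writer.writerows(rows)
```

In a real run, each row would hold the predicted probabilities for one row of test.csv.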

The nthread parameter controls the number of cores used for training; set it to match your machine.
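For reference, here is a hedged sketch of how nthread fits into an xgboost parameter set for this nine-class task. The parameter names are standard xgboost options, but the values are illustrative and are not the benchmark script's actual settings:

```python
# Illustrative xgboost parameters for a nine-class problem like Otto.
# Values below are example settings, not the benchmark's tuned ones.
params = {
    "objective": "multi:softprob",  # emit per-class probabilities
    "num_class": 9,                 # Otto has nine product categories
    "eta": 0.1,                     # learning rate
    "max_depth": 6,                 # tree depth
    "nthread": 4,                   # cores to use; adjust for your machine
}
```

With the actual xgboost package installed, this dictionary would be passed to the training call alongside the DMatrix built from train.csv.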

R-package

To install the R-package of xgboost, please run

devtools::install_github('tqchen/xgboost',subdir='R-package')

Windows users may need to install Rtools first.