38 Commits

AbdealiJK
6f16f0ef58 Use bst_float consistently throughout (#1824)
* Fix various typos

* Add override to functions that are overridden

gcc warns about virtual functions that are overridden without being
marked as override. This fixes those warnings.

* Use bst_float consistently

Use bst_float for all the variables that involve weight,
leaf value, gradient, hessian, gain, loss_chg, predictions,
base_margin, feature values.

In cases where accumulation (additions and the like) can produce
larger values, double is used instead.

This ensures that type conversions are minimal and reduces loss of
precision.
2016-11-30 10:02:10 -08:00
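
The bst_float convention above can be illustrated with a short sketch. This is not code from the pull request; it only assumes that bst_float is the library's 32-bit float typedef, and the helper CalcLeafValue is hypothetical.

```cpp
#include <cstddef>
#include <vector>

typedef float bst_float;  // assumption: xgboost's 32-bit floating-point type

// Hypothetical helper: per-instance gradients/hessians are stored as bst_float,
// the running sums use double (the "larger value" case), and the result is cast
// back to bst_float so downstream code sees one consistent type.
inline bst_float CalcLeafValue(const std::vector<bst_float>& grad,
                               const std::vector<bst_float>& hess,
                               double lambda) {
  double sum_grad = 0.0, sum_hess = 0.0;
  for (std::size_t i = 0; i < grad.size(); ++i) {
    sum_grad += grad[i];
    sum_hess += hess[i];
  }
  return static_cast<bst_float>(-sum_grad / (sum_hess + lambda));
}
```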
AbdealiJK
b94fcab4dc Add dump_format=json option (#1726)
* Add format to the params accepted by DumpModel

Currently, only the text format is supported when trying to dump
a model. The plan is to add more such formats, like JSON, which are
easy to read and/or parse by machines, and to make the interface
generic enough to allow other formats to be added.

Hence, we make some modifications to make these functions generic
and accept a new parameter "format" which specifies the format of
the dump to be created.

* Fix typos and errors in docs

* plugin: Mention all the register macros available

Document the register macros currently available to the plugin
writers so they know what exactly can be extended using hooks.

* sparse_page_source: Use same arg name in .h and .cc

* gbm: Add JSON dump

The dump_format argument can be used to specify what type
of dump file should be created. Add functionality to dump
gblinear and gbtree into a JSON file.

The JSON file contains an array, and each item is a JSON object:
For gblinear:
 - The item is the bias and weights vectors
For gbtree:
 - The item is the root node. The root node has an attribute "children"
   which holds the child nodes; this nesting continues recursively.

* core.py: Add arg dump_format for get_dump()
2016-11-04 09:55:25 -07:00
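
A hedged usage sketch for the new dump format, assuming the format-aware C-API entry point is XGBoosterDumpModelEx with the argument order shown (handle, feature-map path, with_stats, format, out_len, out_dump_array) and an already-trained booster handle:

```cpp
#include <cstdio>
#include <xgboost/c_api.h>

// Prints each tree (or the linear model) of a trained booster as JSON.
int DumpBoosterAsJson(BoosterHandle booster) {
  bst_ulong n_dumps = 0;
  const char **dumps = nullptr;
  // "" = no feature map file, 0 = without split statistics, "json" = dump format.
  int rc = XGBoosterDumpModelEx(booster, "", 0, "json", &n_dumps, &dumps);
  if (rc != 0) {
    return rc;  // XGBGetLastError() would describe the failure
  }
  for (bst_ulong i = 0; i < n_dumps; ++i) {
    std::printf("%s\n", dumps[i]);  // one JSON object per array item
  }
  return 0;
}
```

From Python, the same dump should be reachable through the get_dump() argument mentioned above, e.g. booster.get_dump(dump_format='json').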
AbdealiJK
378eb7d7c8 Fix typos and messages in docs (#1723) 2016-10-30 22:52:19 -07:00
Adam Pocock
445029bb82 [jvm-packages] XGBoost4j Windows fixes (#1639)
* Changes for Mingw64 compilation to ensure long is a consistent size.

Mainly impacts the Java API, which would not compile, but there may be
silent errors on Windows with large datasets before this patch (as long
is 32 bits when compiled with mingw64, even in 64-bit mode).

* Adding ifdefs to ensure it still compiles on MacOS

* Makefile and create_jni.bat changes for Windows.

* Switching XGDMatrixCreateFromCSREx JNI call to use size_t cast

* Fixing a lint error, adding profile switching to the jvm-packages build so that create_jni.bat gets called, and adding myself to CONTRIBUTORS.md
2016-10-18 08:35:25 -04:00
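
The 32-bit long pitfall this commit works around is easy to demonstrate; the standalone sketch below is illustrative only and is not taken from the patch.

```cpp
#include <cstdint>
#include <cstdio>

int main() {
  // On 64-bit Windows (LLP64, including mingw64 builds) long stays 4 bytes,
  // while on typical 64-bit Linux/macOS (LP64) it is 8 bytes.
  std::printf("sizeof(long)   = %llu\n",
              static_cast<unsigned long long>(sizeof(long)));
  std::printf("sizeof(size_t) = %llu\n",
              static_cast<unsigned long long>(sizeof(size_t)));

  // An element count just past 2^32 would silently truncate if funneled
  // through a 32-bit long; size_t keeps it intact on 64-bit targets.
  std::uint64_t n_elements = (1ULL << 32) + 1;
  std::size_t preserved = static_cast<std::size_t>(n_elements);
  std::printf("n_elements     = %llu\n",
              static_cast<unsigned long long>(preserved));
  return 0;
}
```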
Vadim Khotilovich
693ddb860e More robust DMatrix creation from a sparse matrix (#1606)
* [CORE] DMatrix from sparse w/ explicit #col #row; safer arg types

* [python-package] c-api change for _init_from_csr _init_from_csc

* fix spaces

* [R-package] adopt the new XGDMatrixCreateFromCSCEx interface

* [CORE] redirect old sparse creators to new ones
2016-09-25 10:01:22 -07:00
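
A small sketch of the explicit-shape CSR path added here, assuming the signature XGDMatrixCreateFromCSREx(indptr, indices, data, nindptr, nelem, num_col, &out) with size_t shape arguments; the values are illustrative.

```cpp
#include <xgboost/c_api.h>

// Builds a 2x4 DMatrix from CSR arrays: row 0 holds {col 1: 1.0}, row 1 holds
// {col 0: 2.0, col 3: 3.0}.  Passing num_col explicitly keeps trailing
// all-zero columns, which a shape-inferring creator could silently drop.
int CreateSmallCsrDMatrix(DMatrixHandle *out) {
  const size_t   indptr[]  = {0, 1, 3};     // nindptr = rows + 1 = 3
  const unsigned indices[] = {1, 0, 3};
  const float    data[]    = {1.0f, 2.0f, 3.0f};
  return XGDMatrixCreateFromCSREx(indptr, indices, data,
                                  /*nindptr=*/3, /*nelem=*/3,
                                  /*num_col=*/4, out);
}
```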
Tianqi Chen
ecec5f7959 [CORE] Refactor cache mechanism (#1540) 2016-09-02 20:39:07 -07:00
Tianqi Chen
df38f251be Fix warnings from g++5 or higher (#1510) 2016-08-26 16:14:10 -07:00
RAMitchell
93196eb811 cmake build system (#1314)
* Changed c api to compile under MSVC

* Include functional.h header for MSVC

* Add cmake build
2016-07-02 19:07:35 -07:00
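
On the "functional.h" bullet above: a hedged reading is that MSVC does not pull in the standard <functional> header transitively the way libstdc++ often does, so code using std::function has to include it directly, as in this minimal illustration.

```cpp
#include <functional>  // MSVC needs this include explicitly for std::function
#include <cstdio>

int main() {
  std::function<int(int)> square = [](int x) { return x * x; };
  std::printf("%d\n", square(7));  // prints 49
  return 0;
}
```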
Vadim Khotilovich
26b36714ea doxygen suggested fix 2016-05-15 03:05:19 -05:00
Vadim Khotilovich
ea9285dd4f methods to delete an attribute and get names of available attributes 2016-05-14 18:19:18 -05:00
Wojciech Migda
6a5eb47789 XGBoosterCreate api unified to use const DMatrix[] argument 2016-03-26 19:42:58 +01:00
tqchen
86871d4be9 [JVM] Add Iterator loading API 2016-03-04 17:37:46 -08:00
tqchen
ecb3a271be [PYTHON-DIST] Distributed xgboost python training API. 2016-02-29 16:54:13 -08:00
tqchen
4a16b729fc [PYTHON] Simplify training logic, update rabit lib 2016-02-28 13:20:55 -08:00
tqchen
2f2080a337 [TREE] Remove gap constraint, make tree construction more robust 2016-02-10 11:17:54 -08:00
tqchen
88447ca32e [MEM] Add rowset struct to save memory with billion level rows 2016-02-10 11:17:17 -08:00
tqchen
1495a43cea [R] make all customizations to meet strict standard of cran 2016-01-16 10:25:12 -08:00
tqchen
634db18a0f [TRAVIS] cleanup travis script 2016-01-16 10:25:12 -08:00
tqchen
ef1021e759 [IO] Enable external memory 2016-01-16 10:24:01 -08:00
tqchen
d75e3ed05d [LIBXGBOOST] pass demo running. 2016-01-16 10:24:01 -08:00
tqchen
cee148ed64 [CLI] initial refactor of CLI 2016-01-16 10:24:01 -08:00
tqchen
0d95e863c9 [LEARNER] refactor learner 2016-01-16 10:24:01 -08:00
tqchen
4b4b36d047 [GBM] remove need to explicit InitModel, rename save/load 2016-01-16 10:24:01 -08:00
tqchen
82ceb4de0a [LEARNER] Init learner interface 2016-01-16 10:24:01 -08:00
tqchen
9042b9e2c7 [GBM] Finish migrate all gbms 2016-01-16 10:24:01 -08:00
tqchen
e4567bbc47 [REFACTOR] Add alias, allow missing variables, init gbm interface 2016-01-16 10:24:01 -08:00
tqchen
4f26d98150 [Update] remove rabit subtree, use submodule, move code 2016-01-16 10:24:01 -08:00
tqchen
d4677b6561 [TREE] finish move of updater 2016-01-16 10:24:01 -08:00
tqchen
20043f63a6 [TREE] Move colmaker 2016-01-16 10:24:01 -08:00
tqchen
c8ccb61b9e [TREE] Enable updater registry 2016-01-16 10:24:01 -08:00
tqchen
a62a66d545 [TREE] Finalize regression tree refactor 2016-01-16 10:24:01 -08:00
tqchen
844e8a153d [TREE] Refactor to new logging 2016-01-16 10:24:01 -08:00
tqchen
05115adbff [TREE] move tree model 2016-01-16 10:24:01 -08:00
tqchen
b4d0bb5a6d [METRIC] all metric move finished 2016-01-16 10:24:01 -08:00
tqchen
dedd87662b [OBJ] Add basic objective function and registry 2016-01-16 10:24:01 -08:00
tqchen
46bcba7173 [DATA] basic data refactor done, basic version of csr source. 2016-01-16 10:24:00 -08:00
tqchen
7ff91fe5f9 Data interface ready 2016-01-16 10:24:00 -08:00
tqchen
d530e0c14f [REFACTOR] cleanup structure 2016-01-16 10:24:00 -08:00