
This is a fork of XGBoost from https://github.com/tqchen/xgboost.

The main repo already contains two Windows projects: one porting the executable and one porting the Python library.

This fork adds:

  1. A C# DLL wrapper (the bridge from unmanaged to managed code): https://github.com/giuliohome/xgboost/tree/master/windows/xgboost_sharp_wrapper

  2. The C# Higgs Kaggle demo, replacing the Python one (the C# version actually achieves a higher score, due to some changes I've made): https://github.com/giuliohome/xgboost/tree/master/windows/kaggle_higgs_demo
     Start the demo from the root folder like this:

         bin\x64\Debug\kaggle_higgs_demo.exe training_path.csv test_path.csv sharp_pred.csv

  3. A 5-fold cross-validation implementation in C# for the demo: the CV AMS is reported inline during training (computed, of course, on a completely separate hold-out set).
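For reference, the AMS reported during cross-validation is the Approximate Median Significance metric defined by the Higgs Boson Machine Learning Challenge. Below is a minimal C# sketch of that formula only; the class and method names are illustrative and are not taken from this repo's code:

```csharp
using System;

static class HiggsMetric
{
    // AMS = sqrt(2 * ((s + b + bReg) * ln(1 + s / (b + bReg)) - s))
    // where s is the weighted sum of true positives, b the weighted sum
    // of false positives, and bReg = 10 is the regularization constant
    // fixed by the challenge rules.
    public static double Ams(double s, double b, double bReg = 10.0)
    {
        double radicand = 2.0 * ((s + b + bReg) * Math.Log(1.0 + s / (b + bReg)) - s);
        return Math.Sqrt(radicand);
    }
}
```

In a k-fold setup, each fold's AMS is computed on its held-out validation split, never on the training rows used to fit that fold's booster.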

XGBoost is a scalable, portable and distributed gradient boosting (GBDT, GBRT, GBM) library for Python, R, Java, Scala, C++ and more. It runs on a single machine as well as on Hadoop, Spark, Dask, Flink and DataFlow.