Backport doc fixes that are compatible with 0.72 release

* Added python doc string for nthreads to dmatrix (#3363)
* Add instruction to compile XGBoost4J multi-threaded on OSX (#3228)
* Params confusion fixed: num_round, scalePosWeight (#3386)
* Added detailed instruction for adding XGBoost4J as Maven dependency (#3374)
Author: ngoyal2707
Date: 2018-06-07 19:16:30 -07:00
Committed by: Philip Cho
Parent commit: 23bc3fc4aa
Commit: d0f45bede0
5 changed files with 74 additions and 9 deletions


@@ -57,6 +57,8 @@ the add dependency as following:
"ml.dmlc" % "xgboost4j" % "latest_version_num"
```
For the latest release version number, please check [here](https://github.com/dmlc/xgboost/releases).
if you want to use `xgboost4j-spark`, you just need to replace xgboost4j with `xgboost4j-spark`
## Examples
@@ -70,4 +72,4 @@ be found in the [examples package](https://github.com/dmlc/xgboost/tree/master/j
* Spark does the internal conversion, and does not accept formats that are 0-based
-* Whereas, use *0-based* indexes format when predicting in normal mode - for instance, while using the saved model in the Python package
+* Whereas, use *0-based* indexes format when predicting in normal mode - for instance, while using the saved model in the Python package
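As a concrete illustration of the indexing note above (a sketch, not part of the commit): a 0-based LibSVM line can be shifted to the 1-based form used when training through XGBoost4J-Spark by incrementing each feature index. The `toOneBased` helper below is hypothetical.

```scala
// Hypothetical helper: convert a 0-based LibSVM line to the 1-based
// indexing used when training through XGBoost4J-Spark.
def toOneBased(line: String): String = {
  val parts = line.split(" ")
  // parts.head is the label; the rest are "index:value" pairs
  val shifted = parts.tail.map { kv =>
    val Array(idx, value) = kv.split(":")
    s"${idx.toInt + 1}:$value"
  }
  (parts.head +: shifted).mkString(" ")
}

println(toOneBased("1 0:1.5 3:2.0")) // 1 1:1.5 4:2.0
```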


@@ -154,7 +154,7 @@ trait BoosterParams extends Params {
/**
* Control the balance of positive and negative weights, useful for unbalanced classes. A typical
-   * value to consider: sum(negative cases) / sum(positive cases). [default=0]
+   * value to consider: sum(negative cases) / sum(positive cases). [default=1]
*/
val scalePosWeight = new DoubleParam(this, "scale_pos_weight", "Control the balance of positive" +
" and negative weights, useful for unbalanced classes. A typical value to consider:" +
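The corrected comment above gives the usual heuristic for `scale_pos_weight`: sum(negative cases) / sum(positive cases). A minimal sketch of that computation over 0/1 labels (the helper name is illustrative, not part of the library):

```scala
// Illustrative: compute the typical scale_pos_weight value,
// sum(negative cases) / sum(positive cases), from 0/1 labels.
def typicalScalePosWeight(labels: Seq[Double]): Double = {
  val pos = labels.count(_ == 1.0)
  val neg = labels.count(_ == 0.0)
  require(pos > 0, "need at least one positive case")
  neg.toDouble / pos
}

println(typicalScalePosWeight(Seq(0.0, 0.0, 0.0, 1.0))) // 3.0
```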


@@ -90,7 +90,7 @@ class XGBoostSparkPipelinePersistence extends FunSuite with PerTest
.setInputCols(df.columns
.filter(!_.contains("label")))
.setOutputCol("features")
-    val xgbEstimator = new XGBoostEstimator(Map("num_rounds" -> 10,
+    val xgbEstimator = new XGBoostEstimator(Map("num_round" -> 10,
"tracker_conf" -> TrackerConf(60 * 60 * 1000, "scala")
)).setFeaturesCol("features").setLabelCol("label")
// separate
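The test fix above renames the booster parameter from `num_rounds` to `num_round`, the key XGBoost actually reads; per the commit message, the misspelled key was a source of confusion. A self-contained sketch of guarding against such typos (the `knownParams` set here is a partial, hypothetical list, not the library's own validation):

```scala
// Illustrative typo guard: check user-supplied keys against a (partial,
// hypothetical) set of recognized XGBoost parameter names.
val knownParams = Set("num_round", "scale_pos_weight", "eta", "max_depth")

def unknownKeys(params: Map[String, Any]): Set[String] =
  params.keySet.diff(knownParams)

println(unknownKeys(Map("num_rounds" -> 10))) // Set(num_rounds)
println(unknownKeys(Map("num_round" -> 10)))  // Set()
```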