Feature weights (#5962)

Jiaming Yuan
2020-08-18 19:55:41 +08:00
committed by GitHub
parent a418278064
commit 4d99c58a5f
25 changed files with 509 additions and 104 deletions


@@ -107,6 +107,10 @@ Parameters for Tree Booster
'colsample_bynode':0.5}`` with 64 features will leave 8 features to choose from at
each split.
On the Python interface, one can set the ``feature_weights`` for DMatrix to define the
probability of each feature being selected when using column sampling. There's a
similar parameter for the ``fit`` method in the sklearn interface.
* ``lambda`` [default=1, alias: ``reg_lambda``]
- L2 regularization term on weights. Increasing this value will make model more conservative.
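The idea behind ``feature_weights`` can be sketched with plain NumPy: column sampling draws features with probability proportional to their weight. This is a minimal illustration of the sampling behavior described above, not XGBoost's internal sampler; the array values are made-up examples.

```python
import numpy as np

# Hypothetical weights for 4 features: the last two are 4x more likely
# to be considered than the first two when columns are subsampled.
feature_weights = np.array([1.0, 1.0, 4.0, 4.0])
probs = feature_weights / feature_weights.sum()

# Mimic 'colsample_bynode': pick 2 of the 4 columns without replacement,
# with selection probability proportional to each feature's weight.
rng = np.random.default_rng(0)
chosen = rng.choice(4, size=2, replace=False, p=probs)
print(sorted(chosen))
```

With the actual library, the weights would instead be attached to the data, e.g. via something like ``dmatrix.set_info(feature_weights=...)`` on a ``DMatrix`` or the corresponding ``fit`` argument in the sklearn wrapper (exact call names per the XGBoost Python API docs).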
@@ -224,7 +228,7 @@ Parameters for Tree Booster
list is a group of indices of features that are allowed to interact with each other.
See tutorial for more information
Additional parameters for ``hist`` and ```gpu_hist`` tree method
Additional parameters for ``hist`` and ``gpu_hist`` tree method
================================================================
* ``single_precision_histogram``, [default=``false``]