Cleanup configuration for constraints. (#7758)

This commit is contained in:
Jiaming Yuan
2022-03-29 04:22:46 +08:00
committed by GitHub
parent 3c9b04460a
commit a50b84244e
5 changed files with 53 additions and 42 deletions


@@ -134,7 +134,7 @@ Following table summarizes some differences in supported features between 4 tree
+------------------+-----------+---------------------+---------------------+------------------------+
| categorical data | F | T | T | T |
+------------------+-----------+---------------------+---------------------+------------------------+
-| External memory | F | T | P | P |
+| External memory | F | T | T | P |
+------------------+-----------+---------------------+---------------------+------------------------+
| Distributed | F | T | T | T |
+------------------+-----------+---------------------+---------------------+------------------------+


@@ -174,6 +174,14 @@ parameter:
                                   num_boost_round = 1000, evals = evallist,
                                   early_stopping_rounds = 10)

+***************************
+Using feature names instead
+***************************
+
+XGBoost's Python package supports using feature names instead of feature indices for
+specifying the constraints. Given a data frame with columns ``["f0", "f1", "f2"]``, the
+feature interaction constraint can be specified as ``[["f0", "f2"]]``.
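The name-based form is equivalent to the index-based one. A minimal standalone sketch of the translation (``names_to_indices`` is a hypothetical helper for illustration, not part of XGBoost's API):

```python
def names_to_indices(columns, constraints):
    """Map name-based interaction constraint groups to column indices.

    Illustrative only: shows how ``[["f0", "f2"]]`` relates to ``[[0, 2]]``
    for a data frame with columns ``["f0", "f1", "f2"]``.
    """
    pos = {name: i for i, name in enumerate(columns)}
    return [[pos[name] for name in group] for group in constraints]


print(names_to_indices(["f0", "f1", "f2"], [["f0", "f2"]]))  # -> [[0, 2]]
```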
**************
Advanced topic
**************


@@ -69,7 +69,7 @@ Then fitting with monotonicity constraints only requires adding a single paramet
.. code-block:: python

     params_constrained = params.copy()
-    params_constrained['monotone_constraints'] = "(1,-1)"
+    params_constrained['monotone_constraints'] = (1,-1)

     model_with_constraints = xgb.train(params_constrained, dtrain,
                                        num_boost_round = 1000, evals = evallist,
@@ -90,3 +90,13 @@ monotonic constraints may produce unnecessarily shallow trees. This is because t
split. Monotonic constraints may wipe out all available split candidates, in which case no
split is made. To reduce the effect, you may want to increase the ``max_bin`` parameter to
consider more split candidates.
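As a hedged sketch of the workaround described above (``tree_method``, ``monotone_constraints``, and ``max_bin`` are real XGBoost parameters; the values chosen here are only illustrative, not a recommendation):

```python
# Sketch: raising ``max_bin`` above its default of 256 gives the histogram
# tree method more split candidates, which helps when monotonic constraints
# rule many of them out.
params_constrained = {
    "tree_method": "hist",
    "monotone_constraints": (1, -1),
    "max_bin": 512,  # default is 256
}
```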
+*******************
+Using feature names
+*******************
+
+XGBoost's Python package supports using feature names instead of feature indices for
+specifying the constraints. Given a data frame with columns ``["f0", "f1", "f2"]``, the
+monotonic constraint can be specified as ``{"f0": 1, "f2": -1}``, and ``"f1"`` will
+default to ``0`` (no constraint).
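The dictionary form is shorthand for the positional tuple shown earlier. A minimal sketch of the expansion (``names_to_tuple`` is a hypothetical helper for illustration, not XGBoost API):

```python
def names_to_tuple(columns, constraints):
    """Expand a name-keyed monotone constraint dict to positional form.

    Features not mentioned in the dict default to 0 (unconstrained).
    Illustrative only.
    """
    return tuple(constraints.get(name, 0) for name in columns)


print(names_to_tuple(["f0", "f1", "f2"], {"f0": 1, "f2": -1}))  # -> (1, 0, -1)
```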