Clarification for learning_rates

This commit is contained in:
Yuan (Terry) Tang 2015-11-08 21:10:04 -06:00
parent 4db3dfee7d
commit b8bc85b534


@ -50,7 +50,9 @@ def train(params, dtrain, num_boost_round=10, evals=(), obj=None, feval=None,
If `verbose_eval` then the evaluation metric on the validation set, if
given, is printed at each boosting stage.
learning_rates: list or function
List of learning rates for each boosting round,
or a customized function that calculates eta in terms of the
current round number and the total number of boosting rounds (e.g. yields learning rate decay)
- list l: eta = l[boosting round]
- function f: eta = f(boosting round, num_boost_round)
xgb_model : file name of stored xgb model or 'Booster' instance
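A minimal sketch of the two accepted forms of `learning_rates` described in the docstring above. The decay constants (base eta 0.3, halving over training) are illustrative assumptions, not values from the commit; the `xgb.train` calls are shown only in comments since they require a dataset.

```python
def eta_decay(boosting_round, num_boost_round):
    """Function form: eta = f(boosting round, num_boost_round).

    Illustrative schedule: eta starts at 0.3 and halves over the
    course of training (hypothetical constants, not from the commit).
    """
    return 0.3 * (0.5 ** (boosting_round / max(num_boost_round - 1, 1)))

num_boost_round = 10

# List form: one eta per boosting round, looked up as eta = l[boosting round]
etas = [eta_decay(i, num_boost_round) for i in range(num_boost_round)]

# Either form could then be passed to train(), e.g.:
#   bst = xgb.train(params, dtrain, num_boost_round,
#                   learning_rates=etas)       # list form
#   bst = xgb.train(params, dtrain, num_boost_round,
#                   learning_rates=eta_decay)  # function form
```

The function form avoids materializing the schedule up front and automatically adapts if `num_boost_round` changes.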