Fix typos, addressing issues #2212 and #3090 (#3105)

This commit is contained in:
Philip Hyunsu Cho
2018-02-09 11:16:44 -08:00
committed by GitHub
parent 81d1b17f9c
commit 375d75304d
2 changed files with 2 additions and 2 deletions


@@ -3,7 +3,7 @@ DART booster
[XGBoost](https://github.com/dmlc/xgboost) mostly combines a huge number of regression trees with a small learning rate.
In this situation, trees added early are significant and trees added late are unimportant.
- Rasmi et al. proposed a new method to add dropout techniques from the deep neural net community to boosted trees, and reported better results in some situations.
+ Vinayak and Gilad-Bachrach proposed a new method to add dropout techniques from the deep neural net community to boosted trees, and reported better results in some situations.
This is an introduction to the new tree booster `dart`.