DOC: Updated contributors.md

terrytangyuan 2015-10-04 23:26:46 -05:00
parent 5dd23a2195
commit 9d627e2567
2 changed files with 13 additions and 7 deletions

View File

@@ -33,8 +33,9 @@ List of Contributors
   - Skipper is the major contributor to the scikit-learn module of xgboost.
 * [Zygmunt Zając](https://github.com/zygmuntz)
   - Zygmunt is the master behind the early stopping feature frequently used by kagglers.
-* [Ajinkya Kale](https://github.com/ajkl)
+* [Yuan Tang](https://github.com/terrytangyuan)
+  - Yuan is the major contributor to unit tests in R and Python.
+* [Ajinkya Kale](https://github.com/ajkl)
 * [Boliang Chen](https://github.com/cblsjtu)
 * [Vadim Khotilovich](https://github.com/khotilov)
 * [Yangqing Men](https://github.com/yanqingmen)

View File

@@ -1,9 +1,14 @@
 import xgboost as xgb
 from sklearn.datasets import load_digits
 from sklearn.cross_validation import KFold, train_test_split
+def test_early_stopping_nonparallel():
+    digits = load_digits(2)
+    X = digits['data']
+    y = digits['target']
+    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
+    clf = xgb.XGBClassifier()
+    clf.fit(X_train, y_train, early_stopping_rounds=10, eval_metric="auc",
+            eval_set=[(X_test, y_test)])
-X = digits['data']
-y = digits['target']
-X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
-clf = xgb.XGBClassifier()
-clf.fit(X_train, y_train, early_stopping_rounds=10, eval_metric="auc",
-        eval_set=[(X_test, y_test)])
+# todo: parallel test for early stopping
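
The todo comment above asks for a parallel counterpart to this test. As a rough illustration only (not part of this commit), the sketch below re-runs the same early-stopping fit with multi-threaded training; the nthread argument and the best_iteration attribute are assumptions about the sklearn wrapper's API of that era, not something this diff introduces.

# Hypothetical sketch of the parallel test hinted at by the todo above;
# not part of this commit. It assumes XGBClassifier accepts an nthread
# argument and exposes best_iteration after fitting with early stopping.
import xgboost as xgb
from sklearn.datasets import load_digits
from sklearn.cross_validation import train_test_split

def test_early_stopping_parallel():
    digits = load_digits(2)
    X = digits['data']
    y = digits['target']
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    # Same early-stopping setup as the nonparallel test, but with several threads.
    clf = xgb.XGBClassifier(nthread=4)
    clf.fit(X_train, y_train, early_stopping_rounds=10, eval_metric="auc",
            eval_set=[(X_test, y_test)])
    # The best round found on the eval set should not exceed the boosting budget.
    assert clf.best_iteration <= clf.n_estimators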