From ba63b2886f32894c15caab87012f0862fbdd9242 Mon Sep 17 00:00:00 2001
From: Will Stanton
Date: Wed, 22 Jul 2015 10:37:49 -0600
Subject: [PATCH] Check out vs. checkout

Made it consistent across the README
---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 21d15ce56..97e348b43 100644
--- a/README.md
+++ b/README.md
@@ -6,7 +6,7 @@ DMLC/XGBoost
 An optimized general purpose gradient boosting library. The library is parallelized, and also provides an optimized distributed version.
 It implements machine learning algorithms under the [Gradient Boosting](https://en.wikipedia.org/wiki/Gradient_boosting) framework, including [Generalized Linear Model](https://en.wikipedia.org/wiki/Generalized_linear_model) (GLM) and [Gradient Boosted Decision Trees](https://en.wikipedia.org/wiki/Gradient_boosting#Gradient_tree_boosting) (GBDT). XGBoost can also be [distributed](#features) and scale to Terascale data
 
-Checkout our [Committers and Contributors](CONTRIBUTORS.md) who help make xgboost better.
+Check out our [Committers and Contributors](CONTRIBUTORS.md) who help make xgboost better.
 
 Documentation: [Documentation of dmlc/xgboost](doc/README.md)
 
@@ -27,7 +27,7 @@ XGBoost is part of [Distributed Machine Learning Common](http://dmlc.github.io/)
 What's New
 ==========
 * XGBoost helps Chenglong Chen to win [Kaggle CrowdFlower Competition](https://www.kaggle.com/c/crowdflower-search-relevance)
-  - Checkout the winning solution at [Highlight links](doc/README.md#highlight-links)
+  - Check out the winning solution at [Highlight links](doc/README.md#highlight-links)
 * XGBoost-0.4 release, see [CHANGES.md](CHANGES.md#xgboost-04)
 * XGBoost helps three champion teams to win [WWW2015 Microsoft Malware Classification Challenge (BIG 2015)](http://www.kaggle.com/c/malware-classification/forums/t/13490/say-no-to-overfitting-approaches-sharing)
   - Check out the winning solution at [Highlight links](doc/README.md#highlight-links)