diff --git a/doc/tutorials/model.rst b/doc/tutorials/model.rst
index a4c2cf98c..4720e7250 100644
--- a/doc/tutorials/model.rst
+++ b/doc/tutorials/model.rst
@@ -72,8 +72,7 @@ Decision Tree Ensembles
***********************
Now that we have introduced the elements of supervised learning, let us get started with real trees.
To begin with, let us first learn about the model choice of XGBoost: **decision tree ensembles**.
-The tree ensemble model consists of a set of classification and regression trees (CART). Here's a simple example of a CART
-that classifies whether someone will like computer games.
+The tree ensemble model consists of a set of classification and regression trees (CART). Here's a simple example of a CART that classifies whether someone will like a hypothetical computer game X.
.. image:: https://raw.githubusercontent.com/dmlc/web-data/master/xgboost/model/cart.png
:width: 100%
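+
+A quick way to make this concrete (a minimal sketch, assuming the ``xgboost`` and ``scikit-learn`` Python packages are installed; the data is synthetic rather than the game example above): each tree in the ensemble is a CART whose internal nodes split on a feature value and whose leaves carry real-valued scores.
+
+.. code-block:: python
+
+  import xgboost as xgb
+  from sklearn.datasets import make_classification
+
+  # Synthetic binary classification data standing in for the example above.
+  X, y = make_classification(n_samples=200, n_features=5, random_state=0)
+
+  # A tiny ensemble of two shallow trees.
+  model = xgb.XGBClassifier(n_estimators=2, max_depth=2)
+  model.fit(X, y)
+
+  # Each member of the ensemble is a CART; print its splits and leaf scores.
+  for i, tree in enumerate(model.get_booster().get_dump()):
+      print(f"Tree {i}:\n{tree}")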
@@ -257,7 +256,7 @@ A left to right scan is sufficient to calculate the structure score of all possi
.. note:: Limitation of additive tree learning
-  Since it is intractable to enumerate all possible tree structures, we add one split at a time. This approach works well most of the time, but there are some edge cases that fail due to this approach. For those edge cases, training results in a degenerate model because we consider only one feature dimension at a time. See `Can Gradient Boosting Learn Simple Arithmetic? <http://mariofilho.com/can-gradient-boosting-learn-simple-arithmetic/>`_ for an example.
+  Since it is intractable to enumerate all possible tree structures, we add one split at a time. This approach works well most of the time, but there are some edge cases that fail due to this approach. For those edge cases, training results in a degenerate model because we consider only one feature dimension at a time. See `Can Gradient Boosting Learn Simple Arithmetic? <https://mariofilho.com/can-gradient-boosting-learn-simple-arithmetic/>`_ for an example.
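+
+  One way to see such a failure mode concretely (a minimal sketch, assuming the ``xgboost`` and ``numpy`` Python packages are installed): because trees split on one feature at a time and predict constant leaf values, the model memorizes the training range of ``y = x1 + x2`` instead of the underlying arithmetic, and cannot extrapolate beyond it.
+
+  .. code-block:: python
+
+    import numpy as np
+    import xgboost as xgb
+
+    rng = np.random.default_rng(0)
+    X = rng.uniform(0, 10, size=(1000, 2))
+    y = X[:, 0] + X[:, 1]  # a simple arithmetic target
+
+    model = xgb.XGBRegressor(n_estimators=100, max_depth=3)
+    model.fit(X, y)
+
+    # Inside the training range the fit is close...
+    print(model.predict(np.array([[4.0, 5.0]])))    # roughly 9
+    # ...outside it, predictions saturate near the largest seen target (~20).
+    print(model.predict(np.array([[50.0, 50.0]])))  # nowhere near 100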
**********************
Final words on XGBoost