From 85dbaf638bfbb75c023203893cd851920f948cd9 Mon Sep 17 00:00:00 2001
From: Tianqi Chen
Date: Tue, 2 Sep 2014 23:33:04 -0700
Subject: [PATCH] Update xgboost.Rnw

---
 R-package/vignettes/xgboost.Rnw | 13 +++++--------
 1 file changed, 5 insertions(+), 8 deletions(-)

diff --git a/R-package/vignettes/xgboost.Rnw b/R-package/vignettes/xgboost.Rnw
index 19254abaf..9ecceca17 100644
--- a/R-package/vignettes/xgboost.Rnw
+++ b/R-package/vignettes/xgboost.Rnw
@@ -52,8 +52,7 @@ This is an introductory document of using the \verb@xgboost@ package in R.
 and scalable implementation of gradient boosting framework by \citep{friedman2001greedy}.
 The package includes efficient linear model solver and tree learning algorithm.
 It supports various objective functions, including regression, classification
-and ranking. The package is made to be extendible, so that user are also allowed
-to define there own objectives easily. It has several features:
+and ranking. The package is made to be extendible, so that users are also allowed to define their own objectives easily. It has several features:
 \begin{enumerate}
 \item{Speed: }{\verb@xgboost@ can automatically do parallel computation on
 Windows and Linux, with openmp. It is generally over 10 times faster than
@@ -137,13 +136,10 @@ diris = xgb.DMatrix('iris.xgb.DMatrix')
 
 \section{Advanced Examples}
 
-The function \verb@xgboost@ is a simple function with less parameters, in order
-to be R-friendly. The core training function is wrapped in \verb@xgb.train@. It
-is more flexible than \verb@xgboost@, but it requires users to read the document
-a bit more carefully.
+The function \verb@xgboost@ is a simple function with fewer parameters, in order
+to be R-friendly. The core training function is wrapped in \verb@xgb.train@. It is more flexible than \verb@xgboost@, but it requires users to read the documentation a bit more carefully.
 
-\verb@xgb.train@ only accept a \verb@xgb.DMatrix@ object as its input, while it
-supports advanced features as custom objective and evaluation functions.
+\verb@xgb.train@ only accepts an \verb@xgb.DMatrix@ object as its input, while it supports advanced features such as custom objective and evaluation functions.
 
 <<Customized loss function, eval=FALSE>>=
 logregobj <- function(preds, dtrain) {
@@ -213,3 +209,4 @@ competition.
 
 \bibliography{xgboost}
 \end{document}
+
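
For reference, a minimal sketch of the custom objective and evaluation workflow the revised paragraph describes, modeled on the package's custom_objective demo from that era. It assumes the bundled agaricus demo data; exact argument names and order for xgb.train (nrounds, watchlist, obj, feval) may differ slightly between releases.

library(xgboost)

# xgb.train only accepts xgb.DMatrix inputs, so wrap the demo data first.
data(agaricus.train, package = "xgboost")
data(agaricus.test, package = "xgboost")
dtrain <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)
dtest  <- xgb.DMatrix(agaricus.test$data, label = agaricus.test$label)

# Custom objective: logistic loss, returning the gradient and hessian
# of the loss with respect to the raw margin predictions.
logregobj <- function(preds, dtrain) {
  labels <- getinfo(dtrain, "label")
  preds <- 1 / (1 + exp(-preds))
  grad <- preds - labels
  hess <- preds * (1 - preds)
  list(grad = grad, hess = hess)
}

# Custom evaluation metric: classification error on the margin scale.
evalerror <- function(preds, dtrain) {
  labels <- getinfo(dtrain, "label")
  err <- sum(labels != (preds > 0)) / length(labels)
  list(metric = "error", value = err)
}

# Train with the custom objective and evaluation functions.
param <- list(max_depth = 2, eta = 1)
watchlist <- list(eval = dtest, train = dtrain)
bst <- xgb.train(param, dtrain, nrounds = 2, watchlist,
                 obj = logregobj, feval = evalerror)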