[R-package] various fixes for R CMD check (#1328)

* [R] fix xgb.create.features

* [R] fixes for R CMD check
Vadim Khotilovich
2016-07-04 12:40:35 -05:00
committed by Tianqi Chen
parent f8d23b97be
commit 11efa038bd
22 changed files with 49 additions and 39 deletions

View File

@@ -7,7 +7,7 @@
get.paths.to.leaf(dt_tree)
}
\arguments{
-\item{dt.tree}{data.table containing the nodes and edges of the trees}
+\item{dt_tree}{data.table containing the nodes and edges of the trees}
}
\description{
Extract path from root to leaf from data.table

View File

@@ -7,7 +7,7 @@
\usage{
getinfo(object, ...)
-\method{getinfo}{xgb.DMatrix}(object, name)
+\method{getinfo}{xgb.DMatrix}(object, name, ...)
}
\arguments{
\item{object}{Object of class \code{xgb.DMatrix}}

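The `...` added to the method signatures above is what R CMD check's S3 generic/method consistency check asks for: a method must accept at least the arguments of its generic, including the generic's `...`. A minimal base-R sketch, using a hypothetical generic `describe` (not part of xgboost):

```r
# A generic whose signature ends in `...`, like getinfo(object, ...).
describe <- function(object, ...) UseMethod("describe")

# The method may add arguments (here `name`) but must also carry `...`;
# otherwise "checking S3 generic/method consistency" emits a WARNING.
describe.myclass <- function(object, name, ...) {
  attr(object, name)
}

x <- structure(list(), class = "myclass", label = c(0, 1, 1))
describe(x, "label")
```

The same reasoning applies to the `setinfo` and `predict` hunks below.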
View File

@@ -7,6 +7,8 @@
multiplot(..., cols = 1)
}
\arguments{
+\item{...}{the plots}
+\item{cols}{number of columns}
}
\description{

View File

@@ -7,7 +7,7 @@
\usage{
\method{predict}{xgb.Booster}(object, newdata, missing = NA,
outputmargin = FALSE, ntreelimit = NULL, predleaf = FALSE,
-reshape = FALSE)
+reshape = FALSE, ...)
\method{predict}{xgb.Booster.handle}(object, ...)
}

View File

@@ -4,7 +4,7 @@
\alias{print.xgb.Booster}
\title{Print xgb.Booster}
\usage{
-print.xgb.Booster(x, verbose = FALSE, ...)
+\method{print}{xgb.Booster}(x, verbose = FALSE, ...)
}
\arguments{
\item{x}{an xgb.Booster object}

View File

@@ -4,7 +4,7 @@
\alias{print.xgb.DMatrix}
\title{Print xgb.DMatrix}
\usage{
-print.xgb.DMatrix(x, verbose = FALSE, ...)
+\method{print}{xgb.DMatrix}(x, verbose = FALSE, ...)
}
\arguments{
\item{x}{an xgb.DMatrix object}
@@ -24,5 +24,6 @@ dtrain <- xgb.DMatrix(train$data, label=train$label)
dtrain
print(dtrain, verbose=TRUE)
}

View File

@@ -4,7 +4,7 @@
\alias{print.xgb.cv.synchronous}
\title{Print xgb.cv result}
\usage{
-print.xgb.cv.synchronous(x, verbose = FALSE, ...)
+\method{print}{xgb.cv.synchronous}(x, verbose = FALSE, ...)
}
\arguments{
\item{x}{an \code{xgb.cv.synchronous} object}

View File

@@ -7,7 +7,7 @@
\usage{
setinfo(object, ...)
-\method{setinfo}{xgb.DMatrix}(object, name, info)
+\method{setinfo}{xgb.DMatrix}(object, name, info, ...)
}
\arguments{
\item{object}{Object of class "xgb.DMatrix"}

View File

@@ -48,7 +48,7 @@ would not be saved by \code{xgb.save} because an xgboost model is an external me
and its serialization is handled externally.
Also, setting an attribute that has the same name as one of xgboost's parameters wouldn't
change the value of that parameter for a model.
-Use \code{\link{`xgb.parameters<-`}} to set or change model parameters.
+Use \code{\link{xgb.parameters<-}} to set or change model parameters.
The attribute setters would usually work more efficiently for \code{xgb.Booster.handle}
than for \code{xgb.Booster}, since only a handle (pointer) would need to be copied.

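The documentation above points users to the replacement function `xgb.parameters<-`. The general base-R idiom behind such functions can be sketched without xgboost; the `parameters<-` function and `model` list below are hypothetical stand-ins, not the package's implementation:

```r
# A replacement function: defining `name<-` enables `name(object) <- value`.
`parameters<-` <- function(object, value) {
  # Merge the supplied values into the stored parameter list.
  object$params[names(value)] <- value
  object
}

model <- list(params = list(eta = 0.3))
parameters(model) <- list(eta = 0.1, max_depth = 4)
model$params$eta
```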
View File

@@ -25,7 +25,7 @@ This is the function inspired from the paragraph 3.1 of the paper:
\strong{Practical Lessons from Predicting Clicks on Ads at Facebook}
\emph{(Xinran He, Junfeng Pan, Ou Jin, Tianbing Xu, Bo Liu, Tao Xu, Yan, xin Shi, Antoine Atallah, Ralf Herbrich, Stuart Bowers,
-Joaquin Quiñonero Candela)}
+Joaquin Quinonero Candela)}
International Workshop on Data Mining for Online Advertising (ADKDD) - August 24, 2014
@@ -33,7 +33,7 @@ International Workshop on Data Mining for Online Advertising (ADKDD) - August 24
Extract explaining the method:
-"\emph{We found that boosted decision trees are a powerful and very
+"We found that boosted decision trees are a powerful and very
convenient way to implement non-linear and tuple transformations
of the kind we just described. We treat each individual
tree as a categorical feature that takes as value the
@@ -54,7 +54,7 @@ We can understand boosted decision tree
based transformation as a supervised feature encoding that
converts a real-valued vector into a compact binary-valued
vector. A traversal from root node to a leaf node represents
-a rule on certain features.}"
+a rule on certain features."
}
\examples{
data(agaricus.train, package='xgboost')
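The supervised encoding described in the quoted passage can be sketched in base R alone: each tree reports the index of the leaf an observation falls into, and those indices are one-hot encoded into a binary feature matrix. The leaf-index matrix below is made up for illustration; in the package itself, `predict(..., predleaf = TRUE)` (visible in the predict hunk above) produces such per-tree leaf indices.

```r
# Rows are observations; columns are trees; entries are leaf indices.
leaf_index <- matrix(c(1, 3,   # obs 1: leaf 1 of tree A, leaf 3 of tree B
                       2, 3,
                       1, 2),
                     ncol = 2, byrow = TRUE)

# One-hot encode each tree's leaf index, then bind the blocks by column,
# giving one binary indicator column per leaf of every tree.
one_hot_leaves <- function(leaf_index, n_leaves) {
  blocks <- lapply(seq_len(ncol(leaf_index)), function(t) {
    block <- matrix(0L, nrow(leaf_index), n_leaves[t])
    block[cbind(seq_len(nrow(leaf_index)), leaf_index[, t])] <- 1L
    block
  })
  do.call(cbind, blocks)
}

one_hot_leaves(leaf_index, n_leaves = c(2, 3))
```

`xgb.create.features` applies this idea at scale, appending the binary leaf features to the original feature matrix.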