% Generated by roxygen2 (4.1.0): do not edit by hand
% Please edit documentation in R/xgboost.R
\name{xgboost}
\alias{xgboost}
\title{eXtreme Gradient Boosting (Tree) library}
\usage{
xgboost(data = NULL, label = NULL, missing = NULL, params = list(),
  nrounds, verbose = 1, ...)
}
\arguments{
\item{data}{takes a \code{matrix}, \code{dgCMatrix}, local data file or
\code{xgb.DMatrix}.}

\item{label}{the response variable. The user should not set this field
if data is a local data file or \code{xgb.DMatrix}.}

\item{missing}{only used when input is a dense matrix; pick a float value
that represents missing values. Some data sets use 0 or another extreme
value to represent missing values.}

\item{params}{the list of parameters. Commonly used ones are:
\itemize{
  \item \code{objective} objective function; common ones are
  \itemize{
    \item \code{reg:linear} linear regression
    \item \code{binary:logistic} logistic regression for classification
  }
  \item \code{eta} step size of each boosting step
  \item \code{max.depth} maximum depth of the tree
  \item \code{nthread} number of threads used in training; if not set,
  all threads are used
}

Look at \code{\link{xgb.train}} for a more complete list of parameters, or
\url{https://github.com/tqchen/xgboost/wiki/Parameters} for the full list.
See also \code{demo/} for a walkthrough example in R.}

\item{nrounds}{the maximum number of boosting iterations}

\item{verbose}{If 0, xgboost will stay silent. If 1, xgboost will print
information on performance. If 2, xgboost will print information on both
performance and tree construction progress.}

\item{...}{other parameters to pass to \code{params}.}
}
\description{
A simple interface for training an xgboost model. Look at the
\code{\link{xgb.train}} function for a more advanced interface.
}
\details{
This is the modeling function for xgboost.

Parallelization is automatically enabled if \code{OpenMP} is present.
The number of threads can also be specified manually via the
\code{nthread} parameter.
}
\examples{
data(agaricus.train, package='xgboost')
data(agaricus.test, package='xgboost')
train <- agaricus.train
test <- agaricus.test
bst <- xgboost(data = train$data, label = train$label, max.depth = 2,
               eta = 1, nthread = 2, nrounds = 2,
               objective = "binary:logistic")
pred <- predict(bst, test$data)
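## A minimal sketch of evaluating the predictions above: for the
## "binary:logistic" objective, predict() returns probabilities, so the
## 0.5 cutoff below is an assumed decision threshold, not part of the API.
err <- mean(as.numeric(pred > 0.5) != test$label)
print(paste("test error:", err))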
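## A sketch of the same fit using an explicit xgb.DMatrix input, as
## described in the data argument above; note that label must not be
## set separately when data is an xgb.DMatrix. verbose = 2 also prints
## tree construction progress.
dtrain <- xgb.DMatrix(data = train$data, label = train$label)
bst2 <- xgboost(data = dtrain, max.depth = 2, eta = 1, nthread = 2,
                nrounds = 2, objective = "binary:logistic", verbose = 2)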
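## A sketch of the missing argument with a dense matrix, assuming NA
## is accepted as the missing-value marker; the tiny toy data below is
## made up purely for illustration.
dmat <- matrix(c(1, NA, 0, 1, 0, NA, 1, 1, 0, 1, NA, 0), nrow = 6)
dlab <- c(0, 1, 0, 1, 0, 1)
bst3 <- xgboost(data = dmat, label = dlab, missing = NA, nrounds = 2,
                objective = "binary:logistic", verbose = 0)
}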