% Generated by roxygen2 (4.0.1): do not edit by hand
\name{xgboost}
\alias{xgboost}
\title{eXtreme Gradient Boosting (Tree) library}
\usage{
xgboost(data = NULL, label = NULL, params = list(), nrounds,
  verbose = 1, ...)
}
\arguments{
\item{data}{takes a \code{matrix}, \code{dgCMatrix}, local data file, or \code{xgb.DMatrix}.}

\item{label}{the response variable. It should not be set when \code{data} is a local data file or an \code{xgb.DMatrix}, since those inputs already contain the label.}

\item{params}{the list of parameters. Commonly used ones include:
\itemize{
  \item \code{objective}: the objective function; common choices are
    \code{reg:linear} for linear regression and \code{binary:logistic} for
    logistic regression (classification)
  \item \code{eta}: step size of each boosting step
  \item \code{max_depth}: maximum depth of the tree
  \item \code{nthread}: number of threads used in training; if not set, all
    threads are used
}
See \url{https://github.com/tqchen/xgboost/wiki/Parameters} for further details.
See also \code{demo/demo.R} for a walkthrough example in R.}

\item{nrounds}{the maximum number of boosting iterations.}

\item{verbose}{If 0, xgboost will stay silent. If 1, xgboost will print
performance information. If 2, xgboost will print both performance and
tree-construction progress information.}

\item{...}{other parameters to pass to \code{params}.}
}
\description{
A simple interface for training an xgboost model in R.
}
\details{
This is the modeling function for xgboost. Parallelization is automatically
enabled if OpenMP is present; the number of threads can also be set manually
via the \code{nthread} parameter.
}
\examples{
data(iris)
# train a boosted tree model for 2 rounds on the four iris features,
# using the species index as a numeric response
bst <- xgboost(data = as.matrix(iris[, 1:4]), label = as.numeric(iris[, 5]),
               nrounds = 2)
# predict on the training data
pred <- predict(bst, as.matrix(iris[, 1:4]))
}