diff --git a/R-package/DESCRIPTION b/R-package/DESCRIPTION
index 842fb6010..6de75f930 100644
--- a/R-package/DESCRIPTION
+++ b/R-package/DESCRIPTION
@@ -15,8 +15,8 @@ Description: Xgboost is short for eXtreme Gradient Boosting, which is an
     package is made to be extensible, so that users are also allowed to define their own objectives easily.
 License: Apache License (== 2.0) | file LICENSE
-URL: https://github.com/tqchen/xgboost
-BugReports: https://github.com/tqchen/xgboost/issues
+URL: https://github.com/dmlc/xgboost
+BugReports: https://github.com/dmlc/xgboost/issues
 VignetteBuilder: knitr
 Suggests:
     knitr
 Depends:
diff --git a/R-package/R/xgb.dump.R b/R-package/R/xgb.dump.R
index 10ac18b47..fae1c7d2b 100644
--- a/R-package/R/xgb.dump.R
+++ b/R-package/R/xgb.dump.R
@@ -11,9 +11,9 @@
 #' @param fname the name of the text file where to save the model text dump. If not provided or set to \code{NULL} the function will return the model as a \code{character} vector.
 #' @param fmap feature map file representing the type of feature.
 #' Detailed description could be found at
-#' \url{https://github.com/tqchen/xgboost/wiki/Binary-Classification#dump-model}.
+#' \url{https://github.com/dmlc/xgboost/wiki/Binary-Classification#dump-model}.
 #' See demo/ for walkthrough example in R, and
-#' \url{https://github.com/tqchen/xgboost/blob/master/demo/data/featmap.txt}
+#' \url{https://github.com/dmlc/xgboost/blob/master/demo/data/featmap.txt}
 #' for example Format.
 #' @param with.stats whether dump statistics of splits
 #' When this option is on, the model dump comes with two additional statistics:
diff --git a/R-package/R/xgb.train.R b/R-package/R/xgb.train.R
index 79ef3b4a1..d5cf5cbde 100644
--- a/R-package/R/xgb.train.R
+++ b/R-package/R/xgb.train.R
@@ -86,7 +86,7 @@
 #' \item \code{ndcg} Normalized Discounted Cumulative Gain. \url{http://en.wikipedia.org/wiki/NDCG}
 #' }
 #'
-#' Full list of parameters is available in the Wiki \url{https://github.com/tqchen/xgboost/wiki/Parameters}.
+#' Full list of parameters is available in the Wiki \url{https://github.com/dmlc/xgboost/wiki/Parameters}.
 #'
 #' This function only accepts an \code{\link{xgb.DMatrix}} object as the input.
 #'
diff --git a/R-package/R/xgboost.R b/R-package/R/xgboost.R
index 1007cb91d..ede53b116 100644
--- a/R-package/R/xgboost.R
+++ b/R-package/R/xgboost.R
@@ -20,7 +20,7 @@
 #' \item \code{nthread} number of thread used in training, if not set, all threads are used
 #' }
 #'
-#' Look at \code{\link{xgb.train}} for a more complete list of parameters or \url{https://github.com/tqchen/xgboost/wiki/Parameters} for the full list.
+#' Look at \code{\link{xgb.train}} for a more complete list of parameters or \url{https://github.com/dmlc/xgboost/wiki/Parameters} for the full list.
 #'
 #' See also \code{demo/} for walkthrough example in R.
 #'
diff --git a/R-package/man/xgb.dump.Rd b/R-package/man/xgb.dump.Rd
index 3c074928d..124535211 100644
--- a/R-package/man/xgb.dump.Rd
+++ b/R-package/man/xgb.dump.Rd
@@ -13,9 +13,9 @@ xgb.dump(model = NULL, fname = NULL, fmap = "", with.stats = FALSE)

 \item{fmap}{feature map file representing the type of feature.
 Detailed description could be found at
-\url{https://github.com/tqchen/xgboost/wiki/Binary-Classification#dump-model}.
+\url{https://github.com/dmlc/xgboost/wiki/Binary-Classification#dump-model}.
 See demo/ for walkthrough example in R, and
-\url{https://github.com/tqchen/xgboost/blob/master/demo/data/featmap.txt}
+\url{https://github.com/dmlc/xgboost/blob/master/demo/data/featmap.txt}
 for example Format.}

 \item{with.stats}{whether dump statistics of splits
diff --git a/R-package/man/xgb.train.Rd b/R-package/man/xgb.train.Rd
index 1c4376388..d56f0b84e 100644
--- a/R-package/man/xgb.train.Rd
+++ b/R-package/man/xgb.train.Rd
@@ -99,7 +99,7 @@ Number of threads can also be manually specified via \code{nthread} parameter.
 \item \code{ndcg} Normalized Discounted Cumulative Gain.
 \url{http://en.wikipedia.org/wiki/NDCG}
 }
-Full list of parameters is available in the Wiki \url{https://github.com/tqchen/xgboost/wiki/Parameters}.
+Full list of parameters is available in the Wiki \url{https://github.com/dmlc/xgboost/wiki/Parameters}.

 This function only accepts an \code{\link{xgb.DMatrix}} object as the input.
 }
diff --git a/R-package/man/xgboost.Rd b/R-package/man/xgboost.Rd
index a0ec42a9b..50e7b52c2 100644
--- a/R-package/man/xgboost.Rd
+++ b/R-package/man/xgboost.Rd
@@ -31,7 +31,7 @@ Commonly used ones are:
 \item \code{nthread} number of thread used in training, if not set, all threads are used
 }

- Look at \code{\link{xgb.train}} for a more complete list of parameters or \url{https://github.com/tqchen/xgboost/wiki/Parameters} for the full list.
+ Look at \code{\link{xgb.train}} for a more complete list of parameters or \url{https://github.com/dmlc/xgboost/wiki/Parameters} for the full list.

 See also \code{demo/} for walkthrough example in R.}

diff --git a/R-package/vignettes/discoverYourData.Rmd b/R-package/vignettes/discoverYourData.Rmd
index 78df67d4e..49d5bf0cd 100644
--- a/R-package/vignettes/discoverYourData.Rmd
+++ b/R-package/vignettes/discoverYourData.Rmd
@@ -17,7 +17,7 @@ Introduction

 The purpose of this Vignette is to show you how to use **Xgboost** to discover and understand your own dataset better.

-This Vignette is not about predicting anything (see [Xgboost presentation](https://github.com/tqchen/xgboost/blob/master/R-package/vignettes/xgboostPresentation.Rmd)). We will explain how to use **Xgboost** to highlight the *link* between the *features* of your data and the *outcome*.
+This Vignette is not about predicting anything (see [Xgboost presentation](https://github.com/dmlc/xgboost/blob/master/R-package/vignettes/xgboostPresentation.Rmd)). We will explain how to use **Xgboost** to highlight the *link* between the *features* of your data and the *outcome*.

 Pacakge loading:
@@ -34,7 +34,7 @@ Preparation of the dataset
 ==========================

 Numeric VS categorical variables
------------------------------------
+--------------------------------

 **Xgboost** manages only `numeric` vectors.

@@ -163,7 +163,7 @@ output_vector = df[,Improved] == "Marked"
 Build the model
 ===============

-The code below is very usual. For more information, you can look at the documentation of `xgboost` function (or at the vignette [Xgboost presentation](https://github.com/tqchen/xgboost/blob/master/R-package/vignettes/xgboostPresentation.Rmd)).
+The code below is very usual. For more information, you can look at the documentation of `xgboost` function (or at the vignette [Xgboost presentation](https://github.com/dmlc/xgboost/blob/master/R-package/vignettes/xgboostPresentation.Rmd)).

 ```{r}
 bst <- xgboost(data = sparse_matrix, label = output_vector, max.depth = 4,
diff --git a/R-package/vignettes/xgboostPresentation.Rmd b/R-package/vignettes/xgboostPresentation.Rmd
index 6e0ca3771..0bab9a1f4 100644
--- a/R-package/vignettes/xgboostPresentation.Rmd
+++ b/R-package/vignettes/xgboostPresentation.Rmd
@@ -27,7 +27,7 @@ It is an efficient and scalable implementation of gradient boosting framework by
 It supports various objective functions, including *regression*, *classification* and *ranking*.
 The package is made to be extendible, so that users are also allowed to define their own objective functions easily.

-It has been [used](https://github.com/tqchen/xgboost) to win several [Kaggle](http://www.kaggle.com) competitions.
+It has been [used](https://github.com/dmlc/xgboost) to win several [Kaggle](http://www.kaggle.com) competitions.

 It has several features:
@@ -49,7 +49,7 @@ Github version
 For up-to-date version (highly recommended), install from *Github*:

 ```{r installGithub, eval=FALSE}
-devtools::install_github('tqchen/xgboost', subdir='R-package')
+devtools::install_github('dmlc/xgboost', subdir='R-package')
 ```

 > *Windows* user will need to install [RTools](http://cran.r-project.org/bin/windows/Rtools/) first.
diff --git a/demo/kaggle-otto/otto_train_pred.R b/demo/kaggle-otto/otto_train_pred.R
similarity index 100%
rename from demo/kaggle-otto/otto_train_pred.R
rename to demo/kaggle-otto/otto_train_pred.R
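As a quick sanity check after applying this patch (not part of the patch itself; the path below assumes the repository root), one might grep the R package sources to confirm that no references to the old `tqchen/xgboost` URLs remain:

```shell
# Illustrative post-apply check: list any leftover references to the old
# GitHub account under R-package/; if grep finds no matches (or the
# directory is absent), print a confirmation instead.
grep -rn "tqchen/xgboost" R-package/ 2>/dev/null || echo "migration complete"
```

If the command prints any file/line matches, those spots were missed by the migration and need the same `tqchen` → `dmlc` substitution.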