update links dmlc

El Potaeto 2015-03-22 16:41:05 +01:00
parent 70045c41f9
commit 7d0ac3a3dd
10 changed files with 15 additions and 15 deletions

View File

@@ -15,8 +15,8 @@ Description: Xgboost is short for eXtreme Gradient Boosting, which is an
package is made to be extensible, so that users are also allowed to define
their own objectives easily.
License: Apache License (== 2.0) | file LICENSE
-URL: https://github.com/tqchen/xgboost
-BugReports: https://github.com/tqchen/xgboost/issues
+URL: https://github.com/dmlc/xgboost
+BugReports: https://github.com/dmlc/xgboost/issues
VignetteBuilder: knitr
Suggests: knitr
Depends:

View File

@@ -11,9 +11,9 @@
#' @param fname the name of the text file in which to save the model text dump. If not provided or set to \code{NULL}, the function will return the model as a \code{character} vector.
#' @param fmap feature map file representing the type of feature.
#' A detailed description can be found at
-#' \url{https://github.com/tqchen/xgboost/wiki/Binary-Classification#dump-model}.
+#' \url{https://github.com/dmlc/xgboost/wiki/Binary-Classification#dump-model}.
#' See demo/ for a walkthrough example in R, and
-#' \url{https://github.com/tqchen/xgboost/blob/master/demo/data/featmap.txt}
+#' \url{https://github.com/dmlc/xgboost/blob/master/demo/data/featmap.txt}
#' for an example of the format.
#' @param with.stats whether to dump statistics of the splits.
#' When this option is on, the model dump comes with two additional statistics:

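To make the dump options concrete, here is a minimal sketch using the package's bundled agaricus demo data; the training parameters are illustrative assumptions, not values taken from this commit.

```r
library(xgboost)

# Train a small demo model on the agaricus data shipped with the package.
data(agaricus.train, package = "xgboost")
bst <- xgboost(data = agaricus.train$data, label = agaricus.train$label,
               max.depth = 2, eta = 1, nrounds = 2,
               objective = "binary:logistic")

# With fname = NULL the dump comes back as a character vector;
# with.stats = TRUE appends the per-split statistics described above.
dump <- xgb.dump(bst, fname = NULL, with.stats = TRUE)
head(dump)
```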
View File

@@ -86,7 +86,7 @@
#' \item \code{ndcg} Normalized Discounted Cumulative Gain. \url{http://en.wikipedia.org/wiki/NDCG}
#' }
#'
-#' Full list of parameters is available in the Wiki \url{https://github.com/tqchen/xgboost/wiki/Parameters}.
+#' Full list of parameters is available in the Wiki \url{https://github.com/dmlc/xgboost/wiki/Parameters}.
#'
#' This function only accepts an \code{\link{xgb.DMatrix}} object as the input.
#'

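As a quick illustration of the parameter list and the `xgb.DMatrix`-only input described above, a minimal sketch follows; the agaricus demo data and all parameter values are assumptions chosen so the example is self-contained.

```r
library(xgboost)

# xgb.train() accepts only an xgb.DMatrix as its training input.
data(agaricus.train, package = "xgboost")
dtrain <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)

# Parameters are passed as a named list; "ndcg" would suit a ranking task,
# "auc" is used here so the sketch runs on the binary demo data.
param <- list(max.depth = 2, eta = 1, objective = "binary:logistic",
              eval_metric = "auc")
bst <- xgb.train(params = param, data = dtrain, nrounds = 2,
                 watchlist = list(train = dtrain))
```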
View File

@@ -20,7 +20,7 @@
#' \item \code{nthread} number of threads used in training; if not set, all threads are used
#' }
#'
-#' Look at \code{\link{xgb.train}} for a more complete list of parameters or \url{https://github.com/tqchen/xgboost/wiki/Parameters} for the full list.
+#' Look at \code{\link{xgb.train}} for a more complete list of parameters or \url{https://github.com/dmlc/xgboost/wiki/Parameters} for the full list.
#'
#' See also \code{demo/} for a walkthrough example in R.
#'

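For contrast with `xgb.train`, the simpler interface documented here takes `data` and `label` directly; a minimal sketch with an explicit `nthread` (all values are illustrative assumptions):

```r
library(xgboost)

data(agaricus.train, package = "xgboost")
# nthread caps the number of training threads; omit it to use all threads.
bst <- xgboost(data = agaricus.train$data, label = agaricus.train$label,
               max.depth = 2, eta = 1, nthread = 2, nrounds = 2,
               objective = "binary:logistic")
```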
View File

@@ -13,9 +13,9 @@ xgb.dump(model = NULL, fname = NULL, fmap = "", with.stats = FALSE)
\item{fmap}{feature map file representing the type of feature.
A detailed description can be found at
-\url{https://github.com/tqchen/xgboost/wiki/Binary-Classification#dump-model}.
+\url{https://github.com/dmlc/xgboost/wiki/Binary-Classification#dump-model}.
See demo/ for a walkthrough example in R, and
-\url{https://github.com/tqchen/xgboost/blob/master/demo/data/featmap.txt}
+\url{https://github.com/dmlc/xgboost/blob/master/demo/data/featmap.txt}
for an example of the format.}
\item{with.stats}{whether to dump statistics of the splits

View File

@@ -99,7 +99,7 @@ Number of threads can also be manually specified via \code{nthread} parameter.
\item \code{ndcg} Normalized Discounted Cumulative Gain. \url{http://en.wikipedia.org/wiki/NDCG}
}
-Full list of parameters is available in the Wiki \url{https://github.com/tqchen/xgboost/wiki/Parameters}.
+Full list of parameters is available in the Wiki \url{https://github.com/dmlc/xgboost/wiki/Parameters}.
This function only accepts an \code{\link{xgb.DMatrix}} object as the input.
}

View File

@@ -31,7 +31,7 @@ Commonly used ones are:
\item \code{nthread} number of threads used in training; if not set, all threads are used
}
-Look at \code{\link{xgb.train}} for a more complete list of parameters or \url{https://github.com/tqchen/xgboost/wiki/Parameters} for the full list.
+Look at \code{\link{xgb.train}} for a more complete list of parameters or \url{https://github.com/dmlc/xgboost/wiki/Parameters} for the full list.
See also \code{demo/} for a walkthrough example in R.}

View File

@@ -17,7 +17,7 @@ Introduction
The purpose of this Vignette is to show you how to use **Xgboost** to discover and understand your own dataset better.
-This Vignette is not about predicting anything (see [Xgboost presentation](https://github.com/tqchen/xgboost/blob/master/R-package/vignettes/xgboostPresentation.Rmd)). We will explain how to use **Xgboost** to highlight the *link* between the *features* of your data and the *outcome*.
+This Vignette is not about predicting anything (see [Xgboost presentation](https://github.com/dmlc/xgboost/blob/master/R-package/vignettes/xgboostPresentation.Rmd)). We will explain how to use **Xgboost** to highlight the *link* between the *features* of your data and the *outcome*.
Package loading:
@@ -34,7 +34,7 @@ Preparation of the dataset
==========================
Numeric VS categorical variables
-----------------------------------
+--------------------------------
**Xgboost** manages only `numeric` vectors.
@@ -163,7 +163,7 @@ output_vector = df[,Improved] == "Marked"
Build the model
===============
-The code below is very usual. For more information, you can look at the documentation of `xgboost` function (or at the vignette [Xgboost presentation](https://github.com/tqchen/xgboost/blob/master/R-package/vignettes/xgboostPresentation.Rmd)).
+The code below is very usual. For more information, you can look at the documentation of `xgboost` function (or at the vignette [Xgboost presentation](https://github.com/dmlc/xgboost/blob/master/R-package/vignettes/xgboostPresentation.Rmd)).
```{r}
bst <- xgboost(data = sparse_matrix, label = output_vector, max.depth = 4,

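The `xgboost` call in the hunk above is cut off by the diff. The sketch below reconstructs the vignette's sparse-matrix workflow on a toy data frame: the toy data and every parameter value except `max.depth = 4` are assumptions, not the vignette's exact code.

```r
library(xgboost)
library(Matrix)

# Toy stand-in for the vignette's Arthritis-based data frame.
df <- data.frame(Treatment = factor(c("Placebo", "Treated", "Treated", "Placebo")),
                 Age = c(57, 46, 30, 62),
                 Improved = factor(c("None", "Marked", "Marked", "Some")))

# One-hot encode the predictors into a sparse matrix, dropping the intercept.
sparse_matrix <- sparse.model.matrix(Improved ~ . - 1, data = df)
output_vector <- df$Improved == "Marked"

bst <- xgboost(data = sparse_matrix, label = output_vector, max.depth = 4,
               eta = 1, nrounds = 10, objective = "binary:logistic")

# The vignette's aim: link features to the outcome via importance scores.
importance <- xgb.importance(feature_names = colnames(sparse_matrix), model = bst)
```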
View File

@@ -27,7 +27,7 @@ It is an efficient and scalable implementation of gradient boosting framework by
It supports various objective functions, including *regression*, *classification* and *ranking*. The package is made to be extensible, so that users are also allowed to define their own objective functions easily.
-It has been [used](https://github.com/tqchen/xgboost) to win several [Kaggle](http://www.kaggle.com) competitions.
+It has been [used](https://github.com/dmlc/xgboost) to win several [Kaggle](http://www.kaggle.com) competitions.
It has several features:
@@ -49,7 +49,7 @@ Github version
For the up-to-date version (highly recommended), install from *GitHub*:
```{r installGithub, eval=FALSE}
-devtools::install_github('tqchen/xgboost', subdir='R-package')
+devtools::install_github('dmlc/xgboost', subdir='R-package')
```
> *Windows* users will need to install [RTools](http://cran.r-project.org/bin/windows/Rtools/) first.
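After either install route, a quick smoke test confirms which version R picked up:

```r
# Load the freshly installed package and report its version.
library(xgboost)
packageVersion("xgboost")
```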