Fix broken links. (#6455)

Co-authored-by: Hao Ziyu <haoziyu@qiyi.com>
Co-authored-by: fis <jm.yuan@outlook.com>
hzy001 2020-12-02 17:39:12 +08:00 committed by GitHub
parent 927c316aeb
commit c2ba4fb957
4 changed files with 4 additions and 7 deletions


@@ -62,7 +62,7 @@ test:data = "agaricus.txt.test"
 We use the tree booster and logistic regression objective in our setting. This indicates that we accomplish our task using the classic gradient boosting regression tree (GBRT), which is a promising method for binary classification.
 The parameters shown in the example give the most common ones needed to use xgboost.
-If you are interested in more parameter settings, the complete parameter settings and detailed descriptions are [here](../../doc/parameter.rst). Besides putting the parameters in the configuration file, we can also set them by passing them as command-line arguments, as below:
+If you are interested in more parameter settings, the complete parameter settings and detailed descriptions are [here](https://xgboost.readthedocs.io/en/stable/parameter.html). Besides putting the parameters in the configuration file, we can also set them by passing them as command-line arguments, as below:
 ```
 ../../xgboost mushroom.conf max_depth=6
 ```
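
The same parameters carry over to the Python package. Below is a minimal sketch, assuming xgboost is installed and the agaricus demo files sit in the working directory in LibSVM format; the paths, the `?format=libsvm` hint, and the round count are illustrative, not taken from mushroom.conf.

```python
# Minimal sketch (assumptions noted above): train the same binary
# classifier via the Python API instead of the CLI.
import xgboost as xgb

dtrain = xgb.DMatrix("agaricus.txt.train?format=libsvm")
dtest = xgb.DMatrix("agaricus.txt.test?format=libsvm")

params = {
    "booster": "gbtree",             # tree booster, i.e. classic GBRT
    "objective": "binary:logistic",  # logistic regression objective
    "max_depth": 6,                  # same override as the CLI call above
}

# Train for a couple of rounds, printing test-set error after each.
bst = xgb.train(params, dtrain, num_boost_round=2, evals=[(dtest, "test")])
```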
@@ -161,4 +161,3 @@ Eg. ```nthread=10```
 Set nthread to the number of real CPU cores (on Unix, this can be found using ```lscpu```).
 Some systems will have ```Thread(s) per core = 2```; for example, a 4-core CPU with 8 threads. In such a case, set ```nthread=4```, not 8.
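
As a rough companion to that advice, the sketch below derives nthread from the logical CPU count; the threads-per-core value is an assumption to confirm against ```lscpu``` output.

```python
# Sketch only: os.cpu_count() reports *logical* CPUs, so on an SMT
# machine (Thread(s) per core = 2) halve it to approximate real cores.
import os

logical_cpus = os.cpu_count() or 1
threads_per_core = 2  # assumption -- check `lscpu` on your machine
nthread = max(1, logical_cpus // threads_per_core)
print(f"nthread={nthread}")  # e.g. prints nthread=4 on a 4-core/8-thread CPU
```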


@@ -14,4 +14,3 @@ objective = reg:squarederror
 ```
 The input format is the same as for binary classification, except that the label is now the target regression value. We use linear regression here; if we want to use logistic regression (objective = reg:logistic), the labels need to be pre-scaled into [0,1].
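
To make the pre-scaling step concrete, here is a small sketch of min-max scaling labels into [0,1]; the `y` values are made up for illustration.

```python
# Sketch: min-max scale regression labels into [0, 1] so they are
# valid targets for objective = reg:logistic.
import numpy as np

y = np.array([3.2, 7.5, 1.0, 9.9])              # illustrative raw targets
y_scaled = (y - y.min()) / (y.max() - y.min())  # now within [0, 1]
```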


@@ -60,9 +60,9 @@ This is a list of short codes introducing different functionalities of xgboost p
 Most of the examples in this section are based on the CLI or Python version.
 However, the parameter settings can be applied to all versions.
-- [Binary classification](binary_classification)
+- [Binary classification](CLI/binary_classification)
 - [Multiclass classification](multiclass_classification)
-- [Regression](regression)
+- [Regression](CLI/regression)
 - [Learning to Rank](rank)
 ### Benchmarks


@@ -2,7 +2,6 @@
 Introduction to Boosted Trees
 #############################
 XGBoost stands for "Extreme Gradient Boosting", where the term "Gradient Boosting" originates from the paper *Greedy Function Approximation: A Gradient Boosting Machine*, by Friedman.
-This is a tutorial on gradient boosted trees, and most of the content is based on `these slides <http://homes.cs.washington.edu/~tqchen/pdf/BoostedTree.pdf>`_ by Tianqi Chen, the original author of XGBoost.
 **Gradient boosted trees** have been around for a while, and there is a lot of material on the topic.
 This tutorial will explain boosted trees in a self-contained and principled way using the elements of supervised learning.
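
For readers skimming the diff, the model this tutorial builds toward is the standard additive tree ensemble with a regularized objective; in the notation commonly used for it:

```latex
% Prediction is a sum of K trees, and the objective adds a
% regularization term \Omega over each tree to the training loss l.
\hat{y}_i = \sum_{k=1}^{K} f_k(x_i), \quad f_k \in \mathcal{F}
\qquad
\mathrm{obj}(\theta) = \sum_{i} l(y_i, \hat{y}_i) + \sum_{k=1}^{K} \Omega(f_k)
```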