Support adaptive trees, a feature supported by both sklearn and LightGBM. After construction, each tree leaf is recomputed from the residuals between labels and predictions. For L1 error, the optimal value is the median (50th percentile). This is marked as experimental support for the following reasons:

- The value is not well defined for distributed training, where local workers might have empty leaves. Right now the original leaf value is used when computing the average with other workers, which might cause significant errors.
- Some follow-ups are required for the exact tree method, the pruner, and optimization of the quantile function. Also, we need to calculate the initial estimation.
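The leaf recomputation step can be sketched as follows. This is an illustration only, not XGBoost's actual implementation: the function name `recompute_leaves` and the leaf-index representation are assumptions made for the example. The key point is that for L1 error, the value minimizing the sum of absolute residuals in a leaf is the median of those residuals.

```python
import numpy as np

def recompute_leaves(leaf_index, y_true, y_pred):
    """Hypothetical sketch: for each leaf, set the new leaf value to the
    median of the residuals (label - prediction) of the rows in it."""
    new_values = {}
    for leaf in np.unique(leaf_index):
        rows = leaf_index == leaf
        residual = y_true[rows] - y_pred[rows]
        # For L1 error, the minimizer of sum(|r - v|) over v is median(r).
        new_values[leaf] = float(np.median(residual))
    return new_values

# Toy example with two leaves.
leaf_index = np.array([0, 0, 1, 1, 1])
y_true = np.array([1.0, 3.0, 10.0, 12.0, 20.0])
y_pred = np.array([0.0, 0.0, 11.0, 11.0, 11.0])
values = recompute_leaves(leaf_index, y_true, y_pred)
# Leaf 0 residuals are [1, 3] (median 2.0); leaf 1 residuals are
# [-1, 1, 9] (median 1.0).
```

In the distributed case described above, a leaf may receive no rows on some workers, so its local median is undefined; this sketch sidesteps that by only iterating over leaves that actually appear in `leaf_index`.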
This folder contains test cases for the XGBoost C++ core, the Python package, and some other CI facilities.
## Directories
- `ci_build`: Test facilities for Jenkins CI and GitHub Actions.
- `cli`: Basic tests for the command line executable `xgboost`. Most of the other command-line-specific tests are in the Python test `test_cli.py`.
- `cpp`: Tests for the C++ core, using the Google Test framework.
- `python`: Tests for the Python package, demonstrations and CLI. For how to set up the test dependencies, see the conda files in `ci_build`.
- `python-gpu`: Similar to the Python tests, but for GPU.
- `travis`: CI facilities for Travis.
- `distributed`: Legacy tests for distributed systems. Most of the distributed tests are in the Python tests using `dask` and in the JVM package using `spark`.
- `benchmark`: Legacy benchmark code. There are a number of benchmark projects for XGBoost with much better configurations.
## Others
- `pytest.ini`: Describes the `pytest` markers for the Python tests; some markers are generated by the `conftest.py` file.
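For readers unfamiliar with markers, the following is a generic sketch of how a marker declared in `pytest.ini` is attached to a test. The marker name `slow` here is only an example, not necessarily one of the markers XGBoost actually defines.

```python
import pytest

# A marker declared in pytest.ini (e.g. under the "markers" key) can be
# attached to a test function with the @pytest.mark.<name> decorator.
@pytest.mark.slow
def test_heavy_training():
    # Placeholder body standing in for an expensive test.
    assert 1 + 1 == 2
```

Tests can then be selected or deselected by marker on the command line, e.g. `pytest -m "not slow"`.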