% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/xgb.plot.multi.trees.R
\name{xgb.plot.multi.trees}
\alias{xgb.plot.multi.trees}
\title{Project all trees on one tree and plot it}
\usage{
xgb.plot.multi.trees(model, feature_names = NULL, features_keep = 5,
  plot_width = NULL, plot_height = NULL, ...)
}
\arguments{
\item{model}{produced by the \code{xgb.train} function.}

\item{feature_names}{names of each feature as a \code{character} vector.}

\item{features_keep}{number of features to keep in each position of the multi trees.}

\item{plot_width}{width in pixels of the graph to produce.}

\item{plot_height}{height in pixels of the graph to produce.}

\item{...}{currently not used.}
}
\value{
A rendered graph object showing the projection of all the trees onto a single tree.
}
\description{
Visualization of the ensemble of trees as a single collective unit.
}
\details{
This function tries to capture the complexity of a gradient boosted tree model
in a cohesive way by compressing an ensemble of trees into a single tree-graph representation.
The goal is to improve the interpretability of a model generally seen as a black box.

Note: this function is applicable to tree booster-based models only.

It takes advantage of the fact that the shape of a binary tree is defined only by
its depth (therefore, within a boosting model, all trees have a similar shape).
Moreover, the trees tend to reuse the same features.

The function projects each tree onto one, and keeps, for each position, the top
\code{features_keep} features (based on the Gain per feature measure).
This function is inspired by this blog post:
\url{https://wellecks.wordpress.com/2015/02/21/peering-into-the-black-box-visualizing-lambdamart/}
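
As a rough conceptual sketch (an illustration of the idea, not the actual
implementation), the projection can be thought of as summing the Gain of each
feature at every node position across all trees, and then keeping the top
\code{features_keep} features per position:
\preformatted{
# Hypothetical illustration only; assumes a trained booster 'bst' as in the
# example below and the column names returned by xgb.model.dt.tree().
library(data.table)
dt <- xgb.model.dt.tree(model = bst, feature_names = colnames(agaricus.train$data))
# aggregate Gain ("Quality") per (node position, feature) over all trees,
# then rank features within each node position
dt[, .(Gain = sum(Quality)), by = .(Node, Feature)][order(Node, -Gain)]
}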
}
\examples{
data(agaricus.train, package='xgboost')

bst <- xgboost(data = agaricus.train$data, label = agaricus.train$label, max_depth = 15,
               eta = 1, nthread = 2, nrounds = 30, objective = "binary:logistic",
               min_child_weight = 50)

p <- xgb.plot.multi.trees(model = bst, feature_names = colnames(agaricus.train$data),
                          features_keep = 3)
print(p)
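
# A further sketch (not part of the original example): the same projection,
# keeping fewer features per node and constraining the size of the rendered
# widget via the documented plot_width/plot_height arguments (in pixels).
p2 <- xgb.plot.multi.trees(model = bst, feature_names = colnames(agaricus.train$data),
                           features_keep = 2, plot_width = 600, plot_height = 400)
print(p2)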
}