[doc] remove the doc about killing SparkContext [skip ci] (#7840)
parent dc2e699656
commit bef1f939ce
@@ -16,12 +16,6 @@ This tutorial is to cover the end-to-end process to build a machine learning pip
 * Building a Machine Learning Pipeline with XGBoost4J-Spark
 * Running XGBoost4J-Spark in Production
 
-.. note::
-
-  **SparkContext will be stopped by default when XGBoost training task fails**.
-
-  XGBoost4J-Spark 1.2.0+ exposes a parameter **kill_spark_context_on_worker_failure**. Set **kill_spark_context_on_worker_failure** to **false** so that the SparkContext will not be stopped on training failure; XGBoost4J-Spark will throw an exception instead. Users who want to re-use the SparkContext should wrap the training code in a try-catch block.
-
 .. contents::
   :backlinks: none
   :local:
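
For reference, a minimal sketch of the pattern described in the removed note, assuming XGBoost4J-Spark 1.2.0+ and an already-prepared training DataFrame; the name trainDf and its "features"/"label" columns are placeholders, not part of the original docs:

import ml.dmlc.xgboost4j.scala.spark.XGBoostClassifier

// Keep the SparkContext alive when a training task fails (XGBoost4J-Spark 1.2.0+).
val xgbParams = Map(
  "objective" -> "binary:logistic",
  "num_round" -> 100,
  "num_workers" -> 2,
  "kill_spark_context_on_worker_failure" -> false
)

val classifier = new XGBoostClassifier(xgbParams)
  .setFeaturesCol("features")   // trainDf is assumed to have a vector "features" column
  .setLabelCol("label")         // and a numeric "label" column

try {
  val model = classifier.fit(trainDf)
  println(s"Training succeeded: ${model.uid}")
} catch {
  case e: Exception =>
    // With the parameter set to false, a worker failure surfaces as an exception
    // here rather than a stopped SparkContext, so the context can be reused.
    println(s"Training failed, SparkContext is still usable: ${e.getMessage}")
}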