[Doc] add doc for kill_spark_context_on_worker_failure parameter (#6097)

* [Doc] add doc for kill_spark_context_on_worker_failure parameter

* resolve comments
Bobby Wang 2020-09-10 12:28:44 +08:00 committed by GitHub
parent d0ccb13d09
commit 00b0ad1293


@@ -16,6 +16,12 @@ This tutorial is to cover the end-to-end process to build a machine learning pipeline
* Building a Machine Learning Pipeline with XGBoost4J-Spark
* Running XGBoost4J-Spark in Production
.. note::

  **SparkContext will be stopped by default when the XGBoost training task fails**.

  XGBoost4J-Spark 1.2.0+ exposes a parameter **kill_spark_context_on_worker_failure**. Set **kill_spark_context_on_worker_failure** to **false** so that the SparkContext will not be stopped on training failure; XGBoost4J-Spark will throw an exception instead. Users who want to re-use the SparkContext should wrap the training code in a try-catch block.
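
  A minimal sketch of this pattern, assuming XGBoost4J-Spark 1.2.0+ and an existing training DataFrame; ``trainingData``, the column names, and the other booster parameters here are illustrative:

  .. code-block:: scala

    import ml.dmlc.xgboost4j.scala.spark.XGBoostClassifier

    // Illustrative booster parameters; the relevant one is
    // kill_spark_context_on_worker_failure.
    val xgbParams = Map(
      "objective" -> "binary:logistic",
      "num_round" -> 100,
      "num_workers" -> 2,
      // Keep the SparkContext alive if a training task fails.
      "kill_spark_context_on_worker_failure" -> false
    )

    val classifier = new XGBoostClassifier(xgbParams)
      .setFeaturesCol("features")
      .setLabelCol("label")

    try {
      val model = classifier.fit(trainingData)
    } catch {
      case e: Exception =>
        // The SparkContext is still alive here and can be re-used,
        // e.g. to retry training or run other Spark jobs.
        println(s"XGBoost training failed: ${e.getMessage}")
    }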
.. contents::
  :backlinks: none
  :local: