From 0e0849fa1eb99890524b692c679dfb54d7bee781 Mon Sep 17 00:00:00 2001
From: Jiaming Yuan
Date: Mon, 14 Oct 2019 03:44:08 -0400
Subject: [PATCH] Mention dask in readme. [skip ci] (#4942)

---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index ec060ac95..5261e7ed2 100644
--- a/README.md
+++ b/README.md
@@ -17,7 +17,7 @@
 XGBoost is an optimized distributed gradient boosting library designed to be highly ***efficient***, ***flexible*** and ***portable***.
 It implements machine learning algorithms under the [Gradient Boosting](https://en.wikipedia.org/wiki/Gradient_boosting) framework.
 XGBoost provides a parallel tree boosting (also known as GBDT, GBM) that solve many data science problems in a fast and accurate way.
-The same code runs on major distributed environment (Kubernetes, Hadoop, SGE, MPI) and can solve problems beyond billions of examples.
+The same code runs on major distributed environment (Kubernetes, Hadoop, SGE, MPI, Dask) and can solve problems beyond billions of examples.
 
 License
 -------
@@ -38,7 +38,7 @@ Sponsors
 Become a sponsor and get a logo here. See details at [Sponsoring the XGBoost Project](https://xgboost.ai/sponsors). The funds are used to defray the cost of continuous integration and testing infrastructure (https://xgboost-ci.net).
 
 ## Open Source Collective sponsors
-[![Backers on Open Collective](https://opencollective.com/xgboost/backers/badge.svg)](#backers) [![Sponsors on Open Collective](https://opencollective.com/xgboost/sponsors/badge.svg)](#sponsors) 
+[![Backers on Open Collective](https://opencollective.com/xgboost/backers/badge.svg)](#backers) [![Sponsors on Open Collective](https://opencollective.com/xgboost/sponsors/badge.svg)](#sponsors)
 
 ### Sponsors
 [[Become a sponsor](https://opencollective.com/xgboost#sponsor)]
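Since the patch adds Dask to the README's list of supported distributed environments, a minimal sketch of what distributed training through XGBoost's `xgboost.dask` module looks like may be useful context for reviewers. The local cluster setup and the random chunked arrays below are illustrative assumptions, not part of the patch itself.

```python
# Minimal sketch of distributed training with XGBoost's Dask integration.
# Assumes the `xgboost.dask` module plus the `dask` and `distributed`
# packages are installed; the cluster and data here are illustrative only.
import xgboost as xgb
from dask import array as da
from dask.distributed import Client

client = Client()  # connect to (or start) a local Dask cluster

# Chunked Dask arrays stand in for a real distributed dataset.
X = da.random.random((100_000, 20), chunks=(10_000, 20))
y = da.random.random(100_000, chunks=(10_000,))

# DaskDMatrix distributes data construction across the cluster's workers.
dtrain = xgb.dask.DaskDMatrix(client, X, y)

output = xgb.dask.train(
    client,
    {"tree_method": "hist", "objective": "reg:squarederror"},
    dtrain,
    num_boost_round=10,
)
booster = output["booster"]  # the trained model
prediction = xgb.dask.predict(client, output, dtrain)
```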