
[dask] Use distributed.MultiLock #6743

Merged: 12 commits merged into dmlc:master on Mar 16, 2021

Conversation

trivialfis (Member)

This enables training multiple models in parallel.

  • Conditionally import MultiLock.
  • Use async train directly in the scikit-learn interface.
  • Use worker_client when available.

Closes #6649. Closes #6677.

Currently depends on dask/distributed#4503.
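
Below is a minimal sketch of the scheme the bullet points describe, assuming a recent dask.distributed that ships distributed.MultiLock (added in dask/distributed#4503). The helper names lock_workers and locked_train are illustrative, not the actual xgboost.dask internals:

```python
# A sketch, not the real xgboost.dask source: MultiLock is imported
# conditionally because it only exists in newer distributed releases,
# and training first acquires one lock per participating worker.
from contextlib import asynccontextmanager

import xgboost as xgb

try:
    from distributed import MultiLock
except ImportError:
    # Older distributed releases: fall back to running without
    # inter-job locking, as before this change.
    MultiLock = None


@asynccontextmanager
async def lock_workers(worker_addresses):
    """Hold one lock per participating worker, when MultiLock exists."""
    if MultiLock is None:
        yield
    else:
        async with MultiLock(names=list(worker_addresses)):
            yield


async def locked_train(client, params, dtrain, workers):
    # `workers` is the set of addresses holding partitions of `dtrain`
    # (how they are collected is elided here). Acquiring all of them
    # atomically prevents two concurrent jobs from starting overlapping
    # collective-communication groups and deadlocking each other.
    async with lock_workers(workers):
        return await xgb.dask.train(client, params, dtrain)
```

The conditional import keeps older distributed versions working; on those versions concurrent jobs simply remain unsupported, as before.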

RAMitchell (Member) left a comment:

LGTM, do you want to add some warning about this behaviour in the docs?
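
The behaviour in question is several training jobs sharing one cluster, which now queue on the workers' locks instead of hanging. A hedged usage sketch through the asyncio API follows; the random data, parameters, and scheduler address are placeholders:

```python
# Two training jobs submitted concurrently to one Dask cluster.
import asyncio

from dask import array as da
from distributed import Client

import xgboost as xgb


async def train_one(client, params):
    X = da.random.random((100_000, 10), chunks=(10_000, 10))
    y = da.random.random(100_000, chunks=(10_000,))
    dtrain = await xgb.dask.DaskDMatrix(client, X, y)
    output = await xgb.dask.train(client, params, dtrain, num_boost_round=10)
    return output["booster"]


async def main(scheduler_address):
    async with Client(scheduler_address, asynchronous=True) as client:
        # Each job takes a MultiLock on its workers before starting the
        # rabit tracker, so concurrent jobs queue instead of deadlocking.
        return await asyncio.gather(
            train_one(client, {"tree_method": "hist"}),
            train_one(client, {"tree_method": "approx"}),
        )


# asyncio.run(main("tcp://scheduler-address:8786"))
```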

tests/python/test_with_dask.py (review thread resolved)
codecov-io commented Mar 16, 2021

Codecov Report

Merging #6743 (9953a4a) into master (e489411) will decrease coverage by 0.13%.
The diff coverage is 80.95%.

@@            Coverage Diff             @@
##           master    #6743      +/-   ##
==========================================
- Coverage   81.83%   81.70%   -0.13%     
==========================================
  Files          13       13              
  Lines        3809     3849      +40     
==========================================
+ Hits         3117     3145      +28     
- Misses        692      704      +12     
Impacted Files                      Coverage Δ
python-package/xgboost/dask.py     81.90% <80.95%> (-0.68%) ⬇️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 366f3cb...9953a4a.

trivialfis merged commit 325bc93 into dmlc:master on Mar 16, 2021, and deleted the dask-multi-lock branch.
Development

Successfully merging this pull request may close these issues.

Dask XGBoost hangs during training with multiple GPU workers