
Snapshot selected globals and restore them in spawned process #13921

Merged
merged 17 commits into master from bugfix/spawn-globals-2 on Aug 1, 2022

Conversation

@awaelchli (Contributor) commented Jul 29, 2022

What does this PR do?

Fixes #12685

The problem: Globals such as the torch deterministic flags or the seed get lost when spawning processes. This happens because the torch module is essentially re-imported in the child process, so its module-level globals are re-initialized. This is simply the nature of how mp.spawn works, which means only DDP spawn (multiprocessing start method "spawn") is affected.

  • Subprocess launch is not affected because all processes start from an identical state by design
  • Fork is not affected because all children inherit the exact memory of the parent process

We can solve this by picking the globals we care about, saving them, sending them to the child process, and restoring them there before the trainer run methods are launched. Hence, this PR implements this directly in the launcher.
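The snapshot-and-restore pattern described above can be sketched roughly as follows. This is an illustrative sketch, not the PR's actual code: the stdlib `random` module and the `PL_GLOBAL_SEED` environment variable stand in here for the torch globals (seed, deterministic flags) that the real launcher snapshots.

```python
import os
import random
from dataclasses import dataclass, field
from typing import Any, Optional


@dataclass
class GlobalStateSnapshot:
    """Capture selected globals in the parent process so they can be
    restored inside a spawned child before the trainer run starts."""

    # The RNG state stands in for torch's seed/deterministic flags here.
    rng_state: Any = field(default_factory=random.getstate)
    seed_env: Optional[str] = field(
        default_factory=lambda: os.environ.get("PL_GLOBAL_SEED")
    )

    @classmethod
    def capture(cls) -> "GlobalStateSnapshot":
        # Called in the parent, right before mp.spawn() pickles the
        # arguments that get sent to the child process.
        return cls()

    def restore(self) -> None:
        # Called in the child, where module re-import has reset globals.
        random.setstate(self.rng_state)
        if self.seed_env is not None:
            os.environ["PL_GLOBAL_SEED"] = self.seed_env
```

Because the snapshot is an ordinary picklable dataclass, it can be passed through `mp.spawn` along with the other process arguments and restored as the first step of the child's entry point.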

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the Review guidelines. In short, see the following bullet list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

I made sure I had fun coding 🙃

cc @Borda @justusschock @kaushikb11 @awaelchli @akihironitta @rohitgr7

@github-actions bot added the pl (Generic label for PyTorch Lightning package) label Jul 29, 2022
@awaelchli awaelchli added this to the pl:1.7.x milestone Jul 29, 2022
@awaelchli added the bug (Something isn't working) and strategy: ddp spawn labels Jul 29, 2022
@awaelchli awaelchli marked this pull request as ready for review July 29, 2022 15:50
@awaelchli awaelchli self-assigned this Jul 29, 2022
@akihironitta (Contributor) left a comment

LGTM!

@mergify bot added the ready (PRs ready to be merged) label and removed the has conflicts and ready labels Aug 1, 2022
@tchaton (Contributor) left a comment

LGTM!

@awaelchli awaelchli enabled auto-merge (squash) August 1, 2022 15:19
codecov bot commented Aug 1, 2022

Codecov Report

Merging #13921 (eab9c16) into master (98f7326) will increase coverage by 15%.
The diff coverage is 93%.

@@            Coverage Diff            @@
##           master   #13921     +/-   ##
=========================================
+ Coverage      61%      76%    +15%     
=========================================
  Files         335      335             
  Lines       26305    26368     +63     
=========================================
+ Hits        16048    20137   +4089     
+ Misses      10257     6231   -4026     

@awaelchli awaelchli merged commit eb233ea into master Aug 1, 2022
@awaelchli awaelchli deleted the bugfix/spawn-globals-2 branch August 1, 2022 22:21
@awaelchli awaelchli modified the milestones: pl:1.7.x, pl:1.7 Aug 1, 2022
@awaelchli added the strategy: ddp (DistributedDataParallel) label and removed the strategy: ddp spawn label Nov 4, 2023
Labels
bug (Something isn't working) · pl (Generic label for PyTorch Lightning package) · ready (PRs ready to be merged) · strategy: ddp (DistributedDataParallel)
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Deterministic flags and other globals do not get transferred to spawned processes
7 participants