
Enforce Lightning module as source of truth for automatic optimization #7130

Merged: 10 commits from simplify-auto-opt-module into Lightning-AI:master on Apr 26, 2021

Conversation

@ananthsub (Contributor) commented Apr 21, 2021

What does this PR do?

This is further cleanup from #4317. This PR deletes the train loop's automatic optimization attribute and enforces that all accesses go through the LightningModule's property, so the two fields can never diverge. Since the LightningModule is the sole decider for automatic optimization, it was very odd that the module itself referred to this property via self.trainer.train_loop.automatic_optimization even though self.automatic_optimization was already defined. Now that the trainer constructor argument for automatic_optimization is gone, we can simplify this.
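
For context, a minimal sketch of the access pattern this PR standardizes (PyTorch Lightning ~1.3 API; the class name and compute_loss helper are hypothetical, and the manual-optimization steps are simplified):

```python
import pytorch_lightning as pl


class ManualOptimModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        # The LightningModule property is the single source of truth.
        self.automatic_optimization = False

    def training_step(self, batch, batch_idx):
        # Before this PR, trainer internals kept a separate copy on
        # self.trainer.train_loop.automatic_optimization; after it, the
        # module property below is the only place the flag lives.
        if not self.automatic_optimization:
            opt = self.optimizers()
            loss = self.compute_loss(batch)  # hypothetical helper
            opt.zero_grad()
            self.manual_backward(loss)
            opt.step()
```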

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the Review guidelines. In short, see the following bullet list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

Make sure you had fun coding 🙃

@pep8speaks commented Apr 21, 2021

Hello @ananthsub! Thanks for updating this PR.

Line 719:13: W503 line break before binary operator

Comment last updated at 2021-04-26 05:15:17 UTC
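
(For reference, W503 flags a line break placed before a binary operator; a hypothetical illustration, not the actual code at line 719:)

```python
# Flagged by W503: the break comes before the operator
total = (first_term
         + second_term)

# Not flagged: break after the operator (the W504 style instead)
total = (first_term +
         second_term)
```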

@ananthsub added this to the v1.3 milestone on Apr 21, 2021
codecov bot commented Apr 21, 2021

Codecov Report

Merging #7130 (4a414dd) into master (44d775f) will decrease coverage by 4%.
The diff coverage is 100%.

@@           Coverage Diff           @@
##           master   #7130    +/-   ##
=======================================
- Coverage      91%     87%    -4%     
=======================================
  Files         198     198            
  Lines       12635   12632     -3     
=======================================
- Hits        11541   11008   -533     
- Misses       1094    1624   +530     

@awaelchli added the ready label (PRs ready to be merged) on Apr 21, 2021
@SeanNaren (Contributor) left a comment:

thanks @ananthsub :)

@Borda enabled auto-merge (squash) on April 21, 2021, 23:39
@tchaton (Contributor) left a comment:

Really neat!

@Borda merged commit 68eac4d into Lightning-AI:master on Apr 26, 2021
@ananthsub deleted the simplify-auto-opt-module branch on April 26, 2021, 05:43
Labels: ready (PRs ready to be merged), refactor
7 participants