Improve nn.scan docs #2839

Merged 1 commit into google:main on Mar 2, 2023
Conversation

@cgarciae (Collaborator) commented on Feb 2, 2023

What does this PR do?

Fixes #2754. Improves the scan documentation in the following ways:

  • Improves the current LSTM example to make it more readable (a simplified sketch of scanning a cell over time follows this list).
  • Adds a new example showing how to scan over layers.
  • Adds a reference to remat_scan.
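
For readers unfamiliar with the pattern the LSTM example demonstrates, here is a minimal sketch of scanning a recurrent cell over the time axis with nn.scan. It swaps in a hand-written SimpleCell instead of nn.LSTMCell so the sketch does not depend on a particular LSTMCell constructor signature (which has changed across Flax versions); SimpleCell and SimpleRNN are illustrative names, not part of this PR.

    import jax
    import jax.numpy as jnp
    import flax.linen as nn


    class SimpleCell(nn.Module):
      """A toy recurrent cell standing in for nn.LSTMCell in this sketch."""
      features: int

      @nn.compact
      def __call__(self, carry, x):
        # New carry: a Dense projection of [carry, x] followed by tanh.
        h = nn.tanh(nn.Dense(self.features)(jnp.concatenate([carry, x], axis=-1)))
        # Return (new_carry, per-step output).
        return h, h


    class SimpleRNN(nn.Module):
      features: int

      @nn.compact
      def __call__(self, xs):
        # Lift SimpleCell with nn.scan so it is applied once per time step:
        # - variable_broadcast='params' shares the same weights across steps,
        # - split_rngs={'params': False} uses one init RNG for those shared weights,
        # - in_axes=1 / out_axes=1 scan over the time axis of (batch, time, features).
        ScanCell = nn.scan(
            SimpleCell,
            variable_broadcast='params',
            split_rngs={'params': False},
            in_axes=1,
            out_axes=1,
        )
        carry = jnp.zeros((xs.shape[0], self.features))
        carry, ys = ScanCell(self.features)(carry, xs)
        return ys


    xs = jnp.ones((2, 5, 3))                                        # (batch, time, features)
    variables = SimpleRNN(features=4).init(jax.random.PRNGKey(0), xs)
    ys = SimpleRNN(features=4).apply(variables, xs)
    print(ys.shape)                                                 # (2, 5, 4)

Because the parameters are broadcast rather than scanned, the cell's weights appear once in the variable dict and are reused at every time step.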

@cgarciae force-pushed the fix-2754 branch 2 times, most recently from 1c37982 to 1b3af66, on February 2, 2023 at 18:35
@codecov-commenter commented on Feb 2, 2023

Codecov Report

Merging #2839 (a5f4e6f) into main (dfb55c4) will decrease coverage by 0.01%.
The diff coverage is 0.00%.

@@            Coverage Diff             @@
##             main    #2839      +/-   ##
==========================================
- Coverage   81.37%   81.36%   -0.01%     
==========================================
  Files          53       53              
  Lines        5717     5714       -3     
==========================================
- Hits         4652     4649       -3     
  Misses       1065     1065              
Impacted Files             Coverage Δ
flax/jax_utils.py          42.85% <0.00%> (ø)
flax/linen/transforms.py   94.14% <ø> (ø)
flax/linen/linear.py       97.54% <0.00%> (-0.02%) ⬇️
flax/linen/__init__.py     100.00% <0.00%> (ø)
flax/linen/recurrent.py    100.00% <0.00%> (ø)


@cgarciae changed the title from "add scan over layers example" to "Improve nn.scan docs" on Feb 2, 2023
A reviewer commented on this excerpt from the new scan-over-layers example:

    ... h = nn.relu(h)
    ... return x + h, None
    ...
    >>> class RecidualMLP(nn.Module):

It seems that there is a typo in RecidualMLP.

@cgarciae (Collaborator, Author) replied:

Yes! Thanks :)
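
For context, here is a minimal sketch of what the corrected scan-over-layers example plausibly looks like once RecidualMLP is renamed to ResidualMLP. The ResidualMLPBlock/ResidualMLP names, shapes, and nn.scan arguments are reconstructed from the excerpt above and the standard scan-over-layers pattern, not quoted from the merged diff.

    import jax
    import jax.numpy as jnp
    import flax.linen as nn


    class ResidualMLPBlock(nn.Module):
      """One residual block; the second argument is an unused scan input."""

      @nn.compact
      def __call__(self, x, _):
        h = nn.Dense(features=x.shape[-1])(x)
        h = nn.relu(h)
        # Return (carry, per-step output); no per-layer output is collected here.
        return x + h, None


    class ResidualMLP(nn.Module):
      n_layers: int = 4

      @nn.compact
      def __call__(self, x):
        # Lift the block with nn.scan so it is applied n_layers times:
        # - variable_axes={'params': 0} stacks each layer's weights along axis 0,
        # - split_rngs={'params': True} gives every layer its own init RNG,
        # - length=self.n_layers sets the number of scan iterations.
        ScanMLP = nn.scan(
            ResidualMLPBlock,
            variable_axes={'params': 0},
            variable_broadcast=False,
            split_rngs={'params': True},
            length=self.n_layers,
        )
        x, _ = ScanMLP()(x, None)
        return x


    x = jnp.ones((1, 3))
    variables = ResidualMLP().init(jax.random.PRNGKey(0), x)
    y = ResidualMLP().apply(variables, x)
    print(y.shape)  # (1, 3)

Unlike the time-scan case, here the parameters are scanned rather than broadcast, so each layer's kernel is stored with a leading axis of size n_layers.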

@copybara-service bot merged commit be3c846 into google:main on Mar 2, 2023
Development

Successfully merging this pull request may close these issues:

Initialization of Submodules Lifted with flax.nn.scan (#2754)
4 participants