Conversation

@liangel-02 liangel-02 commented Nov 24, 2025

This PR adds a tutorial for the variable length attention API.

pytorch-bot bot commented Nov 24, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/tutorials/3660

Note: Links to docs will display an error until the docs builds have been completed.

⏳ No Failures, 30 Pending

As of commit 935d96d with merge base 00d8b24:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the cla signed label Nov 24, 2025
@liangel-02 liangel-02 force-pushed the varlen_tutorial branch 3 times, most recently from a6cb078 to 19a4ad9 Compare November 24, 2025 21:59
@liangel-02 liangel-02 requested a review from drisspg November 25, 2025 01:42
drisspg commented Nov 25, 2025

@svekars any tips for reviewing this code in a more human-friendly way?

@liangel-02 liangel-02 marked this pull request as ready for review December 1, 2025 19:53
@liangel-02 liangel-02 force-pushed the varlen_tutorial branch 3 times, most recently from 46751dd to 38867c9 Compare December 1, 2025 21:10
@svekars svekars left a comment


Can you please submit this as a .py - all our tutorials are either .py or .rst and .py tutorials are converted into html and .ipynb on build. You can follow this template: https://github.com/pytorch/tutorials/blob/main/beginner_source/template_tutorial.py


svekars commented Dec 1, 2025

There is more info here: https://github.com/pytorch/tutorials#contributing and there is a script that will help you convert your .ipynb to .py

@svekars svekars left a comment


Looking good overall. This needs to be added to compilers_index.rst, and I added some suggestions that would help make it more discoverable on the website/Google. You can also add it to the "What's new in the PyTorch tutorials" section in index.rst.

@svekars svekars added the 2.10 (2.10 PyTorch release) label Dec 2, 2025

svekars commented Dec 2, 2025

There is also an error here: https://github.com/pytorch/tutorials/actions/runs/19864783846/job/56924391840?pr=3660#step:9:4582 - let me check how we should test against 2.10.

@liangel-02 liangel-02 force-pushed the varlen_tutorial branch 4 times, most recently from aac18ab to 25db45f Compare December 2, 2025 19:09
@liangel-02 liangel-02 requested a review from svekars December 2, 2025 19:10

svekars commented Jan 5, 2026

@AlannaBurke can you please review this 2.10 tutorial.


svekars commented Jan 6, 2026

@liangel-02 liangel-02 force-pushed the varlen_tutorial branch 2 times, most recently from 057a641 to 52410a5 Compare January 7, 2026 16:53
# return_aux: AuxRequest | None = None,
# ) -> torch.Tensor | tuple[torch.Tensor, torch.Tensor]:

# ``query``, ``key``, and ``value`` correspond to the ``q``, ``k``, and
# ``v`` tensors of the attention computation.

# is_causal: bool = False,
# return_aux: AuxRequest | None = None,
# ) -> torch.Tensor | tuple[torch.Tensor, torch.Tensor]:



# and ``max_k`` are the maximum sequence lengths of query and key,
# respectively. ``is_causal`` applies causal masking if set to ``True``, and
# ``return_aux`` specifies which auxiliary outputs to return (i.e., ``lse``).
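The quoted snippet describes ``cu_seq`` and ``max_q``/``max_k`` parameters. As a minimal sketch of how those inputs could be built from per-sequence lengths (``make_cu_seq`` is a hypothetical helper for illustration, not part of the PyTorch API):

```python
import torch

def make_cu_seq(seq_lens):
    # cu_seq has batch_size + 1 entries: [0, len0, len0+len1, ...],
    # marking where each packed sequence starts and ends.
    cu_seq = torch.zeros(len(seq_lens) + 1, dtype=torch.int32)
    cu_seq[1:] = torch.cumsum(torch.tensor(seq_lens, dtype=torch.int32), dim=0)
    # The max length is what the varlen kernel uses to size its work.
    return cu_seq, max(seq_lens)

cu_seq, max_len = make_cu_seq([3, 5, 2])
print(cu_seq.tolist(), max_len)  # → [0, 3, 8, 10] 5
```

The same helper would be applied separately to query and key lengths to obtain ``cu_seq_q``/``max_q`` and ``cu_seq_k``/``max_k``.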



# Variable length attention handles sequences of varying length by
# **packing** the tensors in a batch together and essentially collapsing
# the batch dimension.
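The packing idea in the snippet above can be sketched in plain PyTorch: sequences of different lengths are concatenated along the sequence dimension with no padding, and the cumulative offsets recover each one (illustrative only; the actual tutorial API operates on such packed tensors directly):

```python
import torch

# Three "sequences" of different lengths, each of shape [seq_len, dim]
dim = 4
seqs = [torch.randn(n, dim) for n in (3, 5, 2)]

# Pack: concatenate along the sequence dimension, collapsing the
# batch dimension. No padding tokens are introduced.
packed = torch.cat(seqs, dim=0)          # shape [10, dim]
cu_seq = torch.tensor([0, 3, 8, 10])     # boundaries of each sequence

# Sequence i is recovered by slicing with its cu_seq offsets.
seq1 = packed[cu_seq[1]:cu_seq[2]]
assert torch.equal(seq1, seqs[1])
print(packed.shape)  # → torch.Size([10, 4])
```

Compared with padding every sequence to the longest length, this layout wastes no memory or compute on padding positions.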

@svekars svekars Jan 7, 2026


Yes, the "Variable length attention" part got fixed, but the paragraph below still shows as a code block: https://docs-preview.pytorch.org/pytorch/tutorials-nightly-preview/3660/intermediate/variable_length_attention_tutorial.html#overview-of-variable-length-attention

Need to make sure every empty line starts with a #. If you add an empty line, the new line needs to have ###### and then your text. The figure below is not showing either. Here is more about sphinx-gallery syntax: https://sphinx-gallery.github.io/stable/syntax.html#embed-rest-in-your-example-python-files
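To illustrate the comment rules described above, here is a minimal sphinx-gallery tutorial skeleton (illustrative; see the sphinx-gallery docs linked above for the authoritative syntax):

```python
"""
Variable Length Attention Tutorial
==================================

The module docstring at the top becomes the page title and intro text.
"""

######################################################################
# A new text block after code starts with a full line of ``#`` characters,
# and every following line of the block is prefixed with ``# ``.
#
# Even "empty" lines inside a text block keep the leading ``#`` so they
# are not rendered as code.

x = 1 + 1  # ordinary statements render as executable code cells
print(x)
```

On build, such a file is converted into both HTML and an .ipynb notebook, with the ``#``-prefixed blocks rendered as prose between code cells.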
