
Add tutorial about inductor caching #2952

Merged
1 commit merged into main on Jun 20, 2024
Conversation


pytorch-bot bot commented Jun 20, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/tutorials/2952

Note: Links to docs will display an error until the docs builds have been completed.

❗ 1 Active SEV

There is 1 currently active SEV. If your PR is affected, please view it below:

✅ No Failures

As of commit 985d438 with merge base f2b8a1b:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

svekars (Contributor) commented Jun 20, 2024

Approved from the publishing perspective, but someone with technical expertise should approve as well.

Introduction
------------------

PyTorch Inductor implements several caches to reduce compilation latency. These caches are transparent to the user.


nit: Is "These caches are transparent to the user" relevant here? I'd just remove it.


You could say something like: These caching mechanisms operate seamlessly, requiring no user intervention.
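
For context, the caching described here requires no code changes on the user's side. A minimal sketch of the user-facing behavior (the function and tensor shapes are arbitrary examples, not from the tutorial):

```python
# Minimal sketch: Inductor's caches are transparent to the user.
# The first call to a compiled function pays the compilation cost;
# with the FX graph cache enabled, even a fresh process can reuse
# cached artifacts instead of recompiling.
import torch

@torch.compile
def fn(x, y):
    return (x + y).relu()

a, b = torch.randn(8, 8), torch.randn(8, 8)
fn(a, b)  # first call: triggers compilation
fn(a, b)  # subsequent calls: served from cache, no recompilation
```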


TORCHINDUCTOR_FX_GRAPH_CACHE
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
This setting enables the local FX graph cache feature, i.e., by storing artifacts on the host’s temp directory. ``1`` enables, and any other value disables it. By default, the disk location is per username, but users can enable sharing across usernames by specifying ``TORCHINDUCTOR_CACHE_DIR`` (below).


on -> in
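
A sketch of how this setting might be used, assuming the environment variables are read when torch is imported (the cache directory path is a made-up example):

```python
# Sketch: enabling the local FX graph cache via environment variables.
# Set them before importing torch so Inductor picks them up.
import os
os.environ["TORCHINDUCTOR_FX_GRAPH_CACHE"] = "1"  # "1" enables; any other value disables
# Optional: point the cache at a shared directory so artifacts can be
# reused across usernames ("/tmp/shared_inductor_cache" is an example path).
os.environ["TORCHINDUCTOR_CACHE_DIR"] = "/tmp/shared_inductor_cache"

import torch

@torch.compile
def fn(x):
    return torch.sin(x) + torch.cos(x)

fn(torch.randn(4, 4))  # compiled artifacts land in the cache directory
```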


TORCHINDUCTOR_FX_GRAPH_REMOTE_CACHE
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
This setting enables the remote FX graph cache feature. The current implementation uses Redis. ``1`` enables caching, and any other value disables. The following environment variables configure the host and port of the Redis server:


disables it
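
A sketch of the remote-cache setup, assuming a reachable Redis server and the ``TORCHINDUCTOR_REDIS_HOST`` / ``TORCHINDUCTOR_REDIS_PORT`` variables used in the PyTorch source (the hostname below is hypothetical):

```python
# Sketch: enabling the Redis-backed remote FX graph cache.
# Assumes a Redis server is reachable at the given host and port.
import os
os.environ["TORCHINDUCTOR_FX_GRAPH_REMOTE_CACHE"] = "1"  # "1" enables
os.environ["TORCHINDUCTOR_REDIS_HOST"] = "redis.example.com"  # hypothetical host
os.environ["TORCHINDUCTOR_REDIS_PORT"] = "6379"               # default Redis port

import torch

@torch.compile
def fn(x):
    return x @ x.t()

fn(torch.randn(16, 16))  # cache entries can now be shared across machines
```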

ghstack-source-id: 2c70215fd150e2a16abc55fe1ff8b1e7639e50c9
Pull Request resolved: #2951
oulgen merged commit 3b97695 into main Jun 20, 2024
20 checks passed
ignaciobartol pushed a commit to ignaciobartol/tutorials that referenced this pull request Jun 24, 2024