
Adding a memory profiling example #41

Merged: 6 commits merged into albanD:main on Jun 29, 2022

Conversation

@sanketpurandare (Contributor) commented Jun 21, 2022

This example shows how to use torch_dispatch for memory profiling. It demonstrates two ways:

  1. Profiling the forward and backward pass of a model
  2. Profiling an arbitrary piece of code

It also provides utilities to plot the memory profile and compare profiles (a minimal sketch of the approach follows below).

[Attached plots: AlbertForMaskedLM_mem_usage, autograd_mem_usage]
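
For orientation, here is a minimal sketch of the general technique. This is not the PR's code: it assumes PyTorch's TorchDispatchMode and the CUDA allocator's memory_allocated counter, and the class and variable names are made up.

import torch
from torch.utils._python_dispatch import TorchDispatchMode

class MemProfileMode(TorchDispatchMode):
    """Record (op name, bytes currently allocated) after each dispatched op."""

    def __init__(self):
        super().__init__()
        self.mem_usage = []

    def __torch_dispatch__(self, func, types, args=(), kwargs=None):
        out = func(*args, **(kwargs or {}))
        if torch.cuda.is_available():
            # Snapshot the allocator state right after the op ran
            self.mem_usage.append((str(func), torch.cuda.memory_allocated()))
        return out

with MemProfileMode() as prof:
    x = torch.randn(128, 128, requires_grad=True)
    (x @ x).relu().sum().backward()

print(prof.mem_usage[:5])

Because backward() runs while the mode is active, the backward ops are recorded too, which is what makes this approach useful for profiling both passes.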

@sanketpurandare changed the title from "Adding a new example" to "Adding a memory profiling example" on Jun 21, 2022
@albanD (Owner) commented Jun 22, 2022

Hey!
For the CI to run, you can add your new dependencies to requirements.txt (torch nightly is always installed and doesn't need to be listed).
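
For illustration, the additions might look like the two lines below. This is hypothetical: matplotlib and transformers are guesses based on the plotting utilities and the AlbertForMaskedLM example mentioned in the description.

matplotlib
transformers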

def reduce_to_scalar_loss(inp):
    return inp.sum()

from functorch.compile import aot_module, nop, print_compile, min_cut_rematerialization_partition


Can you move this to the top? Also try running the black formatter, just for consistent styling.

@albanD (Owner) replied:

The lint should already check that everything is black-compliant:

include_patterns = ['**/*.py']

Since it passes on the PR, I guess it is good.
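
As context for the snippet under review, here is a rough sketch of how these imports are typically wired together (assumed usage, not the PR's actual code): aot_module compiles the module so the backward graph is also materialized, and reduce_to_scalar_loss provides a scalar to call backward() on.

import torch
from functorch.compile import aot_module, nop

def reduce_to_scalar_loss(inp):
    return inp.sum()

mod = torch.nn.Linear(8, 8)
# `nop` is a pass-through compiler: it returns the traced graphs unchanged,
# which is handy when you only want the tracing side effects, not codegen.
compiled = aot_module(mod, fw_compiler=nop, bw_compiler=nop)
out = compiled(torch.randn(4, 8))
reduce_to_scalar_loss(out).backward()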

@albanD (Owner) left a comment:

Nice utility!


def __torch_dispatch__(self, func, types, args=..., kwargs=None):

global mem_usage, operator_names
@albanD (Owner) commented:

You don't need this, since you only write into these dictionaries.
Same for the other functions below, except the ones that actually reassign mem_usage.
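
To illustrate the review point (not code from the PR; record and reset_profile are made-up names): global is only required when a function rebinds the module-level name, not when it mutates the dict in place.

mem_usage = {}

def record(op_name, nbytes):
    mem_usage[op_name] = nbytes  # in-place write into the dict: no `global` needed

def reset_profile():
    global mem_usage  # rebinding the name itself: `global` is required
    mem_usage = {}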

@sanketpurandare (Contributor, Author) replied:

Cleaned up global usage

@albanD (Owner) commented Jun 22, 2022

Could you add the new requirements so that we can merge this?

@sanketpurandare (Contributor, Author) replied:

> Could you add the new requirements so that we can merge this?

Added them

@albanD (Owner) commented Jun 23, 2022

Python builtin modules don't need to be in there ;)

@anijain2305 commented:
@albanD Can we merge this PR if it looks good?

@Chillee merged commit cba710e into albanD:main on Jun 29, 2022
4 participants