
Move SparseGPTModifier location with backwards compatibility #919

Draft · kylesayrs wants to merge 4 commits into main from kylesayrs/move-sparsegptq
Conversation

@kylesayrs kylesayrs commented Nov 16, 2024

Purpose

  • Move SparseGPTModifier to a more logical folder, reducing confusion around the name "optimal brain compression/quantization (obcq)", which currently describes a sparsity-only algorithm but is not used to describe the GPTQ quantization algorithm

Changes

  • Moved SparseGPTModifier from `llmcompressor.modifiers.obcq` to `llmcompressor.modifiers.pruning.sparsegpt`
  • Left a deprecation warning and a backwards-compatible import (a fuller shim sketch follows the snippet below):
```python
import warnings

warnings.warn(
    "llmcompressor.modifiers.obcq has been moved to "
    "llmcompressor.modifiers.pruning.sparsegpt. Please update your paths",
    DeprecationWarning,
)
```
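
For reference, a minimal sketch of how such a shim module could be laid out (the file path and re-export below are assumptions for illustration, not the PR's verbatim code):

```python
# Hypothetical layout for llmcompressor/modifiers/obcq/__init__.py
import warnings

# Re-export from the new location so existing imports keep working.
from llmcompressor.modifiers.pruning.sparsegpt import SparseGPTModifier  # noqa: F401

warnings.warn(
    "llmcompressor.modifiers.obcq has been moved to "
    "llmcompressor.modifiers.pruning.sparsegpt. Please update your paths",
    DeprecationWarning,
)

__all__ = ["SparseGPTModifier"]
```

With a shim like this, `from llmcompressor.modifiers.obcq import SparseGPTModifier` continues to resolve, while the warning fires once on first import of the legacy path.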

TODO

  • Rename/fix bugs in tests

Testing

  • Grepped the codebase for stale imports: `grep -r 'import SparseGPTModifier' src tests examples`
  • On-commit tests
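
A small test along these lines could lock in the backwards-compatible behavior (hypothetical test, not part of this PR; assumes pytest and the shim sketched above):

```python
import importlib

import pytest


def test_old_obcq_import_still_works():
    # Importing from the legacy path should emit a DeprecationWarning
    # and still expose SparseGPTModifier.
    with pytest.warns(DeprecationWarning, match="has been moved"):
        import llmcompressor.modifiers.obcq as obcq
        importlib.reload(obcq)  # re-trigger the module-level warning
    assert hasattr(obcq, "SparseGPTModifier")
```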

@kylesayrs self-assigned this on Nov 16, 2024

👋 Hi! Thank you for contributing to llm-compressor. Please add the ready label when the PR is ready for review.

@kylesayrs force-pushed the kylesayrs/move-sparsegptq branch from 9d764b8 to a241804 on November 18, 2024 at 19:29
@kylesayrs (Collaborator, Author) commented:

Leaving this in draft until other things land, so as to avoid merge conflicts.
