
Add Scheme UX for QuantizationModifier #9

Merged (3 commits) on Jun 27, 2024
Conversation

@Satrat (Contributor) commented Jun 25, 2024

Previously, only GPTQModifier supported scheme presets, e.g. GPTQModifier(targets="Linear", scheme="FP8"). This PR moves the scheme-resolution logic into QuantizationModifier, so we can now call QuantizationModifier(targets="Linear", scheme="FP8") directly rather than spelling out the full config_groups parameter.
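A minimal sketch of the scheme-resolution idea this PR describes. The names here (`resolve_scheme`, `PRESET_SCHEMES`, and the preset contents) are hypothetical illustrations, not the actual llm-compressor API; the real modifier resolves presets into its `config_groups` internally:

```python
# Hypothetical preset table mapping a scheme name to quantization args.
# The actual presets live inside llm-compressor / compressed-tensors.
PRESET_SCHEMES = {
    "FP8": {
        "weights": {"num_bits": 8, "type": "float"},
        "input_activations": {"num_bits": 8, "type": "float"},
    },
    "W8A8": {
        "weights": {"num_bits": 8, "type": "int"},
        "input_activations": {"num_bits": 8, "type": "int"},
    },
}

def resolve_scheme(scheme: str, targets: str) -> dict:
    """Expand a preset scheme name into a config_groups-style dict,
    as QuantizationModifier(targets=..., scheme=...) might do internally."""
    if scheme not in PRESET_SCHEMES:
        raise ValueError(f"Unknown scheme preset: {scheme!r}")
    return {"group_0": {"targets": [targets], **PRESET_SCHEMES[scheme]}}

# QuantizationModifier(targets="Linear", scheme="FP8") would then be
# roughly equivalent to passing this resolved config_groups dict:
config_groups = resolve_scheme("FP8", "Linear")
```

The convenience here is that users name a preset once instead of hand-writing the per-group weight and activation settings.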

@Satrat Satrat merged commit ac6c027 into main Jun 27, 2024
7 of 12 checks passed
@Satrat Satrat deleted the sa/qm_scheme_support branch June 27, 2024 14:16
markmc pushed a commit to markmc/llm-compressor that referenced this pull request Nov 13, 2024
1 participant