
additional fixes for HFQuantizer compatibility #136

Closed · wants to merge 3 commits

Conversation

bfineran (Contributor)

Adds handling for assigning values to meta parameters on load and for deserializing quantization configs.

@bfineran bfineran requested a review from Satrat August 20, 2024 17:00
@bfineran bfineran self-assigned this Aug 20, 2024
horheynm previously approved these changes Aug 20, 2024
```diff
@@ -205,6 +205,10 @@ def parse_quantization_config(
     if hasattr(compression_config, QUANTIZATION_CONFIG_NAME):
         # for loaded HFQuantizer config
         return getattr(compression_config, QUANTIZATION_CONFIG_NAME)
+    elif isinstance(compression_config, dict) and (
```
Review comment (Contributor):
already merged
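For readers following along, the hunk above distinguishes two shapes the incoming config can take: an HFQuantizer config object that carries the quantization config as an attribute, or a plain dict deserialized from disk. Below is a minimal sketch of that dispatch; the `"quantization_config"` key and the completion of the truncated dict branch are assumptions for illustration, not the PR's exact code.

```python
# Illustrative sketch of parse_quantization_config's dispatch.
# Assumption: QUANTIZATION_CONFIG_NAME is the "quantization_config" key;
# the dict-branch condition is completed here only for illustration,
# since the hunk above is truncated.
QUANTIZATION_CONFIG_NAME = "quantization_config"

def parse_quantization_config(compression_config):
    if hasattr(compression_config, QUANTIZATION_CONFIG_NAME):
        # loaded HFQuantizer config object: the quantization config is
        # attached as an attribute, so unwrap and return it
        return getattr(compression_config, QUANTIZATION_CONFIG_NAME)
    elif isinstance(compression_config, dict) and (
        QUANTIZATION_CONFIG_NAME in compression_config
    ):
        # raw dict deserialized from config.json: index by key instead
        return compression_config[QUANTIZATION_CONFIG_NAME]
    return None
```

Either a config object exposing a `quantization_config` attribute or a dict containing a `"quantization_config"` key resolves to the same nested config; anything else falls through to `None`.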

@kylesayrs kylesayrs assigned kylesayrs and unassigned bfineran Oct 2, 2024
@kylesayrs kylesayrs marked this pull request as draft October 4, 2024 17:02
markurtz (Member) commented Oct 18, 2024

@kylesayrs @dsikka is this diff still needed and in progress given the landing of HF quantizer support?

```diff
-    parameter.data = new_param_data.to(device).to(dtype)
+    try:
+        parameter.data = new_param_data.to(device).to(dtype)
+    except RuntimeError:
```
Review comment (Contributor):
This is better handled by #193

kylesayrs (Contributor)

All the changes here have already been made or have open PRs to fix them.

@kylesayrs kylesayrs closed this Oct 22, 2024
4 participants