
Conversation

metascroy
Contributor

Adds utility to create an optimizer like this:

quant_configs_and_filter_fns = [
    (QuantConfig(bitwidth=2, group_size=128), use_2bit),
    (QuantConfig(bitwidth=3, group_size=256), use_3bit),
    (QuantConfig(bitwidth=4, group_size=256), use_4bit),
    (QuantConfig(bitwidth=8), use_8bit),
]

optimizer = create_optimizer(model, quant_configs_and_filter_fns, base_optimizer_cls, base_optimizer_kwargs)

The filter functions are (module, fqn) -> bool, e.g.,

def use_2bit(m, fqn):
    if not fqn.endswith(".weight"):
        return False

    # Layers 0-3
    for i in range(4):
        if fqn.startswith(f"model.layers.{i}."):
            for key in ["v_proj", "gate_proj", "up_proj"]:
                if key in fqn:
                    return True

    # Layers 4-31
    for i in range(4, 32):
        if fqn.startswith(f"model.layers.{i}."):
            for key in ["q_proj", "v_proj", "gate_proj", "up_proj", "down_proj"]:
                if key in fqn:
                    return True

    return False
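For illustration, here is a minimal sketch of how such filter functions could be used to bucket parameter fqns into per-config groups. The names `QuantConfig` and `bucket_params` here are assumptions for the sketch, not the actual torchao implementation; fqns that match no filter fall into a trailing no-quant bucket, matching the behavior described above.

```python
# Sketch only (assumed names, not the actual torchao code): bucket
# parameter fqns by the first matching (config, filter_fn) pair; fqns
# that match no filter go to a trailing no-quant bucket.
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class QuantConfig:
    bitwidth: int
    group_size: Optional[int] = None


def use_2bit(m, fqn):
    # Example filter: 2-bit for v_proj weights in layers 0-3
    return fqn.endswith(".weight") and any(
        fqn.startswith(f"model.layers.{i}.") and "v_proj" in fqn
        for i in range(4)
    )


def bucket_params(fqns, configs_and_filter_fns):
    buckets = {cfg: [] for cfg, _ in configs_and_filter_fns}
    buckets[None] = []  # no-quant bucket kept last
    for fqn in fqns:
        for cfg, filter_fn in configs_and_filter_fns:
            if filter_fn(None, fqn):  # module arg unused in this sketch
                buckets[cfg].append(fqn)
                break
        else:
            buckets[None].append(fqn)
    return buckets
```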


pytorch-bot bot commented Oct 13, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/3165

Note: Links to docs will display an error until the docs builds have been completed.

❗ 2 Active SEVs

There are 2 currently active SEVs. If your PR is affected, please view them below:

✅ No Failures

As of commit 9b1df40 with merge base c96f2dd:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Oct 13, 2025
@metascroy metascroy requested a review from lisjin October 13, 2025 16:47
@lisjin (Contributor) left a comment:


Looks great! I just had a few nits that might not be important

Comment on lines +46 to +48
# Non-quantized group at end so that index in param_groups
# is the index in the subset of quantized param groups, which is
# used in defining group_quantizer_map

Standardizing the indices this way is a great idea
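A toy sketch of that invariant (assumed layout, not the merged code): with the non-quantized group last, an index `i` into `param_groups` doubles as the index into the quantized configs, so `group_quantizer_map` can be keyed directly by group index.

```python
# Toy sketch of the ordering invariant (assumed names, not the actual
# torchao code): quantized groups come first, the no-quant group is last,
# so param_groups[i] for i < len(quant_configs) aligns with quant_configs[i].
quant_configs = ["2bit/g128", "3bit/g256", "4bit/g256", "8bit"]
param_groups = [{"config": c, "params": []} for c in quant_configs]
param_groups.append({"config": None, "params": []})  # no-quant group at end

group_quantizer_map = {i: quant_configs[i] for i in range(len(quant_configs))}
```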


def create_optimizer(
    model,
    quant_configs_and_filter_fns,

Should we add a type hint here?
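One way the suggested hint could look (the parameter names mirror the PR; everything else here is an assumption, not the signature that was merged):

```python
# Hypothetical annotated signature for the reviewer's suggestion; only
# the parameter names come from the PR, the types are an assumption.
from typing import Callable, List, Optional, Tuple

FilterFn = Callable[[object, str], bool]  # (module, fqn) -> bool


def create_optimizer(
    model: object,
    quant_configs_and_filter_fns: List[Tuple[object, FilterFn]],
    base_optimizer_cls: Optional[type] = None,
    base_optimizer_kwargs: Optional[dict] = None,
):
    """Stub showing the annotations only; the real body builds param groups."""
    return None
```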


# If no match, add to no-quant group at last idx
if matching_config is None:
    print("NONE")

Is this for debugging?

@metascroy (Contributor, Author) replied:

It prints the config chosen for each parameter (bitwidth and group_size, or NONE).

@lisjin lisjin added the topic: improvement Use this tag if this PR is an improvement (doesn't fit into any of the other categories) label Oct 13, 2025
@metascroy metascroy merged commit 8878f30 into main Oct 14, 2025
18 checks passed
