Add GPTQ Gradual Activation Quantization #1210

Merged
merged 6 commits into main on Sep 17, 2024

Conversation

@irenaby (Collaborator) commented on Sep 11, 2024

Pull Request Description:

  • Add GradualActivationQuantizerWrapper (a minimal sketch of the idea follows this list).
  • Add a gradual_activation_quantization option to the GPTQ config, with a configurable linear annealing schedule.
  • Extract the linear annealing scheduler into the shared training infrastructure.
  • Align soft rounding regularization with gradual activation quantization usage.
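
Roughly, the mechanism can be sketched as follows. The class names, signatures, and the exact blending formula below are illustrative assumptions for this sketch, not the MCT API: a linear annealing scheduler moves a quantization fraction from an initial to a target value over training steps, and the wrapper blends the float activation with its fake-quantized counterpart using that fraction.

```python
import torch


class LinearAnnealingScheduler:
    """Illustrative scheduler: linearly interpolates a value from `initial`
    to `target` between steps `t_start` and `t_end` (hypothetical names)."""

    def __init__(self, t_start: int, t_end: int, initial: float, target: float):
        self.t_start, self.t_end = t_start, t_end
        self.initial, self.target = initial, target

    def __call__(self, t: int) -> float:
        if t <= self.t_start:
            return self.initial
        if t >= self.t_end:
            return self.target
        frac = (t - self.t_start) / (self.t_end - self.t_start)
        return self.initial + frac * (self.target - self.initial)


class GradualActivationQuantizerWrapper(torch.nn.Module):
    """Illustrative wrapper: blends the float activation with its
    (fake-)quantized version, so the effective activation quantization
    ramps up gradually during GPTQ fine-tuning."""

    def __init__(self, quantizer: torch.nn.Module,
                 factor_scheduler: LinearAnnealingScheduler):
        super().__init__()
        self.quantizer = quantizer
        self.factor_scheduler = factor_scheduler
        self.step = 0  # incremented on every forward call in this sketch

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        factor = self.factor_scheduler(self.step)
        self.step += 1
        # factor = 0 -> pure float activation; factor = 1 -> fully quantized.
        return (1.0 - factor) * x + factor * self.quantizer(x)
```

Under this sketch, annealing the fraction from 0 to 1 means fine-tuning starts with (nearly) float activations, letting the soft rounding regularization stabilize before the activation quantization noise is fully applied by the end of training.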

Checklist before requesting a review:

  • I set the appropriate labels on the pull request.
  • I have added/updated the release note draft (if necessary).
  • I have updated the documentation to reflect my changes (if necessary).
  • All functions and files are well documented.
  • All functions and classes have type hints.
  • There is a license in all files.
  • The function and variable names are informative.
  • I have checked for code duplications.
  • I have added new unit tests (if necessary).

@irenaby marked this pull request as ready for review on September 11, 2024, 13:15.
@irenaby merged commit d35e816 into main on Sep 17, 2024.
35 checks passed.