
convert-gptq-to-ggml: KeyError: 'model.layers.0.self_attn.q_proj.zeros' #954

Closed
iMountTai opened this issue Apr 14, 2023 · 4 comments

@iMountTai
Cool job, but I get the following error after running convert-gptq-to-ggml.py:
[screenshot of the KeyError]
@iMountTai (Author)

I made the following changes:
[screenshot of the changes]
but:
[screenshot of a further error]

@prusnak (Collaborator) commented Apr 14, 2023

Try the new convert.py script that is now in master.

@iMountTai (Author)

@prusnak Thanks! But I now get this hint:

Error: Input uses the newer GPTQ-for-LLaMa format (using g_idx), which is not yet natively supported by GGML.  For now you can still convert this model by passing `--outtype f16` to dequantize, but that will result in a much larger output file for no quality benefit
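Following the hint in the error message, the conversion can still be run by passing the dequantize flag it suggests. This is a minimal sketch; the model path here is hypothetical and should point at your own GPTQ checkpoint directory:

```shell
# Dequantize the GPTQ-for-LLaMa checkpoint to f16 during conversion.
# "models/llama-7b-gptq" is a placeholder path; replace it with your model.
python convert.py models/llama-7b-gptq --outtype f16
```

As the message warns, the f16 output is much larger than a quantized file; it can typically be re-quantized afterwards with llama.cpp's own quantize tool.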

@github-actions github-actions bot added the stale label Mar 25, 2024

This issue was closed because it has been inactive for 14 days since being marked as stale.
