ChatGLM3: error when loading a checkpoint after full-parameter fine-tuning #1340
Comments
After fine-tuning with LoRA and merging, loading also throws this error. Currently it only runs after I delete the several `*_token` entries in the tokenizer_config.json under the merged directory.
After deleting them, this error appears: `assert self.padding_side == "left" AssertionError` @AmeowCAT
You need to manually change `padding_side` in the tokenizer config to `left`.
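The two fixes above (dropping the `*_token` entries and forcing `padding_side` to `left`) can be applied to `tokenizer_config.json` in one pass. A minimal sketch, assuming the merged checkpoint lives in a hypothetical `merged_model/` directory; adjust the path to your own output:

```python
import json
from pathlib import Path

def clean_tokenizer_config(config: dict) -> dict:
    """Return a copy of the tokenizer_config.json contents with the
    fixes discussed in this thread applied:
    - drop keys like eos_token/pad_token/unk_token, which ChatGLM3's
      custom tokenizer refuses to accept as settable attributes
    - force padding_side to "left", which the tokenizer asserts
    """
    cleaned = {k: v for k, v in config.items() if not k.endswith("_token")}
    cleaned["padding_side"] = "left"
    return cleaned

# "merged_model" is a hypothetical path; point it at your merged directory.
path = Path("merged_model/tokenizer_config.json")
if path.exists():
    config = json.loads(path.read_text(encoding="utf-8"))
    path.write_text(
        json.dumps(clean_tokenizer_config(config), ensure_ascii=False, indent=2),
        encoding="utf-8",
    )
```

Back up the original file first; as noted below, removing these entries can interact badly with how the model was exported.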
@hiyouga Hello, does this project currently support ChatGLM2? After running it and exporting the model, I also get the `AttributeError: can't set attribute 'eos_token'` error. The tokenizer config contents are as follows:
The values of the following parameters cannot be set as attributes; you can comment them out to try for compatibility.
@CplusHua01 Retraining throws the error immediately. Is it because I trained ChatGLM2?
https://huggingface.co/THUDM/chatglm3-6b-32k/raw/main/tokenization_chatglm.py After re-exporting, check whether the exported tokenization_chatglm.py needs to be modified.
After deleting them, the fine-tuning no longer takes effect. Did I do something wrong?
```
tokenizer = AutoTokenizer.from_pretrained(model_file_path, trust_remote_code=True)
AttributeError: can't set attribute 'eos_token'
```