Model parallelism issue #82
Comments
Not solved yet, but a new LoRA training script (with multi-GPU support) will be released in a few days.
mymusise/ChatGLM-Tuning#59 (comment)
@yuanzhoulvpi2017 |
Are you still using the old model files? You need to use the latest ones.
You can now train the chatglm-6b model with model parallelism! Please follow the link to see Chatglm6b_ModelParallel. In the current version, although the loss decreases during training, the model fails to actually learn the content; I am still investigating this issue.
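As a rough illustration of what model parallelism means here, the following is a minimal PyTorch sketch, not the Chatglm6b_ModelParallel code itself: the model's layers are split across two devices, and activations are moved across the device boundary in `forward`. Layer sizes and device names are placeholders; with fewer than two GPUs, both halves fall back to the same device.

```python
import torch
import torch.nn as nn

# Placeholder devices: use two GPUs when available, otherwise fall back.
dev0 = torch.device("cuda:0" if torch.cuda.device_count() > 0 else "cpu")
dev1 = torch.device("cuda:1" if torch.cuda.device_count() > 1 else dev0)

class TwoStageModel(nn.Module):
    """Toy model split across two devices (hypothetical layer sizes)."""
    def __init__(self):
        super().__init__()
        self.stage1 = nn.Linear(16, 32).to(dev0)  # first half on device 0
        self.stage2 = nn.Linear(32, 4).to(dev1)   # second half on device 1

    def forward(self, x):
        h = self.stage1(x.to(dev0))
        # Move activations to the second device at the stage boundary.
        return self.stage2(h.to(dev1))

model = TwoStageModel()
out = model(torch.randn(8, 16))
print(out.shape)
```

Each stage only holds its own parameters, which is what lets a 6B-parameter model fit when no single GPU has enough memory on its own.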
Has this issue been resolved?