
model.save_pretrained("waimai_10k_bert") raises an error when saving the model #98

Open
haozaiiii opened this issue Jul 29, 2024 · 2 comments

Comments

@haozaiiii

model.save_pretrained("waimai_10k_bert")

ValueError: You are trying to save a non contiguous tensor: bert.encoder.layer.0.attention.self.query.weight which is not allowed. It either means you are trying to save tensors which are reference of each other in which case it's recommended to save only the full tensors, and reslice at load time, or simply call .contiguous() on your tensor to pack it before saving.
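The error message itself points at the fix: pack the offending tensors with `.contiguous()` before saving. A minimal sketch of that workaround, assuming a standard PyTorch/transformers model (the helper name `pack_parameters` is mine, not from the library):

```python
import torch

def pack_parameters(model):
    """Make every parameter contiguous in memory so the safetensors
    serializer used by save_pretrained can write them out."""
    for param in model.parameters():
        if not param.is_contiguous():
            # Repack the underlying storage; values are unchanged.
            param.data = param.data.contiguous()
    return model

# Usage (sketch):
# pack_parameters(model)
# model.save_pretrained("waimai_10k_bert")
```

An alternative some users report is disabling safetensors serialization entirely with `model.save_pretrained("waimai_10k_bert", safe_serialization=False)`, which falls back to the older pickle-based format; I have not verified that on this exact model.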

@haozaiiii (Author)

Which version of transformers are you using? Could you share your requirements.txt with the dependencies?

@duanfa commented Sep 25, 2024

@odunola499
[screenshot of the suggested fix]
This fixed a similar problem for me; give it a try.
