
Runtime error: two small bugs in MoRA/peft-mora/src/peft/tuners/lora/layer.py #17

Open
chen-c-alt opened this issue Aug 31, 2024 · 1 comment


@chen-c-alt

Proposed fixes:

1. Insert the following before line 261 of the source file:

```python
while pad_size > in_f:
    x = torch.cat([x, x[..., :]], dim=-1)
    pad_size -= in_f
```

This is because the case `pad_size > in_f` was not handled.
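To see the idea in isolation: `wrap_pad` below is a hypothetical standalone helper, not code from the repository (the real logic lives inside MoRA's layer.py). The `x[..., :in_f]` slice and the final partial pad are assumptions about the intended wrap-around padding; only the names `pad_size` and `in_f` come from the report.

```python
import torch

def wrap_pad(x: torch.Tensor, pad_size: int, in_f: int) -> torch.Tensor:
    # Wrap-around padding along the last dim: append whole copies of the
    # first in_f features while more than in_f columns are still missing...
    while pad_size > in_f:
        x = torch.cat([x, x[..., :in_f]], dim=-1)
        pad_size -= in_f
    # ...then one partial copy for the remainder.
    if pad_size > 0:
        x = torch.cat([x, x[..., :pad_size]], dim=-1)
    return x

x = torch.arange(3.0).view(1, 3)           # in_f = 3
padded = wrap_pad(x, pad_size=7, in_f=3)   # 3 + 3 + 3 + 1 columns
print(padded.shape)                        # torch.Size([1, 10])
```

Without the loop, a single `torch.cat([x, x[..., :pad_size]], dim=-1)` would try to slice more columns than `x` has whenever `pad_size > in_f`, which is the failure the reporter is describing.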

2. Replace the code at line 293 of the source file with:

```python
if out_x.numel() == 0:
    out_x = out_x.view(*x.shape[:-1], out_f)
else:
    out_x = out_x.view(*x.shape[:-1], -1)[..., :out_f]
```

This is because the case where `out_x` has zero elements was not handled.
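The edge case behind fix 2 can be reproduced outside the layer: `view(..., -1)` cannot infer a dimension for a tensor with 0 elements, so the empty branch must supply `out_f` explicitly. The shapes below (a batch dimension of 0) are hypothetical, chosen only to trigger the zero-element case.

```python
import torch

out_f = 4
x = torch.empty(0, 5)        # hypothetical zero-element activation (empty batch)
out_x = torch.empty(0, 3)

# On a 0-element tensor, view(*shape, -1) raises "cannot reshape tensor of
# 0 elements" because -1 is ambiguous, hence the reporter's numel() guard.
if out_x.numel() == 0:
    out_x = out_x.view(*x.shape[:-1], out_f)
else:
    out_x = out_x.view(*x.shape[:-1], -1)[..., :out_f]

print(out_x.shape)           # torch.Size([0, 4])
```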

@kongds
Owner

kongds commented Sep 1, 2024

Thanks for the feedback, but both situations you describe are quite special cases.

  1. If pad_size > in_f, then MoRA's r exceeds in_f. In that case there is probably little point in using MoRA at all.
  2. As for out_x having zero elements: out_x would then be a zero-sized tensor, which does not occur in normal use, and a zero-sized out_x would also break the subsequent operations.
