
'Embedding' object has no attribute 'device' for MPT #1082

Closed
junzhang-zj opened this issue Nov 6, 2023 · 11 comments · Fixed by #1144

Comments

@junzhang-zj

junzhang-zj commented Nov 6, 2023

I followed the test process of the MPT notebook, but encountered an error during target training: 'Embedding' object has no attribute 'device'. How can this be solved?

@junzhang-zj
Author

junzhang-zj commented Nov 6, 2023

Detail:
[screenshot of the error traceback]

junzhang-zj changed the title from 'Embedding' object has no attribute 'device' to 'Embedding' object has no attribute 'device' for MPT on Nov 6, 2023
@junzhang-zj
Author

I believe the bug is in 'self.word_embeddings = transformer_backbone.get_submodule(named_param.replace(".weight", ""))' in peft_model: the Embedding module returned there has no 'device' attribute, so something like 'self.word_embeddings.device = self.device' may need to be added.
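For reference, torch.nn.Embedding, like any nn.Module, does not expose a .device attribute; only its parameters do. A minimal PyTorch sketch of the failure mode:

```python
import torch.nn as nn

# nn.Embedding is an nn.Module; modules do not carry a .device attribute,
# only their parameters (here, .weight) do.
emb = nn.Embedding(num_embeddings=10, embedding_dim=4)

print(emb.weight.device)  # works, e.g. "cpu"

try:
    print(emb.device)
except AttributeError as e:
    print(e)  # 'Embedding' object has no attribute 'device'
```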

@BenjaminBossan
Member

I can reproduce the error. The solution in PEFT would be to change this line:

map_location=word_embeddings.device,

to map_location=word_embeddings.weight.device.
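Roughly, as a sketch (the function name and path argument below are placeholders, not the actual PEFT source; only the map_location expression reflects the suggested change):

```python
import torch

def load_embedding_init_state(path, word_embeddings):
    # word_embeddings is an nn.Embedding; it has no .device attribute,
    # so use the device of its .weight parameter as map_location instead.
    return torch.load(path, map_location=word_embeddings.weight.device)
```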

@junzhang-zj
Author

@BenjaminBossan Thanks!

@BenjaminBossan
Member

Let's keep this issue open, as we should fix this issue in PEFT :)

BenjaminBossan reopened this on Nov 6, 2023
@junzhang-zj
Author

> Let's keep this issue open, as we should fix this issue in PEFT :)
ok👌

BenjaminBossan added a commit to BenjaminBossan/peft that referenced this issue Nov 17, 2023
This is WIP.

I attempted to fix huggingface#1082. While adding tests for the bug, I discovered
that I could not get prompt_tuning_init != RANDOM to work. Maybe I'm
using it wrong, but I'm not sure what to change.

github-actions bot commented Dec 6, 2023

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

@evaline-ju

We have also observed this error when using MPT. The issue seems to have been auto-closed due to staleness, but I wanted to see if it could be reopened since it wasn't resolved.

@BenjaminBossan
Member

@evaline-ju There is a PR (#1144), but it's awaiting review. In the meantime, you could manually apply the fix as indicated above or install PEFT from the PR branch.

BenjaminBossan reopened this on Jan 9, 2024

github-actions bot commented Feb 2, 2024

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

BenjaminBossan added a commit that referenced this issue Feb 19, 2024
Resolves #1082.

Also, adding tests for prompt_tuning_init != RANDOM.

---------

Co-authored-by: Mayank Mishra <32954280+mayank31398@users.noreply.github.com>
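As an illustration of a non-RANDOM init (not the actual tests added in the PR; the model name and init text below are placeholders), prompt tuning with TEXT initialization derives the virtual token embeddings from the base model's word embeddings, which is the code path that touches word_embeddings:

```python
from transformers import AutoModelForCausalLM
from peft import PromptTuningConfig, PromptTuningInit, TaskType, get_peft_model

base_model = AutoModelForCausalLM.from_pretrained(
    "mosaicml/mpt-7b", trust_remote_code=True
)

# TEXT initialization (i.e. prompt_tuning_init != RANDOM) initializes the
# virtual tokens from the tokenized init text via the model's word embeddings.
peft_config = PromptTuningConfig(
    task_type=TaskType.CAUSAL_LM,
    prompt_tuning_init=PromptTuningInit.TEXT,
    prompt_tuning_init_text="Classify the sentiment of the following text:",
    num_virtual_tokens=8,
    tokenizer_name_or_path="mosaicml/mpt-7b",
)

model = get_peft_model(base_model, peft_config)
model.print_trainable_parameters()
```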
@BenjaminBossan
Member

Should be resolved by #1144.

BenjaminBossan added a commit to BenjaminBossan/peft that referenced this issue Mar 14, 2024
Resolves huggingface#1082.

Also, adding tests for prompt_tuning_init != RANDOM.

---------

Co-authored-by: Mayank Mishra <32954280+mayank31398@users.noreply.github.com>