Support **kwargs for PyTorch models. #21453
Comments
@damccorm I'm unable to self-assign. Is it because I don't have write permissions? Is there a GH action that can allow me to self-assign future issues?
One of the unfortunate downsides of moving to issues is that non-committers can't self-assign/triage issues that they didn't create by default. I think that's a workflow we'll need to support going forward, though; I'll look into it. For the moment, I'd recommend commenting/declaring intent to work on any issues. I have access through Line 39 in 4dce7b8.
Once #21719 is in, you should be able to do this with chat-op commands.
Got it, thanks!
We decided to move forward with a separate param: #21806
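For context, a minimal sketch of what the separate-param approach could look like from the pipeline author's side, assuming the parameter ends up exposed as `inference_args` on the RunInference transform (the exact name and behavior are defined by the linked PR, not confirmed here). `ToyModel` and the state-dict path are hypothetical placeholders:

```python
import torch
import apache_beam as beam
from apache_beam.ml.inference.base import RunInference
from apache_beam.ml.inference.pytorch_inference import PytorchModelHandlerTensor


class ToyModel(torch.nn.Module):
    """Toy model whose forward() takes an extra keyword argument."""

    def forward(self, batch, attention_mask=None):
        if attention_mask is not None:
            batch = batch * attention_mask
        return batch.sum(dim=-1)


# Assumed: a state dict for ToyModel has already been saved to this path.
model_handler = PytorchModelHandlerTensor(
    state_dict_path='/tmp/toy_model.pth',
    model_class=ToyModel,
    model_params={})

with beam.Pipeline() as pipeline:
    _ = (
        pipeline
        | beam.Create([torch.ones(4), torch.zeros(4)])
        | RunInference(
            model_handler,
            # Extra forward() arguments travel through a separate parameter
            # instead of being bundled into every example element.
            inference_args={'attention_mask': torch.tensor([1., 1., 0., 0.])})
        | beam.Map(print))
```

The design keeps the example PCollection free of per-model plumbing: elements stay plain tensors, and anything the model's forward() needs beyond the batch is supplied once, alongside the model handler.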
Some PyTorch models that subclass torch.nn.Module take extra parameters in their forward() call. These extra parameters can be passed as a dict or as positional arguments.
Example of a PyTorch model hosted on Hugging Face: https://huggingface.co/bert-base-uncased
Its forward() signature is documented at https://huggingface.co/docs/transformers/model_doc/bert#transformers.BertModel
Hugging Face transformers models with PyTorch backends should be supported as well (see the sketch below).
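To illustrate the pattern, here is a minimal sketch using the `bert-base-uncased` checkpoint, assuming the `transformers` package is installed and the model can be downloaded. The tokenizer output is a dict of tensors that the model's forward() consumes as keyword arguments rather than as a single positional input:

```python
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased')
model.eval()

# The tokenizer produces several tensors (input_ids, attention_mask,
# token_type_ids); forward() expects them as keyword arguments.
encoded = tokenizer("Hello, world!", return_tensors='pt')

with torch.no_grad():
    # Equivalent to model(input_ids=..., attention_mask=..., token_type_ids=...)
    outputs = model(**encoded)

print(outputs.last_hidden_state.shape)
```

It is this reliance on multiple keyword arguments per call that a tensor-only inference API cannot express, which is what this issue asks to support.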
Imported from Jira BEAM-14337. Original Jira may contain additional context.
Reported by: Anand Inguva.
Subtask of issue #21435