Can't save bento when using transformers custom pipeline #2534
Can you explain a bit more about your use case? Have you tried the new API?
Custom Pipeline

After defining a custom transformers pipeline, saving the pipeline using

Error message:

I'm using version 1.0.0rc0!
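The failure described here comes from the pipeline's task name being checked against a fixed list of supported tasks. A minimal sketch of that kind of validation, purely for illustration (the task set, function name, and error wording are hypothetical, not BentoML's actual code):

```python
# Hypothetical sketch of a task-name validator like the one the error
# comes from; the SUPPORTED_TASKS contents are illustrative.
SUPPORTED_TASKS = {"text-classification", "token-classification", "summarization"}

def validate_task(task_name: str) -> None:
    """Reject task names outside the pre-defined set."""
    if task_name not in SUPPORTED_TASKS:
        raise ValueError(
            f"{task_name!r} is not a supported task; "
            f"choose one of {sorted(SUPPORTED_TASKS)}"
        )

# A custom pipeline registered under a new task name fails this check:
try:
    validate_task("my-custom-task")
except ValueError as exc:
    print(exc)
```

A custom pipeline defines a task name outside that set, so the save call is rejected before anything is serialized.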
Thanks for the code sample. I will get back to you ASAP.
The current Transformers
Hi, reniew. [Example for existing code]
Based on the above process, I've made a customized pipeline and created a bento model successfully.
I don't understand updating
Hmm, it seems from the transformers package that you have to manually update the
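For context, transformers at the time kept a module-level task registry that a custom pipeline had to be patched into by hand before calling `pipeline()`. A rough pure-Python sketch of that registration pattern (the registry contents, entry fields, and `MyCustomPipeline` are illustrative stand-ins, not transformers' real entries):

```python
# Illustrative sketch of patching a task registry by hand, mirroring the
# pattern of mutating transformers' module-level task dict.
# MyCustomPipeline and the registry entries are hypothetical.

class MyCustomPipeline:
    """Stand-in for a custom transformers Pipeline subclass."""
    def __call__(self, text: str) -> dict:
        return {"label": "CUSTOM", "input": text}

# Stand-in for the built-in registry of task name -> implementation.
SUPPORTED_TASKS = {
    "text-classification": {"impl": object, "pt": (), "tf": ()},
}

# Manual registration: mutate the module-level dict to add the new task.
SUPPORTED_TASKS["my-custom-task"] = {
    "impl": MyCustomPipeline,
    "pt": (),
    "tf": (),
}

pipe = SUPPORTED_TASKS["my-custom-task"]["impl"]()
print(pipe("hello"))  # {'label': 'CUSTOM', 'input': 'hello'}
```

Mutating a library's internal registry like this is fragile, which is why the thread below pushes for first-class support upstream.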
I opened a ticket upstream, huggingface/transformers#17762, since I believe this should be supported by transformers itself. I will try to reach out to the Hugging Face team and we will see how it pans out. FYI, we simplified the implementation for 1.0, where we now only support the pipelines abstraction from transformers. We did have support for saving models, tokenizers, configs, etc. separately before, but that mingled with a lot of transformers' internal implementation, and we prefer not to couple such logic into BentoML. For supporting custom pipelines, it would obviously be best if we save the model and all of the components of your pipeline to BentoML and leave loading your pipeline to you. @ssheng and I will discuss more.
I tried the other way: saving and loading the custom transformers pipeline as in the code below, rather than adding a new transformers pipeline.

But after loading by

I think it is caused by the logic that loads the pipeline in

It seems custom pipeline classes are not supported. In this situation, would it be better to load it as a torch model instead of using transformers?
I think that is probably the case here. Since transformers itself doesn't have good support for custom pipelines, saving the PyTorch model and mimicking the inference processing would be better here. I will follow up with the custom pipeline proposal on the Hugging Face end.
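The suggested workaround is to save only the underlying model and re-implement the pipeline's preprocess / forward / postprocess steps yourself. A rough sketch of that shape, with all steps stubbed out (the function names and logic are entirely hypothetical; a real version would call the actual tokenizer and the torch model loaded from the model store):

```python
# Hypothetical sketch: mimic a pipeline's inference stages around a model
# that was saved and loaded separately (e.g. as a plain torch model).
# The tokenizer and model below are stubs standing in for the real objects.

def fake_tokenize(text: str) -> list[int]:
    """Stub preprocessing step (a real one would use the tokenizer)."""
    return [len(word) for word in text.split()]

def fake_model(input_ids: list[int]) -> float:
    """Stub forward pass (a real one would call the loaded torch model)."""
    return sum(input_ids) / max(len(input_ids), 1)

def fake_postprocess(score: float) -> dict:
    """Stub postprocessing step producing a pipeline-like output."""
    return {"label": "LONG" if score > 4 else "SHORT", "score": score}

def run_inference(text: str) -> dict:
    """Chain preprocess -> forward -> postprocess, as a pipeline does."""
    return fake_postprocess(fake_model(fake_tokenize(text)))

print(run_inference("custom pipelines"))  # {'label': 'LONG', 'score': 7.5}
```

This keeps the serialized artifact to a plain model while the custom pre/post logic lives in your own service code.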
BentoML now supports custom pipelines, and this will be included in the rc3 release. Thanks for opening this issue.
Is your feature request related to a problem? Please describe.

A custom Transformers pipeline cannot be saved with bentoml.transformer.save(), since a custom-defined pipeline task can't pass the task name validator.

Describe the solution you'd like

The pre-defined set of transformers tasks is too restrictive; it would be more useful to allow applying custom pipelines.