from transformers import AutoTokenizer, TFAutoModelForSequenceClassification
# Load tokenizer and TensorFlow weights from the Hub
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
tf_model = TFAutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased")
# Save to disk
tokenizer.save_pretrained("local-tf-checkpoint")
tf_model.save_pretrained("local-tf-checkpoint")
Exporting with the CLI fails:
python -m transformers.onnx --model=local-tf-checkpoint onnx/
Passing --framework=tf makes the export succeed:
python -m transformers.onnx --model=local-tf-checkpoint --framework=tf onnx/
Expected behavior
Once the model directory has been provided, the export should detect that a TF model is being used. There should be no dependency on PyTorch (PyTorch is not installed in this environment). Instead, I get this error:
RuntimeError: Cannot export model to ONNX using PyTorch because no PyTorch package was found.
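The framework should be inferable from the checkpoint contents: saving a TF model writes tf_model.h5, while a PyTorch save writes pytorch_model.bin. A minimal sketch of the detection the CLI could perform (the helper name is hypothetical; the weight filenames are the library's defaults):

```python
import os

def infer_framework(checkpoint_dir: str) -> str:
    """Hypothetical helper: infer the framework of a local checkpoint
    from the weight file that save_pretrained() writes by default."""
    if os.path.isfile(os.path.join(checkpoint_dir, "tf_model.h5")):
        return "tf"  # TensorFlow weights present
    if os.path.isfile(os.path.join(checkpoint_dir, "pytorch_model.bin")):
        return "pt"  # PyTorch weights present
    raise ValueError(f"No recognizable weight file in {checkpoint_dir}")
```

With a check like this, --framework could default to the detected value for local directories instead of assuming PyTorch.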
cchan-lm changed the title from "Export TF to ONNX fails with CLI using example from docs" to "TF to ONNX export fails with CLI using example from docs" on Aug 5, 2022.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
System Info
transformers version: 4.21.1
Who can help?
No response
Information
Tasks
An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
Either transformers should be updated, or the docs at https://huggingface.co/docs/transformers/serialization should be updated to say that --framework=tf is required for TensorFlow models.