A simple example of exporting a transformer model with Python, then loading it into tract to make predictions.
First, export the pre-trained transformer model using Python and PyTorch:
python export.py
The exported model and tokenizer are saved in ./albert. Then load the model into tract and make a prediction:
cargo run --release
The output for the input sentence "Paris is the [MASK] of France" should look like:
Result: Some("▁capital")
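Conceptually, the prediction is obtained by taking the highest-scoring vocabulary entry in the model's logits at the [MASK] position. A minimal sketch of that decode step, with a made-up vocabulary and made-up logit values for illustration:

```python
# Decode step for masked-LM filling: argmax over the vocabulary at
# the [MASK] position. The vocabulary and logits below are invented
# for illustration; the real model has a ~30k-entry vocabulary.
vocab = ["▁the", "▁capital", "▁city", "▁of"]
logits_at_mask = [0.1, 5.2, 1.3, 0.0]  # one score per vocab entry

best = max(range(len(vocab)), key=lambda i: logits_at_mask[i])
print(vocab[best])  # → ▁capital
```

The "▁" prefix in the result comes from the SentencePiece tokenizer, which marks tokens that begin a new word.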