This repository demonstrates how to use the TensorFlow variant of the SegFormer model [1] from the 🤗 transformers
package, through Jupyter Notebooks and a Gradio application hosted on 🤗 Spaces.
SegFormer achieves strong performance on various high-resolution semantic segmentation datasets while remaining computationally efficient.
One of the objectives of this repository is to let TensorFlow users train high-quality semantic segmentation models that benefit from higher input resolutions.
Since the TensorFlow variant of SegFormer hasn't been included in a transformers
release yet, you need to install transformers from source:
pip install git+https://github.com/huggingface/transformers
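After installing, a quick way to confirm that the TensorFlow SegFormer classes are available is to import them and build a randomly initialised model from the default configuration; this is just a sanity-check sketch, and nothing is downloaded:

```python
# Sanity check: this import only succeeds on a transformers version
# that includes the TensorFlow port of SegFormer.
from transformers import SegformerConfig, TFSegformerForSemanticSegmentation

# Randomly initialised model from the default config; no weights are fetched.
model = TFSegformerForSemanticSegmentation(SegformerConfig())
print(model.config.num_encoder_blocks)  # 4 for the default MiT-style encoder
```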
- notebooks/TFSegFormer_Inference.ipynb: Shows how to run inference with a pre-trained semantic segmentation model (a minimal inference sketch follows this list).
- notebooks/TFSegFormer_Finetune.ipynb: Shows how to fine-tune a pre-trained SegFormer model.
- notebooks/TFSegFormer_ONNX.ipynb: Shows how to convert a TensorFlow-based SegFormer model to ONNX and compares inference timings between the two formats (see the conversion sketch after the inference example).
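As a quick orientation for what the inference notebook covers, here is a minimal sketch; the ADE20K-finetuned checkpoint, the sample image URL, and the use of `SegformerImageProcessor` (older transformers versions expose `SegformerFeatureExtractor` instead) are illustrative choices, not requirements:

```python
import requests
import tensorflow as tf
from PIL import Image
from transformers import SegformerImageProcessor, TFSegformerForSemanticSegmentation

# Example ADE20K-finetuned checkpoint; any SegFormer checkpoint works.
ckpt = "nvidia/segformer-b0-finetuned-ade-512-512"
processor = SegformerImageProcessor.from_pretrained(ckpt)
model = TFSegformerForSemanticSegmentation.from_pretrained(ckpt)

# Example input image; replace with your own.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

inputs = processor(images=image, return_tensors="tf")
outputs = model(**inputs)

# Logits have shape (batch, num_labels, height / 4, width / 4);
# the arg-max over the label axis gives per-pixel class ids.
seg_map = tf.argmax(outputs.logits, axis=1)[0]
print(seg_map.shape)
```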
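The ONNX notebook is the authoritative reference for the conversion; the general route via tf2onnx looks roughly like the sketch below, where the 512x512 input size, opset, and output path are assumptions for illustration:

```python
import tensorflow as tf
import tf2onnx
from transformers import TFSegformerForSemanticSegmentation

model = TFSegformerForSemanticSegmentation.from_pretrained(
    "nvidia/segformer-b0-finetuned-ade-512-512"  # example checkpoint
)

# SegFormer expects channels-first pixel values: (batch, 3, height, width).
input_signature = [tf.TensorSpec([None, 3, 512, 512], tf.float32, name="pixel_values")]

# Convert the Keras model and write the ONNX graph to disk.
onnx_model, _ = tf2onnx.convert.from_keras(
    model, input_signature=input_signature, opset=13, output_path="tf_segformer.onnx"
)
```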
To try the Gradio demo hosted on 🤗 Spaces, visit this link.
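For reference, wiring such a demo in Gradio can be as small as the sketch below; the checkpoint and the grayscale rendering of the class ids are illustrative choices and not necessarily how the hosted Space is implemented:

```python
import gradio as gr
import numpy as np
import tensorflow as tf
from transformers import SegformerImageProcessor, TFSegformerForSemanticSegmentation

ckpt = "nvidia/segformer-b0-finetuned-ade-512-512"  # example checkpoint
processor = SegformerImageProcessor.from_pretrained(ckpt)
model = TFSegformerForSemanticSegmentation.from_pretrained(ckpt)

def segment(image: np.ndarray) -> np.ndarray:
    inputs = processor(images=image, return_tensors="tf")
    logits = model(**inputs).logits              # (1, num_labels, H/4, W/4)
    # Upsample to the original image size before taking the per-pixel arg-max.
    logits = tf.transpose(logits, [0, 2, 3, 1])  # channels-last for tf.image.resize
    logits = tf.image.resize(logits, image.shape[:2], method="bilinear")
    seg = tf.argmax(logits, axis=-1)[0].numpy()
    # Spread the class ids over 0-255 for a simple grayscale preview.
    scale = 255 // max(int(seg.max()), 1)
    return (seg * scale).astype(np.uint8)

demo = gr.Interface(fn=segment, inputs=gr.Image(), outputs=gr.Image())
demo.launch()
```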
[1] SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers; Enze Xie, Wenhai Wang, Zhiding Yu, Anima Anandkumar, Jose M. Alvarez, Ping Luo; https://arxiv.org/abs/2105.15203 (2021).
Thanks to the ML-GDE program (ML Developer Programs team) for providing GCP credits that we used for experimentation.