
Usage of TensorFlow-based SegFormer in 🤗 transformers

This repository demonstrates how to use the TensorFlow-based SegFormer model [1] in the 🤗 transformers package, with a Jupyter Notebook and a Gradio application hosted on 🤗 Spaces.

SegFormer achieves strong performance on various high-resolution semantic segmentation datasets while being more efficient than comparable models.

One of the objectives of this repository is to let TensorFlow users train high-quality semantic segmentation models that benefit from higher input resolutions.

Notice

Since the TensorFlow variant of SegFormer hasn't been included in a transformers release yet, you need to install transformers from source:

pip install git+https://github.com/huggingface/transformers

About the notebooks

Demo on Hugging Face Space

Visit this link.

References

[1] SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers; Enze Xie, Wenhai Wang, Zhiding Yu, Anima Anandkumar, Jose M. Alvarez, Ping Luo; https://arxiv.org/abs/2105.15203 (2021).

Acknowledgements

Thanks to the ML-GDE program (ML Developer Programs team) for providing GCP credits that we used for experimentation.