This package provides spaCy components and architectures to use a curated set of transformer models via curated-transformers in spaCy.
- Use pretrained models based on one of the following architectures to power your spaCy pipeline:
- ALBERT
- BERT
- CamemBERT
- RoBERTa
- XLM-RoBERTa
- All the nice features supported by spacy-transformers, such as support for the Hugging Face Hub, multi-task learning, the extensible config system, and out-of-the-box serialization
- Deep integration into spaCy, which lays the groundwork for deployment-focused features such as distillation and quantization
- Minimal dependencies
Installing the package from pip will automatically install all dependencies:

```bash
pip install spacy-curated-transformers
```
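Once installed, a curated transformer is typically wired into a pipeline through spaCy's config system rather than direct Python calls. The fragment below is a minimal sketch of what the component section of a training config might look like; the factory name `curated_transformer` and the `@architectures` entries (shown here for an XLM-RoBERTa model and its SentencePiece piece encoder) are taken from the package's registered functions and should be verified against the architecture reference before use.

```ini
# Sketch of a component section in a spaCy training config.
# Registered names and structure are assumptions; check the
# Transformer architectures reference for the exact entries.
[components.transformer]
factory = "curated_transformer"

[components.transformer.model]
@architectures = "spacy-curated-transformers.XlmrTransformer.v1"

[components.transformer.model.piece_encoder]
@architectures = "spacy-curated-transformers.XlmrSentencepieceEncoder.v1"
```

Because the model is fully described in the config, the same pipeline definition can be serialized, shared, and retrained without code changes.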
An example project is provided in the project directory.
- 📘 Layers and Model Architectures: Power spaCy components with custom neural networks
- 📗 CuratedTransformer: Pipeline component API reference
- 📗 Transformer architectures: Architectures and registered functions
Please use spaCy's issue tracker to report a bug, or open a new thread on the discussion board for any other issues.