en_trf_robertabase_lg-2.2.0
Details: https://spacy.io/models/en#en_trf_robertabase_lg
File checksum: `cf32b4f5dbbd3ac4e2584f1cec77a91ceabc3b452005380b652bfc890de80680`
Provides weights and configuration for the pretrained transformer model `roberta-base`, published by Facebook. The package uses Hugging Face's `transformers` implementation of the model. Pretrained transformer models assign detailed contextual word representations, using knowledge drawn from a large corpus of unlabelled text. You can use the contextual word representations as features in a variety of pipeline components that can be trained on your own data.
| Feature | Description |
| --- | --- |
| Name | `en_trf_robertabase_lg` |
| Version | 2.2.0 |
| spaCy | >=2.2.1 |
| Model size | 278 MB |
| Pipeline | `sentencizer`, `trf_wordpiecer`, `trf_tok2vec` |
| Vectors | 0 keys, 0 unique vectors (0 dimensions) |
| Sources | `roberta-base` (Facebook) |
| License | MIT |
| Author | Facebook (repackaged by Explosion) |
Requires the `spacy-transformers` package to be installed. A CUDA-compatible GPU is advised for reasonable performance.
Installation

```bash
pip install spacy
python -m spacy download en_trf_robertabase_lg
```