
de_trf_bertbasecased_lg-2.2.0

Released by @explosion-bot on 08 Oct 13:31 (commit 79eeb72, 1389 commits to master since this release)

Downloads

Details: https://spacy.io/models/de#de_trf_bertbasecased_lg

File checksum: f9f27bfd138f5b55b3177bb2d933d1825a107275f599c11797f9c2f5dea048b4

Provides weights and configuration for the pretrained transformer model bert-base-german-cased, published by deepset. The package uses HuggingFace's transformers implementation of the model. Pretrained transformer models assign detailed contextual word representations, using knowledge drawn from a large corpus of unlabelled text. You can use the contextual word representations as features in a variety of pipeline components that can be trained on your own data.
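
For example, a minimal sketch of extracting the contextual representations (assuming the spacy-transformers v0.x pipeline, where the trf_tok2vec component writes token-aligned BERT activations to doc.tensor; the German sentences are illustrative):

import spacy

nlp = spacy.load("de_trf_bertbasecased_lg")
doc = nlp("Die Katze sitzt auf der Matte.")

# Token-aligned contextual representations produced by the transformer
print(doc.tensor.shape)    # (number of tokens, transformer hidden size)
print(doc[1].vector[:5])   # contextual vector for "Katze"

# The model ships no static word vectors, so similarity is computed
# from these contextual tensors instead
other = nlp("Ein Hund liegt im Garten.")
print(doc.similarity(other))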

Feature      Description
Name         de_trf_bertbasecased_lg
Version      2.2.0
spaCy        >=2.2.1
Model size   386 MB
Pipeline     sentencizer, trf_wordpiecer, trf_tok2vec
Vectors      0 keys, 0 unique vectors (0 dimensions)
Sources      bert-base-german-cased (deepset)
License      MIT
Author       deepset (repackaged by Explosion)

Requires the spacy-transformers package to be installed. A CUDA-compatible GPU is advised for reasonable performance.

Installation

pip install spacy
python -m spacy download de_trf_bertbasecased_lg
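
A quick way to check the installed package, as a sketch (the spacy.prefer_gpu() call is optional and only takes effect if a CUDA-enabled CuPy is available):

import spacy

spacy.prefer_gpu()  # optional: run the transformer on the GPU if one is available
nlp = spacy.load("de_trf_bertbasecased_lg")
print(nlp.pipe_names)  # ['sentencizer', 'trf_wordpiecer', 'trf_tok2vec']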