My links to models and libraries for NLP downstream tasks:
BERT from: https://github.com/google-research/bert
BERT pretrained model: https://github.com/google-research/bert#pre-trained-models
PyTorch Pretrained BERT: https://github.com/huggingface/transformers/tree/v0.4.0
RoBERTa: https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/README.md
RoBERTa from huggingface: https://huggingface.co/docs/transformers/model_doc/roberta
KeyBERT from: https://github.com/MaartenGr/KeyBERT (a usage sketch appears after this list)
Word2Vec from: https://code.google.com/archive/p/word2vec/
GloVe from: https://nlp.stanford.edu/projects/glove/ (a loading sketch for Word2Vec and GloVe appears after this list)
YAKE from: https://github.com/LIAAD/yake (a usage sketch appears after this list)
spaCy from: https://github.com/explosion/spaCy
scispaCy from: https://github.com/allenai/scispacy (a spaCy NER sketch appears after this list)
SBERT / Sentence-Transformers pretrained models from: https://www.sbert.net/docs/pretrained_models.html (a sentence-similarity sketch appears after this list)
Third party NLP models in the Elastic Stack: https://www.elastic.co/guide/en/machine-learning/master/ml-nlp-model-ref.html
HuggingFace - The AI community building the future: https://huggingface.co/
HuggingFace - Transformers: https://huggingface.co/docs/transformers/index
HuggingFace - Transformers from GitHub: https://github.com/huggingface/transformers (a fill-mask pipeline sketch appears after this list)
HuggingFace - Models: https://huggingface.co/models
BertViz - Visualize Attention in NLP Models: https://github.com/jessevig/bertviz
LSTM - Understanding LSTM Networks: http://colah.github.io/posts/2015-08-Understanding-LSTMs/
NLP models from Papers-with-code: https://paperswithcode.com/methods/area/natural-language-processing
Semantic Textual Similarity: https://paperswithcode.com/task/semantic-textual-similarity
Hedwig: PyTorch deep learning models for document classification, implemented by the Data Systems Group at the University of Waterloo: https://github.com/castorini/hedwig
Keyword extraction by combining the Rapid Automatic Keyword Extraction (RAKE) algorithm with the NLTK toolkit (see also the keywords function in gensim.summarization): https://github.com/csurfer/rake-nltk (a usage sketch appears after this list)
Code to perform training, inference, and evaluation for 9 topic modeling packages: https://github.com/d2klab/tomodapi
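
A minimal sketch of querying a pretrained BERT checkpoint through the HuggingFace Transformers pipeline linked above; the bert-base-uncased checkpoint and the example sentence are only illustrative choices:

```python
# Fill-mask prediction with a pretrained BERT model via the Transformers pipeline.
# Assumes `pip install transformers torch`; weights are downloaded on first use.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")

# BERT predicts the token hidden behind [MASK].
for prediction in unmasker("Keyword extraction is a common [MASK] task."):
    print(f"{prediction['token_str']:>12}  score={prediction['score']:.3f}")
```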
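
KeyBERT usage, a minimal sketch assuming the default sentence-transformers backbone; the document text and the parameters are illustrative:

```python
# Keyword extraction with KeyBERT: candidate phrases are ranked by embedding
# similarity to the whole document. Assumes `pip install keybert`.
from keybert import KeyBERT

doc = (
    "Supervised learning is the machine learning task of learning a function "
    "that maps an input to an output based on example input-output pairs."
)

kw_model = KeyBERT()  # uses a small sentence-transformers model by default
keywords = kw_model.extract_keywords(
    doc,
    keyphrase_ngram_range=(1, 2),  # consider unigrams and bigrams as candidates
    stop_words="english",
    top_n=5,
)
print(keywords)  # list of (phrase, similarity) tuples
```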
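
For the Word2Vec and GloVe links, a sketch of loading pretrained vectors through gensim's downloader; using gensim here is my assumption (the vectors can also be read from the original files), and the two vector names are examples from the gensim catalogue:

```python
# Loading pretrained Word2Vec and GloVe vectors via gensim's downloader API.
# Assumes `pip install gensim`; the (large) vector files are fetched on first call.
import gensim.downloader as api

word2vec = api.load("word2vec-google-news-300")  # Google News Word2Vec vectors
glove = api.load("glove-wiki-gigaword-100")      # Stanford GloVe vectors

print(word2vec.most_similar("language", topn=5))
print(glove.similarity("king", "queen"))
```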
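
YAKE keyword extraction, a sketch with an illustrative text and parameter choices; YAKE is unsupervised and needs no pretrained model:

```python
# Unsupervised keyword extraction with YAKE. Assumes `pip install yake`.
# YAKE scores are "the lower, the more relevant".
import yake

text = (
    "Google is acquiring data science community Kaggle, a platform that hosts "
    "machine learning competitions."
)

extractor = yake.KeywordExtractor(lan="en", n=2, top=5)  # up to 2-word keyphrases
for keyword, score in extractor.extract_keywords(text):
    print(f"{score:.4f}  {keyword}")
```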
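
A spaCy NER sketch; scispaCy exposes biomedical pipelines (e.g. en_core_sci_sm) through the same API. The en_core_web_sm model and the sentence are only examples:

```python
# Named entities and noun chunks with spaCy; scispaCy models plug into the same API.
# Assumes `pip install spacy` and `python -m spacy download en_core_web_sm`.
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Hugging Face released new transformer models in New York in 2023.")

for ent in doc.ents:
    print(ent.text, ent.label_)

# Noun chunks are a cheap source of keyphrase candidates.
print([chunk.text for chunk in doc.noun_chunks])
```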
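
Semantic textual similarity with a pretrained Sentence-Transformers (SBERT) model from the list linked above; all-MiniLM-L6-v2 and the sentences are only examples:

```python
# Sentence embeddings and cosine similarity with Sentence-Transformers.
# Assumes `pip install sentence-transformers`.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "BERT learns contextual word representations.",
    "Contextual embeddings are produced by BERT.",
    "GloVe vectors are trained on word co-occurrence statistics.",
]
embeddings = model.encode(sentences, convert_to_tensor=True)

# Pairwise cosine similarity matrix; higher means more similar.
print(util.cos_sim(embeddings, embeddings))
```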
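
rake-nltk usage for the RAKE + NLTK entry above; the NLTK resource downloads and the example sentence are assumptions of this sketch:

```python
# RAKE keyword extraction on top of NLTK stopwords and tokenizers.
# Assumes `pip install rake-nltk` plus the NLTK data downloaded below.
import nltk
from rake_nltk import Rake

nltk.download("stopwords")
nltk.download("punkt")

rake = Rake()
rake.extract_keywords_from_text(
    "Compatibility of systems of linear constraints over the set of natural numbers."
)
print(rake.get_ranked_phrases_with_scores())  # [(score, phrase), ...], best first
```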