This repository has been archived by the owner on Dec 16, 2022. It is now read-only.

Add support for Spacy transformer models (GPU) #5409

Open
martin-kirilov opened this issue Sep 14, 2021 · 2 comments

Comments

@martin-kirilov

spaCy 3 introduced new transformer-based models that can run on GPU. I think it would be a good idea to add support for these language models in AllenNLP, since they currently don't work out-of-the-box.

My suggestion is to add an option that specifies whether to load the model on CPU or GPU, and to use it accordingly.
This would require changes to both allennlp and allennlp-models, since some models (e.g. SRL) would also need changes to their torch tensor management.
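As a rough illustration of what such an option could look like (this is a hypothetical sketch, not an existing AllenNLP API — `activate_spacy_gpu` and `resolve_spacy_gpu` are illustrative names), one could follow AllenNLP's `cuda_device` convention, where `-1` means CPU and a non-negative integer is a GPU index, and call `spacy.require_gpu()` before loading the pipeline:

```python
def resolve_spacy_gpu(cuda_device: int) -> bool:
    """Decide whether the spaCy pipeline should be placed on GPU.

    Follows AllenNLP's convention: cuda_device == -1 means CPU,
    cuda_device >= 0 is a GPU index.
    """
    return cuda_device >= 0


def activate_spacy_gpu(cuda_device: int) -> bool:
    """Activate spaCy's GPU backend if a GPU device was requested.

    Must be called *before* spacy.load(), since spaCy allocates the
    transformer weights on whichever device is active at load time.
    Returns True if the GPU backend was activated.
    """
    if resolve_spacy_gpu(cuda_device):
        import spacy

        # spacy.require_gpu raises an error if no GPU is available.
        return spacy.require_gpu(gpu_id=cuda_device)
    return False
```

A caller would then do something like `activate_spacy_gpu(cuda_device)` followed by `spacy.load("en_core_web_trf")`; the remaining work would be making sure downstream tensor handling (e.g. in the SRL model) copies results back to the expected device.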

@epwalsh
Member

epwalsh commented Sep 20, 2021

Hey @martin-kirilov, this sounds like a good feature to have. Feel free to submit a PR if you get a chance.

@github-actions

github-actions bot commented Oct 5, 2021

@epwalsh this is just a friendly ping to make sure you haven't forgotten about this issue 😜

@epwalsh epwalsh removed their assignment Oct 7, 2021