Information about Embeddings in the Allen Coreference Model #5551
Copied from https://stackoverflow.com/questions/70987577/information-about-embeddings-in-the-allen-coreference-model:
Hi everybody,
I'm an Italian student approaching the NLP world.
First of all, I'd like to thank you for the amazing work you've done with the paper "Higher-order Coreference Resolution with Coarse-to-fine Inference".
I am using the model provided by the allennlp library and I have two questions for you:

1. At https://demo.allennlp.org/coreference-resolution it is stated that the embedding used is SpanBERT. Is this a BERT-style encoder pretrained independently of the coreference task? In other words, could I use this embedding simply as a pretrained English model to embed sentences (e.g. like https://huggingface.co/facebook/bart-base)? A sketch of what I mean follows this list.
2. Is it possible to modify the code so that it returns, along with the coreference prediction, the aforementioned embeddings of each sentence? A sketch of the kind of change I am imagining also follows below.
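For the first question, this is roughly what I have in mind: using SpanBERT purely as a pretrained encoder through the Hugging Face transformers library, completely outside of AllenNLP. I am assuming the `SpanBERT/spanbert-base-cased` checkpoint on the Hugging Face hub and that it shares the `bert-base-cased` vocabulary (the demo model itself uses the large variant), so please correct me if those assumptions are wrong.

```python
# Sketch for question 1: using SpanBERT as a plain pretrained encoder,
# independent of the coreference model.
import torch
from transformers import AutoModel, AutoTokenizer

# Assumptions: "SpanBERT/spanbert-base-cased" on the Hugging Face hub is the
# task-agnostic pretrained encoder, and it reuses the "bert-base-cased" vocab.
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModel.from_pretrained("SpanBERT/spanbert-base-cased")
model.eval()

sentence = "Paul Allen was born on January 21, 1953."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Contextual embedding of each wordpiece: (batch, num_tokens, hidden_size).
token_embeddings = outputs.last_hidden_state
# A crude single vector per sentence via mean pooling over the tokens.
sentence_embedding = token_embeddings.mean(dim=1)
print(token_embeddings.shape, sentence_embedding.shape)
```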
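For the second question, the kind of change I am imagining is something like the following: load the predictor as shown on the demo page and attach a PyTorch forward hook to the model's text field embedder so its output is captured alongside the prediction. The archive URL and the `_text_field_embedder` attribute name are my assumptions about the released model and may differ between versions.

```python
# Sketch for question 2: capture the encoder output (the SpanBERT embeddings)
# while the coreference predictor runs, using a PyTorch forward hook.
from allennlp.predictors.predictor import Predictor

# Assumption: this is the archive listed on the demo page; the exact URL may
# have changed since.
predictor = Predictor.from_path(
    "https://storage.googleapis.com/allennlp-public-models/coref-spanbert-large-2021.03.10.tar.gz"
)

captured = {}

def save_embeddings(module, inputs, output):
    # output is the contextual embedding tensor produced by the embedder:
    # (batch, num_tokens, hidden_size).
    captured["embeddings"] = output.detach()

# Assumption: the coreference model stores its embedder as _text_field_embedder;
# the attribute name may be different in other versions of allennlp-models.
hook = predictor._model._text_field_embedder.register_forward_hook(save_embeddings)

result = predictor.predict(
    document="The woman reading a newspaper sat on the bench with her dog."
)
hook.remove()

print(result["clusters"])
print(captured["embeddings"].shape)
```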
I really hope you can help me.
Thank you in advance for your time and availability.
Sincerely,
Emanuele Gusso